Sample records for code comparison collaboration

  1. Overview of Recent Radiation Transport Code Comparisons for Space Applications

    NASA Astrophysics Data System (ADS)

    Townsend, Lawrence

    Recent advances in radiation transport code development for space applications have resulted in various comparisons of code predictions for a variety of scenarios and codes. Comparisons among both Monte Carlo and deterministic codes have been made and published by various groups and collaborations, including, but not limited to, comparisons involving HZETRN, HETC-HEDS, FLUKA, GEANT, PHITS, and MCNPX. In this work, an overview of recent code prediction inter-comparisons, including comparisons to available experimental data, is presented and discussed, with emphasis on the areas of agreement and disagreement among the various code predictions and published data.

  2. How the Geothermal Community Upped the Game for Computer Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The Geothermal Technologies Office Code Comparison Study brought 11 research institutions together to collaborate on coupled thermal, hydrologic, geomechanical, and geochemical numerical simulators. These codes have the potential to help facilitate widespread geothermal energy development.

  3. Collaborating in the context of co-location: a grounded theory study.

    PubMed

    Wener, Pamela; Woodgate, Roberta L

    2016-03-10

    Most individuals with mental health concerns seek care from their primary care provider, who may lack comfort, knowledge, and time to provide care. Interprofessional collaboration between providers improves access to primary mental health services and increases primary care providers' comfort offering these services. Building and sustaining interprofessional relationships is foundational to collaborative practice in primary care settings. However, little is known about the relationship-building process within these collaborative relationships. The purpose of this grounded theory study was to gain a theoretical understanding of the interprofessional collaborative relationship-building process to guide health care providers and leaders as they integrate mental health services into primary care settings. Forty primary and mental health care providers completed a demographic questionnaire and participated in either an individual or group interview. Interviews were audio-recorded and transcribed verbatim. Transcripts were reviewed several times and then individually coded. Codes were reviewed and similar codes were collapsed to form categories using constant comparison. All codes and categories were discussed amongst the researchers, and the final categories and core category were agreed upon using constant comparison and consensus. A four-stage developmental interprofessional collaborative relationship-building model explained the emergent core category of Collaboration in the Context of Co-location. The four stages included 1) Looking for Help, 2) Initiating Co-location, 3) Fitting-in, and 4) Growing Reciprocity. A patient focus and communication strategies were essential processes throughout the interprofessional collaborative relationship-building process. Building interprofessional collaborative relationships amongst health care providers is essential to delivering mental health services in primary care settings. This developmental model describes the process of how these relationships are co-created and supported by the health care region. Furthermore, the model emphasizes that all providers must develop and sustain a patient focus and communication strategies that are flexible. Applying this model, health care providers can guide the creation and sustainability of primary care interprofessional collaborative relationships. Moreover, this model may guide health care leaders and policy makers as they initiate interprofessional collaborative practice in other health care settings.

  4. Staging - SEER Registrars

    Cancer.gov

    Access tools for coding Extent of Disease 2018, plus Summary Staging Manual 2000, resources for comparison and mapping between staging systems, UICC information, and Collaborative Stage instructions and software.

  5. Software for Collaborative Engineering of Launch Rockets

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas Troy

    2003-01-01

    The Rocket Evaluation and Cost Integration for Propulsion and Engineering (RECIPE) software enables collaborative computing with automated exchange of information in the design and analysis of launch rockets and other complex systems. RECIPE can interact with and incorporate a variety of programs, including legacy codes, that model aspects of a system from the perspectives of different technological disciplines (e.g., aerodynamics, structures, propulsion, trajectory, aeroheating, controls, and operations) and that are used by different engineers on different computers running different operating systems. RECIPE consists mainly of (1) ISCRM, a file-transfer subprogram that makes it possible for legacy codes executed in their original operating systems on their original computers to exchange data, and (2) CONES, an easy-to-use file-wrapper subprogram that enables the integration of legacy codes. RECIPE provides a tightly integrated conceptual framework that emphasizes connectivity among the programs used by the collaborators, linking these programs in a manner that provides some configuration control while facilitating collaborative engineering tradeoff studies, including design-to-cost studies. In comparison with prior collaborative-engineering schemes, one based on the use of RECIPE enables fewer engineers to do more in less time.
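
    As an illustration only of the file-based data exchange idea described above, the sketch below serializes design variables to a neutral text file that another tool can parse back. The key = value format and all names are hypothetical; they are not the actual RECIPE/CONES or ISCRM interfaces, which are not documented here.

      # A minimal sketch of file-based data exchange between legacy codes.
      # The neutral "key = value" format and all names are illustrative;
      # they are not the actual RECIPE/CONES interface.
      import ast

      def write_exchange_file(path, variables):
          """Write design variables to a neutral text file for another code."""
          with open(path, "w") as f:
              for name, value in variables.items():
                  f.write(f"{name} = {value!r}\n")

      def read_exchange_file(path):
          """Parse a neutral exchange file back into a dictionary."""
          variables = {}
          with open(path) as f:
              for line in f:
                  if "=" in line:
                      name, value = line.split("=", 1)
                      variables[name.strip()] = ast.literal_eval(value.strip())
          return variables

      # e.g., a trajectory code writes results that a cost model then reads:
      write_exchange_file("traj_out.dat", {"delta_v": 9350.0, "payload_kg": 4200.0})
      print(read_exchange_file("traj_out.dat"))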

  6. Los Alamos and Lawrence Livermore National Laboratories Code-to-Code Comparison of Inter Lab Test Problem 1 for Asteroid Impact Hazard Mitigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, Robert P.; Miller, Paul; Howley, Kirsten

    The NNSA Laboratories have entered into an interagency collaboration with the National Aeronautics and Space Administration (NASA) to explore strategies for prevention of Earth impacts by asteroids. Assessment of such strategies relies upon use of sophisticated multi-physics simulation codes. This document describes the task of verifying and cross-validating, between Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL), modeling capabilities and methods to be employed as part of the NNSA-NASA collaboration. The approach has been to develop a set of test problems and then to compare and contrast results obtained by use of a suite of codes, including MCNP, RAGE, Mercury, Ares, and Spheral. This document provides a short description of the codes, an overview of the idealized test problems, and discussion of the results for deflection by kinetic impactors and stand-off nuclear explosions.

  7. Using GTO-Velo to Facilitate Communication and Sharing of Simulation Results in Support of the Geothermal Technologies Office Code Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Signe K.; Purohit, Sumit; Boyd, Lauren W.

    The Geothermal Technologies Office Code Comparison Study (GTO-CCS) aims to support the DOE Geothermal Technologies Office in organizing and executing a model comparison activity. This project is directed at testing, diagnosing differences, and demonstrating modeling capabilities of a worldwide collection of numerical simulators for evaluating geothermal technologies. Teams of researchers are collaborating in this code comparison effort, and it is important to be able to share results in a forum where technical discussions can easily take place without requiring teams to travel to a common location. Pacific Northwest National Laboratory has developed an open-source, flexible framework called Velo that provides a knowledge management infrastructure and tools to support modeling and simulation for a variety of types of projects in a number of scientific domains. GTO-Velo is a customized version of the Velo Framework that is being used as the collaborative tool in support of the GTO-CCS project. Velo is designed around a novel integration of a collaborative Web-based environment and a scalable enterprise Content Management System (CMS). The underlying framework provides a flexible and unstructured data storage system that allows for easy upload of files that can be in any format. Data files are organized in hierarchical folders, and each folder and each file has a corresponding wiki page for metadata. The user interacts with Velo through web browser-based wiki technology, which provides the benefit of familiarity and ease of use. High-level folders have been defined in GTO-Velo for the benchmark problem descriptions, descriptions of simulator/code capabilities, a project notebook, and folders for participating teams. Each team has a subfolder with write access limited only to the team members, where they can upload their simulation results. The GTO-CCS participants are charged with defining the benchmark problems for the study, and as each GTO-CCS benchmark problem is defined, the problem creator can provide a description using a template on the metadata page corresponding to the benchmark problem folder. Project documents, references, and videos of the weekly online meetings are shared via GTO-Velo. A results comparison tool allows users to plot their uploaded simulation results on the fly, along with those of other teams, to facilitate weekly discussions of the benchmark problem results being generated by the teams. GTO-Velo is an invaluable tool providing the project coordinators and team members with a framework for collaboration among geographically dispersed organizations.

  8. TOPLHA and ALOHA: comparison between Lower Hybrid wave coupling codes

    NASA Astrophysics Data System (ADS)

    Meneghini, Orso; Hillairet, J.; Goniche, M.; Bilato, R.; Voyer, D.; Parker, R.

    2008-11-01

    TOPLHA and ALOHA are wave coupling simulation tools for LH antennas. Both codes are able to account for realistic 3D antenna geometries and use a 1D plasma model. In the framework of a collaboration between the MIT and CEA laboratories, the two codes have been extensively compared. In TOPLHA the EM problem is self-consistently formulated by means of a set of multiple coupled integral equations having as domain the triangles of the meshed antenna surface. TOPLHA currently uses the FELHS code for modeling the plasma response. ALOHA instead uses a mode matching approach and its own plasma model. Comparisons have been done for several plasma scenarios on different antenna designs: an array of independent waveguides, a multi-junction antenna, and a passive/active multi-junction antenna. When simulating the same geometry and plasma conditions, the two codes compare remarkably well both for the reflection coefficients and for the launched spectra. The fact that the two codes take different approaches to the same problem strengthens confidence in the final results.

  9. Results of comparative RBMK neutron computation using VNIIEF codes (cell computation, 3D statics, 3D kinetics). Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grebennikov, A.N.; Zhitnik, A.K.; Zvenigorodskaya, O.A.

    1995-12-31

    In conformity with the protocol of the Workshop under Contract "Assessment of RBMK reactor safety using modern Western codes," VNIIEF performed a series of neutronics computations to compare Western and VNIIEF codes and to assess whether VNIIEF codes are suitable for RBMK-type reactor safety assessment computations. The work was carried out in close collaboration with M.I. Rozhdestvensky and L.M. Podlazov, NIKIET employees. The effort involved: (1) cell computations with the WIMS and EKRAN codes (an improved modification of the LOMA code) and the S-90 code (VNIIEF Monte Carlo), covering cell, polycell, and burnup computations; (2) 3D computation of static states with the KORAT-3D and NEU codes and comparison with results of computation with the NESTLE code (USA), where the computations were performed in the geometry and using the neutron constants presented by the American party; and (3) 3D computation of neutron kinetics with the KORAT-3D and NEU codes. These computations were performed in two formulations, both developed in collaboration with NIKIET. The formulation of the first problem agrees as closely as possible with one of the NESTLE problems and imitates gas bubble travel through a core. The second problem is a model of the RBMK as a whole with imitation of the movement of control and protection system (CPS) controls in the core.

  10. Definition of the Floating System for Phase IV of OC3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jonkman, J.

    Phase IV of the IEA Annex XXIII Offshore Code Comparison Collaboration (OC3) involves the modeling of an offshore floating wind turbine. This report documents the specifications of the floating system, which are needed by the OC3 participants for building aero-hydro-servo-elastic models.

  11. Definition of the Semisubmersible Floating System for Phase II of OC4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, A.; Jonkman, J.; Masciola, M.

    Phase II of the Offshore Code Comparison Collaboration Continuation (OC4) project involved modeling of a semisubmersible floating offshore wind system. This report documents the specifications of the floating system, which were needed by the OC4 participants for building aero-hydro-servo-elastic models.

  12. Subjective evaluation of next-generation video compression algorithms: a case study

    NASA Astrophysics Data System (ADS)

    De Simone, Francesca; Goldmann, Lutz; Lee, Jong-Seok; Ebrahimi, Touradj; Baroncini, Vittorio

    2010-08-01

    This paper describes the details and the results of the subjective quality evaluation performed at EPFL, as a contribution to the effort of the Joint Collaborative Team on Video Coding (JCT-VC) for the definition of the next-generation video coding standard. The performance of 27 coding technologies has been evaluated with respect to two H.264/MPEG-4 AVC anchors, considering high definition (HD) test material. The test campaign involved a total of 494 naive observers and took place over a period of four weeks. While similar tests have been conducted as part of the standardization process of previous video coding technologies, the test campaign described in this paper is by far the most extensive in the history of video coding standardization. The obtained subjective quality scores show high consistency and support an accurate comparison of the performance of the different coding solutions.
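
    Raw observer votes from a campaign like this are typically condensed into a mean opinion score (MOS) with a confidence interval per coded sequence. The sketch below shows that standard reduction; it is a generic illustration, not the EPFL analysis scripts, and the example votes are invented.

      import numpy as np

      def mos_with_ci(scores, z=1.96):
          """Mean opinion score and normal-approximation 95% confidence interval."""
          s = np.asarray(scores, dtype=float)
          mos = s.mean()
          half = z * s.std(ddof=1) / np.sqrt(len(s))
          return mos, (mos - half, mos + half)

      # Hypothetical votes from a panel of naive observers for one sequence:
      print(mos_with_ci([7, 8, 6, 9, 7, 8, 8, 5, 7, 9]))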

  13. The accuracy of real-time procedure coding by theatre nurses: a comparison with the central national system.

    PubMed

    Maclean, Donald; Younes, Hakim Ben; Forrest, Margaret; Towers, Hazel K

    2012-03-01

    Accurate and timely clinical data are required for clinical and organisational purposes and are especially important for patient management, audit of surgical performance and the electronic health record. The recent introduction of computerised theatre management systems has enabled real-time (point-of-care) operative procedure coding by clinical staff. However, the accuracy of these data is unknown. The aim of this Scottish study was to compare the accuracy of theatre nurses' real-time coding on the local theatre management system with the central Scottish Morbidity Record (SMR01). Paired procedural codes were recorded, qualitatively graded for precision and compared (n = 1038). In this study, real-time, point-of-care coding by theatre nurses resulted in significant coding errors compared with the central SMR01 database. Improved collaboration between full-time coders and clinical staff using computerised decision support systems is suggested.
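
    The study graded its paired codes qualitatively; a common quantitative companion for such paired comparisons is Cohen's kappa, sketched below. The example codes are invented, and kappa is offered here as a generic agreement statistic, not as the study's actual method.

      from collections import Counter

      def cohens_kappa(pairs):
          """Chance-corrected agreement between two coding sources,
          given (local_code, central_code) pairs."""
          n = len(pairs)
          observed = sum(a == b for a, b in pairs) / n
          left = Counter(a for a, _ in pairs)
          right = Counter(b for _, b in pairs)
          expected = sum(left[c] * right[c] for c in left) / (n * n)
          return (observed - expected) / (1 - expected)

      # Hypothetical paired procedure codes (theatre system vs. SMR01):
      pairs = [("H333", "H333"), ("H334", "H333"), ("J18.1", "J18.1"),
               ("H333", "H335"), ("H334", "H334"), ("H335", "H335")]
      print(round(cohens_kappa(pairs), 3))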

  14. Perception of "no code" and the role of the nurse.

    PubMed

    Honan, S; Helseth, C C; Bakke, J; Karpiuk, K; Krsnak, G; Torkelson, R

    1991-01-01

    CPR is now the rule rather than the exception and death is often viewed as the ultimate failure in modern medicine, rather than the final event of the natural life process (Stevens, 1986). The "No Code" concept has created a major dilemma in health care. An interagency collaborative study was conducted to ascertain the perceptions of nurses, physicians, and laypersons about this issue. This article deals primarily with the nurse's role and perceptions of the "No Code" issue. The comparison of nurses' perceptions with those of physicians and laypersons is unique to this study. Based on this research, suggestions are presented that will assist nursing educators and health care professionals in managing this complex dilemma.

  15. "What Do We Do about Student Grammar--All Those Missing -'ed's' and -'s's'?" Using Comparison and Contrast to Teach Standard English in Dialectally Diverse Classrooms

    ERIC Educational Resources Information Center

    Wheeler, Rebecca S.

    2006-01-01

    This paper explores the long and winding road to integrating linguistic approaches to vernacular dialects in the classroom. After exploring past roadblocks, the author shares vignettes and classroom practices of her collaborator, Rachel Swords, who has succeeded in bringing Contrastive Analysis and Code-switching to her second and third-grade…

  16. WEC-SIM Validation Testing Plan FY14 Q4.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruehl, Kelley Michelle

    2016-02-01

    The WEC-Sim project is currently on track, having met both the SNL and NREL FY14 Milestones, as shown in Table 1 and Table 2. This is also reflected in the Gantt chart uploaded to the WEC-Sim SharePoint site in the FY14 Q4 Deliverables folder. The work completed in FY14 includes code verification through code-to-code comparison (FY14 Q1 and Q2), preliminary code validation through comparison to experimental data (FY14 Q2 and Q3), presentation and publication of the WEC-Sim project at OMAE 2014 [1], [2], [3] and GMREC/METS 2014 [4] (FY14 Q3), WEC-Sim code development and public open-source release (FY14 Q3), and development of a preliminary WEC-Sim validation test plan (FY14 Q4). This report presents the preliminary Validation Testing Plan developed in FY14 Q4. The validation test effort started in FY14 Q4 and will continue through FY15. Thus far the team has developed a device selection method, selected a device, placed a contract with the testing facility, established several collaborations including industry contacts, and developed working ideas on testing details such as scaling, device design, and test conditions.

  17. The Therapeutic Collaboration in Life Design Counselling: The Case of Ryan

    ERIC Educational Resources Information Center

    do Céu Taveira, Maria; Ribeiro, Eugénia; Cardoso, Paulo; Silva, Filipa

    2017-01-01

    This study examined the therapeutic collaboration in a case of Life Design Counseling (LDC) with narrative change and positive career outcomes. The therapeutic collaboration-change model and correspondent coding system were used to intensively study the helping relationship throughout three sessions of LDC. The collaboration coding system enables…

  18. SNL/JAEA Collaborations on Sodium Fire Benchmarking.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, Andrew Jordan; Denman, Matthew R; Takata, Takashi

    Two sodium spray fire experiments performed by Sandia National Laboratories (SNL) were used for a code-to-code comparison between CONTAIN-LMR and SPHINCS. Both computer codes are used for modeling sodium accidents in sodium fast reactors. The comparison between the two codes provides insights into the ability of both codes to model sodium spray fires. The SNL T3 and T4 experiments are 20 kg sodium spray fires with sodium spray temperatures of 200 deg C and 500 deg C, respectively. Given the relatively low sodium temperature in the SNL T3 experiment, the sodium spray experienced a period of non-combustion. The vessel in the SNL T4 experiment experienced a rapid pressurization that caused one of the instrumentation ports to fail during the sodium spray. Despite these unforeseen difficulties, both codes were shown to be in good agreement with the experiments. The subsequent pool fire that develops from the unburned sodium spray is a significant characteristic of the T3 experiment. SPHINCS showed better long-term agreement with the SNL T3 experiment than CONTAIN-LMR. The unexpected port failure during the SNL T4 experiment presented modelling challenges. The time at which the port failure occurred is unknown, but is believed to have occurred at about 11 seconds into the sodium spray fire. The sensitivity analysis for the SNL T4 experiment shows that with a port failure, the sodium spray fire can still maintain elevated pressures during the spray.

  19. Federal Logistics Information Systems. FLIS Procedures Manual. Document Identifier Code Input/Output Formats (Variable Length). Volume 9.

    DTIC Science & Technology

    1997-04-01

    [Fragments of the manual's fixed-format Document Identifier Code input/output layout tables; the recoverable data element names include Data Collaborators, Number of Data Receivers, Authorized Item Identification Data Collaborator Code, Data Element Terminator Code, Type of Screening Code, Output Data, Reference Number Category Code (RNCC), and Reference Number Variation Code (RNVC).]

  20. Comparison/Validation Study of Lattice Boltzmann and Navier Stokes for Various Benchmark Applications: Report 1 in Discrete Nano-Scale Mechanics and Simulations Series

    DTIC Science & Technology

    2014-09-15

    solver, OpenFOAM version 2.1.‡ In particular, the incompressible laminar flow equations (Eq. 6-8) were solved in conjunction with the pressure-implicit ... central differencing and upwinding schemes, respectively. Since the OpenFOAM code is inherently transient, steady-state conditions were obtained ... collaborative effort between Kitware and Los Alamos National Laboratory. ‡ OpenFOAM is a free, open-source computational fluid dynamics software developed ...

  1. Performance evaluation of the intra compression in the video coding standards

    NASA Astrophysics Data System (ADS)

    Abramowski, Andrzej

    2015-09-01

    The article presents a comparison of the Intra prediction algorithms in the current state-of-the-art video coding standards, including MJPEG 2000, VP8, VP9, H.264/AVC and H.265/HEVC. The effectiveness of techniques employed by each standard is evaluated in terms of compression efficiency and average encoding time. The compression efficiency is measured using BD-PSNR and BD-RATE metrics with H.265/HEVC results as an anchor. Tests are performed on a set of video sequences, composed of sequences gathered by the Joint Collaborative Team on Video Coding during the development of the H.265/HEVC standard and 4K sequences provided by the Ultra Video Group. According to the results, H.265/HEVC provides significant bit-rate savings at the expense of computational complexity, while VP9 may be regarded as a compromise between efficiency and required encoding time.
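
    BD-RATE, as used above, is the Bjøntegaard delta: each codec's rate-distortion points are fitted with a cubic polynomial, and the fits are integrated over the overlapping quality range to give the average bitrate difference at equal PSNR. A compact sketch of the usual calculation follows; the example rate/PSNR points are invented.

      import numpy as np

      def bd_rate(rate_anchor, psnr_anchor, rate_test, psnr_test):
          """Average bitrate difference (%) of a test codec vs. an anchor at equal PSNR."""
          lr_a = np.log10(rate_anchor)
          lr_t = np.log10(rate_test)
          # Cubic fits of log-rate as a function of PSNR.
          p_a = np.polyfit(psnr_anchor, lr_a, 3)
          p_t = np.polyfit(psnr_test, lr_t, 3)
          lo = max(min(psnr_anchor), min(psnr_test))
          hi = min(max(psnr_anchor), max(psnr_test))
          # Integrate both fits over the overlapping PSNR interval.
          int_a = np.polyval(np.polyint(p_a), hi) - np.polyval(np.polyint(p_a), lo)
          int_t = np.polyval(np.polyint(p_t), hi) - np.polyval(np.polyint(p_t), lo)
          avg_diff = (int_t - int_a) / (hi - lo)
          return (10 ** avg_diff - 1) * 100

      # Invented rate (kbps) / PSNR (dB) points for an anchor and a test codec:
      print(bd_rate([1000, 2000, 4000, 8000], [34.0, 36.5, 38.8, 40.6],
                    [ 800, 1600, 3200, 6400], [34.2, 36.8, 39.1, 40.9]))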

  2. Integrating Bar-Code Medication Administration Competencies in the Curriculum: Implications for Nursing Education and Interprofessional Collaboration.

    PubMed

    Angel, Vini M; Friedman, Marvin H; Friedman, Andrea L

    This article describes an innovative project involving the integration of bar-code medication administration technology competencies in the nursing curriculum through interprofessional collaboration among nursing, pharmacy, and computer science disciplines. A description of the bar-code medication administration technology project and lessons learned are presented.

  3. Automatic Coding of Dialogue Acts in Collaboration Protocols

    ERIC Educational Resources Information Center

    Erkens, Gijsbert; Janssen, Jeroen

    2008-01-01

    Although protocol analysis can be an important tool for researchers to investigate the process of collaboration and communication, the use of this method of analysis can be time consuming. Hence, an automatic coding procedure for coding dialogue acts was developed. This procedure helps to determine the communicative function of messages in online…

  4. A multicenter collaborative approach to reducing pediatric codes outside the ICU.

    PubMed

    Hayes, Leslie W; Dobyns, Emily L; DiGiovine, Bruno; Brown, Ann-Marie; Jacobson, Sharon; Randall, Kelly H; Wathen, Beth; Richard, Heather; Schwab, Carolyn; Duncan, Kathy D; Thrasher, Jodi; Logsdon, Tina R; Hall, Matthew; Markovitz, Barry

    2012-03-01

    The Child Health Corporation of America formed a multicenter collaborative to decrease the rate of pediatric codes outside the ICU by 50%, double the days between these events, and improve the patient safety culture scores by 5 percentage points. A multidisciplinary pediatric advisory panel developed a comprehensive change package of process improvement strategies and measures for tracking progress. Learning sessions, conference calls, and data submission facilitated collaborative group learning and implementation. Twenty Child Health Corporation of America hospitals participated in this 12-month improvement project. Each hospital identified at least 1 noncritical care target unit in which to implement selected elements of the change package. Strategies to improve prevention, detection, and correction of the deteriorating patient ranged from relatively simple, foundational changes to more complex, advanced changes. Each hospital selected a broad range of change package elements for implementation using rapid-cycle methodologies. The primary outcome measure was reduction in codes per 1000 patient days. Secondary outcomes were days between codes and change in patient safety culture scores. Code rate for the collaborative did not decrease significantly (3% decrease). Twelve hospitals reported additional data after the collaborative and saw significant improvement in code rates (24% decrease). Patient safety culture scores improved by 4.5% to 8.5%. A complex process, such as patient deterioration, requires sufficient time and effort to achieve improved outcomes and create a deeply embedded culture of patient safety. The collaborative model can accelerate improvements achieved by individual institutions.

  5. Collaboration across private and public sector primary health care services: benefits, costs and policy implications.

    PubMed

    McDonald, Julie; Powell Davies, Gawaine; Jayasuriya, Rohan; Fort Harris, Mark

    2011-07-01

    Ongoing care for chronic conditions is best provided by interprofessional teams. There are challenges in achieving this where teams cross organisational boundaries. This article explores the influence of organisational factors on collaboration between private and public sector primary and community health services involved in diabetes care. It involved a case study using qualitative methods. Forty-five participants from 20 organisations were purposively recruited. Data were collected through semi-structured interviews and from content analysis of documents. Thematic analysis was used, employing a two-level coding system and cross-case comparisons. The patterns of collaborative patient care were influenced by a combination of factors relating to the benefits and costs of collaboration and the influence of support mechanisms. Benefits lay in achieving common or complementary health or organisational goals. Costs were incurred in bridging differences in organisational size, structure, complexity and culture. Collaboration was easier between private sector organisations than between private and public sectors. Financial incentives were not sufficient to overcome organisational barriers. To achieve more coordinated primary and community health care, structural changes are also needed to better align funding mechanisms, priorities and accountabilities of the different organisations.

  6. How collaboration in therapy becomes therapeutic: the therapeutic collaboration coding system.

    PubMed

    Ribeiro, Eugénia; Ribeiro, António P; Gonçalves, Miguel M; Horvath, Adam O; Stiles, William B

    2013-09-01

    The quality and strength of the therapeutic collaboration, the core of the alliance, is reliably associated with positive therapy outcomes. The urgent challenge for clinicians and researchers is constructing a conceptual framework to integrate the dialectical work that fosters collaboration, with a model of how clients make progress in therapy. We propose a conceptual account of how collaboration in therapy becomes therapeutic. In addition, we report on the construction of a coding system - the therapeutic collaboration coding system (TCCS) - designed to analyse and track on a moment-by-moment basis the interaction between therapist and client. Preliminary evidence is presented regarding the coding system's psychometric properties. The TCCS evaluates each speaking turn and assesses whether and how therapists are working within the client's therapeutic zone of proximal development, defined as the space between the client's actual therapeutic developmental level and their potential developmental level that can be reached in collaboration with the therapist. We applied the TCCS to five cases: a good and a poor outcome case of narrative therapy, a good and a poor outcome case of cognitive-behavioural therapy, and a dropout case of narrative therapy. The TCCS offers markers that may help researchers better understand the therapeutic collaboration on a moment-to-moment basis and may help therapists better regulate the relationship. © 2012 The British Psychological Society.

  7. A Monte Carlo code for the fragmentation of polarized quarks

    NASA Astrophysics Data System (ADS)

    Kerbizi, A.; Artru, X.; Belghobsi, Z.; Bradamante, F.; Martin, A.

    2017-12-01

    We describe a Monte Carlo code for the fragmentation of polarized quarks into pseudoscalar mesons. The quark jet is generated by iteration of the splitting q → h + q′, where q and q′ indicate quarks and h a hadron. The splitting function describing the energy sharing between q′ and h is calculated on the basis of the Symmetric Lund Model, where the quark spin is introduced through spin matrices as foreseen in the ³P₀ mechanism. A complex mass parameter is introduced for the parametrisation of the Collins effect. The results for the Collins analysing power and the comparison with the Collins asymmetries measured by the COMPASS collaboration are presented. For the first time, preliminary results on the simulated azimuthal asymmetry due to the Boer-Mulders function are also given.
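
    To make the recursive splitting concrete, here is a deliberately stripped-down toy: it iterates q → h + q′ with a Lund-like energy-sharing function but omits the spin matrices, the ³P₀ mechanism, and the complex mass parameter that are the substance of the actual code. All parameter values are illustrative.

      import math, random

      def sample_z(a=0.6, b=0.3, mT2=0.25):
          """Rejection-sample the energy fraction z from a Lund-like
          splitting function f(z) ~ (1/z) (1-z)**a exp(-b*mT2/z)."""
          f = lambda z: (1.0 / z) * (1.0 - z) ** a * math.exp(-b * mT2 / z)
          fmax = max(f(k / 1000.0) for k in range(1, 1000))
          while True:
              z = random.uniform(1e-3, 1.0 - 1e-3)
              if random.uniform(0.0, fmax) < f(z):
                  return z

      def fragment(E_quark, E_min=1.0):
          """Iterate q -> h + q' until the leftover energy drops below E_min;
          returns the hadron energies (spin and flavour are ignored)."""
          hadrons, E = [], E_quark
          while E > E_min:
              z = sample_z()
              hadrons.append(z * E)   # hadron h takes fraction z
              E *= 1.0 - z            # quark q' continues the chain
          return hadrons

      print(fragment(50.0))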

  8. Code Sharing and Collaboration: Experiences from the Scientist's Expert Assistant Project and their Relevance to the Virtual Observatory

    NASA Technical Reports Server (NTRS)

    Jones, Jeremy; Grosvenor, Sandy; Wolf, Karl; Li, Connie; Koratkar, Anuradha; Powers, Edward I. (Technical Monitor)

    2001-01-01

    In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing between groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for SOFIA, the SIRTF planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA - both successes and failures - and offer some lessons learned that may promote further successes in collaboration and re-use.

  9. Code Sharing and Collaboration: Experiences From the Scientist's Expert Assistant Project and Their Relevance to the Virtual Observatory

    NASA Technical Reports Server (NTRS)

    Koratkar, Anuradha; Grosvenor, Sandy; Jones, Jeremy; Li, Connie; Mackey, Jennifer; Neher, Ken; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    In the Virtual Observatory (VO), software tools will perform the functions that have traditionally been performed by physical observatories and their instruments. These tools will not be adjuncts to VO functionality but will make up the very core of the VO. Consequently, the tradition of observatory and system independent tools serving a small user base is not valid for the VO. For the VO to succeed, we must improve software collaboration and code sharing between projects and groups. A significant goal of the Scientist's Expert Assistant (SEA) project has been promoting effective collaboration and code sharing among groups. During the past three years, the SEA project has been developing prototypes for new observation planning software tools and strategies. Initially funded by the Next Generation Space Telescope, parts of the SEA code have since been adopted by the Space Telescope Science Institute. SEA has also supplied code for the SIRTF (Space Infrared Telescope Facility) planning tools, and the JSky Open Source Java library. The potential benefits of sharing code are clear. The recipient gains functionality for considerably less cost. The provider gains additional developers working with their code. If enough user groups adopt a set of common code and tools, de facto standards can emerge (as demonstrated by the success of the FITS standard). Code sharing also raises a number of challenges related to the management of the code. In this talk, we will review our experiences with SEA, both successes and failures, and offer some lessons learned that might promote further successes in collaboration and re-use.

  10. Global Comparators Project: International Comparison of Hospital Outcomes Using Administrative Data

    PubMed Central

    Bottle, Alex; Middleton, Steven; Kalkman, Cor J; Livingston, Edward H; Aylin, Paul

    2013-01-01

    Objective. To produce comparable risk-adjusted outcome rates for an international sample of hospitals in a collaborative project to share outcomes and learning. Data Sources. Administrative data varying in scope, format, and coding systems were pooled from each participating hospital for the years 2005–2010. Study Design. Following reconciliation of the different coding systems in the various countries, in-hospital mortality, unplanned readmission within 30 days, and “prolonged” hospital stay (>75th percentile) were risk-adjusted via logistic regression. A web-based interface was created to facilitate outcomes analysis for individual medical centers and enable peer comparisons. Small groups of clinicians are now exploring the potential reasons for variations in outcomes in their specialty. Principal Findings. There were 6,737,211 inpatient records, including 214,622 in-hospital deaths. Although diagnostic coding depth varied appreciably by country, comorbidity weights were broadly comparable. U.S. hospitals generally had the lowest mortality rates, shortest stays, and highest readmission rates. Conclusions. Intercountry differences in outcomes may result from differences in the quality of care or in practice patterns driven by socio-economic factors. Carefully managed administrative data can be an effective resource for initiating dialog between hospitals within and across countries. Inclusion of important outcomes beyond hospital discharge would increase the value of these analyses. PMID:23742025
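
    The risk adjustment described above amounts to fitting a logistic model on patient-level predictors and then comparing observed with expected event counts per hospital. The sketch below shows that pattern with invented data and column names; it is not the Global Comparators project's actual model specification.

      import pandas as pd
      from sklearn.linear_model import LogisticRegression

      # Invented patient-level records; column names are illustrative only.
      df = pd.DataFrame({
          "died":        [0, 0, 1, 0, 1, 0, 0, 1, 0, 0],
          "age":         [64, 52, 81, 45, 77, 59, 70, 85, 38, 66],
          "emergency":   [0, 0, 1, 0, 1, 0, 1, 1, 0, 0],
          "comorbidity": [2, 0, 5, 1, 4, 1, 3, 6, 0, 2],
          "hospital":    ["A", "A", "A", "B", "B", "B", "B", "A", "B", "A"],
      })

      # Fit the risk model and compute each patient's expected mortality.
      X, y = df[["age", "emergency", "comorbidity"]], df["died"]
      df["expected"] = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]

      # Risk-adjusted comparison: observed vs. expected deaths per hospital.
      agg = df.groupby("hospital")[["died", "expected"]].sum()
      print(agg["died"] / agg["expected"])   # observed/expected mortality ratio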

  11. Finding collaborators: toward interactive discovery tools for research network systems.

    PubMed

    Borromeo, Charles D; Schleyer, Titus K; Becich, Michael J; Hochheiser, Harry

    2014-11-04

    Research networking systems hold great promise for helping biomedical scientists identify collaborators with the expertise needed to build interdisciplinary teams. Although efforts to date have focused primarily on collecting and aggregating information, less attention has been paid to the design of end-user tools for using these collections to identify collaborators. To be effective, collaborator search tools must provide researchers with easy access to information relevant to their collaboration needs. The aim was to study user requirements and preferences for research networking system collaborator search tools and to design and evaluate a functional prototype. Paper prototypes exploring possible interface designs were presented to 18 participants in semistructured interviews aimed at eliciting collaborator search needs. Interview data were coded and analyzed to identify recurrent themes and related software requirements. Analysis results and elements from paper prototypes were used to design a Web-based prototype using the D3 JavaScript library and VIVO data. Preliminary usability studies asked 20 participants to use the tool and to provide feedback through semistructured interviews and completion of the System Usability Scale (SUS). Initial interviews identified consensus regarding several novel requirements for collaborator search tools, including chronological display of publication and research funding information, the need for conjunctive keyword searches, and tools for tracking candidate collaborators. Participant responses were positive (SUS score: mean 76.4%, SD 13.9). Opportunities for improving the interface design were identified. Interactive, timeline-based displays that support comparison of researcher productivity in funding and publication have the potential to effectively support searching for collaborators. Further refinement and longitudinal studies may be needed to better understand the implications of collaborator search tools for researcher workflows.

  12. Finding Collaborators: Toward Interactive Discovery Tools for Research Network Systems

    PubMed Central

    Schleyer, Titus K; Becich, Michael J; Hochheiser, Harry

    2014-01-01

    Background Research networking systems hold great promise for helping biomedical scientists identify collaborators with the expertise needed to build interdisciplinary teams. Although efforts to date have focused primarily on collecting and aggregating information, less attention has been paid to the design of end-user tools for using these collections to identify collaborators. To be effective, collaborator search tools must provide researchers with easy access to information relevant to their collaboration needs. Objective The aim was to study user requirements and preferences for research networking system collaborator search tools and to design and evaluate a functional prototype. Methods Paper prototypes exploring possible interface designs were presented to 18 participants in semistructured interviews aimed at eliciting collaborator search needs. Interview data were coded and analyzed to identify recurrent themes and related software requirements. Analysis results and elements from paper prototypes were used to design a Web-based prototype using the D3 JavaScript library and VIVO data. Preliminary usability studies asked 20 participants to use the tool and to provide feedback through semistructured interviews and completion of the System Usability Scale (SUS). Results Initial interviews identified consensus regarding several novel requirements for collaborator search tools, including chronological display of publication and research funding information, the need for conjunctive keyword searches, and tools for tracking candidate collaborators. Participant responses were positive (SUS score: mean 76.4%, SD 13.9). Opportunities for improving the interface design were identified. Conclusions Interactive, timeline-based displays that support comparison of researcher productivity in funding and publication have the potential to effectively support searching for collaborators. Further refinement and longitudinal studies may be needed to better understand the implications of collaborator search tools for researcher workflows. PMID:25370463

  13. Calibrated Blade-Element/Momentum Theory Aerodynamic Model of the MARIN Stock Wind Turbine: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goupee, A.; Kimball, R.; de Ridder, E. J.

    2015-04-02

    In this paper, a calibrated blade-element/momentum theory aerodynamic model of the MARIN stock wind turbine is developed and documented. The model is created using open-source software and calibrated to closely emulate experimental data obtained by the DeepCwind Consortium using a genetic algorithm optimization routine. The provided model will be useful for those interested in validating floating wind turbine numerical simulators that rely on experiments utilizing the MARIN stock wind turbine, for example, the International Energy Agency Wind Task 30's Offshore Code Comparison Collaboration Continued, with Correlation project.
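
    The calibration step described above, fitting model parameters to experimental data with a genetic algorithm, can be sketched generically as below. The two-parameter thrust curve, the data, and all settings are invented stand-ins for the actual blade-element/momentum model and DeepCwind measurements.

      import numpy as np

      rng = np.random.default_rng(0)

      def thrust_model(u, params):
          """Stand-in for a BEM evaluation: a hypothetical 2-parameter curve."""
          c1, c2 = params
          return c1 * u**2 * np.exp(-c2 * u)

      # Invented experimental thrust data to calibrate against.
      u_exp = np.array([5.0, 7.0, 9.0, 11.0, 13.0])
      t_exp = np.array([180.0, 310.0, 420.0, 470.0, 450.0])

      def fitness(p):
          return -np.sum((thrust_model(u_exp, p) - t_exp) ** 2)

      pop = rng.uniform([1.0, 0.0], [20.0, 0.5], size=(40, 2))      # initial population
      for _ in range(200):
          scores = np.array([fitness(p) for p in pop])
          parents = pop[np.argsort(scores)][-20:]                   # keep fittest half
          children = parents[rng.integers(0, 20, 20)] \
                     + rng.normal(0, [0.5, 0.02], (20, 2))          # mutate copies
          pop = np.vstack([parents, children])                      # elitism + mutation

      best = pop[np.argmax([fitness(p) for p in pop])]
      print("calibrated parameters:", best)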

  14. Modeling Laser-Driven Laboratory Astrophysics Experiments Using the CRASH Code

    NASA Astrophysics Data System (ADS)

    Grosskopf, Michael; Keiter, P.; Kuranz, C. C.; Malamud, G.; Trantham, M.; Drake, R.

    2013-06-01

    Laser-driven laboratory astrophysics experiments can provide important insight into the physical processes relevant to astrophysical systems. The radiation hydrodynamics code developed by the Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan has been used to model experimental designs for high-energy-density laboratory astrophysics campaigns on OMEGA and other high-energy laser facilities. This code is an Eulerian, block-adaptive AMR hydrodynamics code with implicit multigroup radiation transport and electron heat conduction. The CRASH model has been used for many applications, including radiative shocks, Kelvin-Helmholtz, and Rayleigh-Taylor experiments on the OMEGA laser, as well as laser-driven ablative plumes in experiments by the Astrophysical Collisionless Shocks Experiments with Lasers (ACSEL) collaboration. We report a series of results with the CRASH code in support of design work for upcoming high-energy-density physics experiments, as well as comparison between existing experimental data and simulation results. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via grant DE-FC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, grant number DE-FG52-09NA29548, and by the National Laser User Facility Program, grant number DE-NA0000850.

  15. Modeling Laboratory Astrophysics Experiments in the High-Energy-Density Regime Using the CRASH Radiation-Hydrodynamics Model

    NASA Astrophysics Data System (ADS)

    Grosskopf, M. J.; Drake, R. P.; Trantham, M. R.; Kuranz, C. C.; Keiter, P. A.; Rutter, E. M.; Sweeney, R. M.; Malamud, G.

    2012-10-01

    The radiation hydrodynamics code developed by the Center for Radiative Shock Hydrodynamics (CRASH) at the University of Michigan has been used to model experimental designs for high-energy-density physics campaigns on OMEGA and other high-energy laser facilities. This code is an Eulerian, block-adaptive AMR hydrodynamics code with implicit multigroup radiation transport and electron heat conduction. CRASH model results have shown good agreement with experimental results from a variety of applications, including radiative shock, Kelvin-Helmholtz, and Rayleigh-Taylor experiments on the OMEGA laser, as well as laser-driven ablative plumes in experiments by the Astrophysical Collisionless Shocks Experiments with Lasers (ACSEL) collaboration. We report a series of results with the CRASH code in support of design work for upcoming high-energy-density physics experiments, as well as comparison between existing experimental data and simulation results. This work is funded by the Predictive Sciences Academic Alliances Program in NNSA-ASC via grant DE-FC52-08NA28616, by the NNSA-DS and SC-OFES Joint Program in High-Energy-Density Laboratory Plasmas, grant number DE-FG52-09NA29548, and by the National Laser User Facility Program, grant number DE-NA0000850.

  16. Modelling of LOCA Tests with the BISON Fuel Performance Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williamson, Richard L; Pastore, Giovanni; Novascone, Stephen Rhead

    2016-05-01

    BISON is a modern finite-element based, multidimensional nuclear fuel performance code that is under development at Idaho National Laboratory (USA). Recent advances of BISON include the extension of the code to the analysis of LWR fuel rod behaviour during loss-of-coolant accidents (LOCAs). In this work, BISON models for the phenomena relevant to LWR cladding behaviour during LOCAs are described, followed by presentation of code results for the simulation of LOCA tests. Analysed experiments include separate effects tests of cladding ballooning and burst, as well as the Halden IFA-650.2 fuel rod test. Two-dimensional modelling of the experiments is performed, and calculations are compared to available experimental data. Comparisons include cladding burst pressure and temperature in separate effects tests, as well as the evolution of fuel rod inner pressure during ballooning and time to cladding burst. Furthermore, BISON three-dimensional simulations of separate effects tests are performed, which demonstrate the capability to reproduce the effect of azimuthal temperature variations in the cladding. The work has been carried out in the frame of the collaboration between Idaho National Laboratory and Halden Reactor Project, and the IAEA Coordinated Research Project FUMAC.

  17. Coronal Physics and the Chandra Emission Line Project

    NASA Technical Reports Server (NTRS)

    Brickhouse, Nancy

    1999-01-01

    With the launch of the Chandra X-ray Observatory, high resolution X-ray spectroscopy of cosmic sources has begun. Early, deep observations of three stellar coronal sources will provide not only invaluable calibration data, but will also give us benchmarks for plasma spectral modeling codes. These codes are used to interpret data from stellar coronae, galaxies and clusters of galaxies, supernova remnants, and other astrophysical sources, but they have been called into question in recent years as problems with understanding moderate resolution ASCA and EUVE data have arisen. The Emission Line Project is a collaborative effort to improve the models, with Phase 1 being the comparison of models with observed spectra of Capella, Procyon, and HR 1099. Goals of these comparisons are (1) to determine and verify accurate and robust diagnostics and (2) to identify and prioritize issues in fundamental spectroscopy which will require further theoretical and/or laboratory work. A critical issue in exploiting the coronal data for these purposes is to understand the extent to which common simplifying assumptions (coronal equilibrium, time-independence, negligible optical depth) apply. We will discuss recent advances in our understanding of stellar coronae in this context.

  18. Collaborating with the Disability Rights Community: Co-Writing a Code of Ethics as a Vehicle for Ethics Education

    ERIC Educational Resources Information Center

    Tarvydas, Vilia; Hartley, Michael; Jang, Yoo Jin; Johnston, Sara; Moore-Grant, Nykeisha; Walker, Quiteya; O'Hanlon, Chris; Whalen, James

    2012-01-01

    An ethics project is described that challenged students to collaborate with disability rights authorities to co-write a code of ethics for a Center of Independent Living. Experiential and reflective assignments analyzed how the construction of knowledge and language is never value-neutral, and people with disabilities need to have a voice in…

  19. The Development of the World Anti-Doping Code.

    PubMed

    Young, Richard

    2017-01-01

    This chapter addresses both the development and substance of the World Anti-Doping Code, which came into effect in 2003, as well as the subsequent Code amendments, which came into effect in 2009 and 2015. Through an extensive process of stakeholder input and collaboration, the World Anti-Doping Code has transformed the hodgepodge of inconsistent and competing pre-2003 anti-doping rules into a harmonized and effective approach to anti-doping. The Code, as amended, is now recognized worldwide as the gold standard in anti-doping. The World Anti-Doping Code originally went into effect on January 1, 2004. The first amendments to the Code went into effect on January 1, 2009, and the second amendments on January 1, 2015. The Code and the related international standards are the product of a long and collaborative process designed to make the fight against doping more effective through the adoption and implementation of worldwide harmonized rules and best practices. © 2017 S. Karger AG, Basel.

  20. Verification of the New FAST v8 Capabilities for the Modeling of Fixed-Bottom Offshore Wind Turbines: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barahona, B.; Jonkman, J.; Damiani, R.

    2014-12-01

    Coupled dynamic analysis has an important role in the design of offshore wind turbines because the systems are subject to complex operating conditions from the combined action of waves and wind. The aero-hydro-servo-elastic tool FAST v8 is framed in a novel modularization scheme that facilitates such analysis. Here, we present the verification of new capabilities of FAST v8 to model fixed-bottom offshore wind turbines. We analyze a series of load cases with both wind and wave loads and compare the results against those from the previous international code comparison projects: the International Energy Agency (IEA) Wind Task 23 Subtask 2 Offshore Code Comparison Collaboration (OC3) and the IEA Wind Task 30 OC3 Continued (OC4) projects. The verification is performed using the NREL 5-MW reference turbine supported by monopile, tripod, and jacket substructures. The substructure structural-dynamics models are built within the new SubDyn module of FAST v8, which uses a linear finite-element beam model with Craig-Bampton dynamic system reduction. This allows the modal properties of the substructure to be synthesized and coupled to hydrodynamic loads and tower dynamics. The hydrodynamic loads are calculated using a new strip theory approach for multimember substructures in the updated HydroDyn module of FAST v8. These modules are linked to the rest of FAST through the new coupling scheme involving mapping between module-independent spatial discretizations and a numerically rigorous implicit solver. The results show that the new structural dynamics, hydrodynamics, and coupled solutions compare well to the results from the previous code comparison projects.
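
    The Craig-Bampton reduction mentioned above combines static constraint modes at the boundary (interface) DOFs with a truncated set of fixed-interface normal modes. A generic numpy/scipy sketch of that reduction follows; it illustrates the method itself, not SubDyn's implementation, and the tiny example system is invented.

      import numpy as np
      from scipy.linalg import eigh

      def craig_bampton(M, K, b, n_modes):
          """Craig-Bampton reduction of (M, K) onto boundary DOFs `b`
          plus `n_modes` fixed-interface normal modes."""
          n = M.shape[0]
          b = np.asarray(b)
          i = np.setdiff1d(np.arange(n), b)             # interior DOFs
          Kii, Kib = K[np.ix_(i, i)], K[np.ix_(i, b)]
          Mii = M[np.ix_(i, i)]
          # Static constraint modes: interior response to unit boundary motion.
          Phi_c = -np.linalg.solve(Kii, Kib)
          # Fixed-interface normal modes (boundary clamped); keep the lowest few.
          w2, Phi = eigh(Kii, Mii)
          Phi_m = Phi[:, :n_modes]
          # Assemble the transformation u = T @ [u_b; q].
          nb = len(b)
          T = np.zeros((n, nb + n_modes))
          T[np.ix_(b, np.arange(nb))] = np.eye(nb)
          T[np.ix_(i, np.arange(nb))] = Phi_c
          T[np.ix_(i, np.arange(nb, nb + n_modes))] = Phi_m
          return T.T @ M @ T, T.T @ K @ T, T

      # Tiny example: a 4-DOF spring chain with DOF 0 as the boundary.
      K = np.array([[ 2., -1.,  0.,  0.],
                    [-1.,  2., -1.,  0.],
                    [ 0., -1.,  2., -1.],
                    [ 0.,  0., -1.,  1.]])
      M = np.eye(4)
      Mr, Kr, T = craig_bampton(M, K, b=[0], n_modes=2)
      print(Kr.shape)   # (3, 3): 1 boundary DOF + 2 retained modes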

  1. Nomenclature for congenital and paediatric cardiac disease: historical perspectives and The International Pediatric and Congenital Cardiac Code.

    PubMed

    Franklin, Rodney C G; Jacobs, Jeffrey Phillip; Krogmann, Otto N; Béland, Marie J; Aiello, Vera D; Colan, Steven D; Elliott, Martin J; William Gaynor, J; Kurosawa, Hiromi; Maruszewski, Bohdan; Stellin, Giovanni; Tchervenkov, Christo I; Walters III, Henry L; Weinberg, Paul; Anderson, Robert H

    2008-12-01

    Clinicians working in the field of congenital and paediatric cardiology have long felt the need for a common diagnostic and therapeutic nomenclature and coding system with which to classify patients of all ages with congenital and acquired cardiac disease. A cohesive and comprehensive system of nomenclature, suitable for setting a global standard for multicentric analysis of outcomes and stratification of risk, has only recently emerged, namely, The International Paediatric and Congenital Cardiac Code. This review will give a historical perspective on the development of systems of nomenclature in general, and specifically with respect to the diagnosis and treatment of patients with paediatric and congenital cardiac disease. Finally, current and future efforts to merge such systems into the paperless environment of the electronic health or patient record on a global scale are briefly explored. On October 6, 2000, The International Nomenclature Committee for Pediatric and Congenital Heart Disease was established. In January, 2005, the International Nomenclature Committee was constituted in Canada as The International Society for Nomenclature of Paediatric and Congenital Heart Disease. This International Society now has three working groups. The Nomenclature Working Group developed The International Paediatric and Congenital Cardiac Code and will continue to maintain, expand, update, and preserve this International Code. It will also provide ready access to the International Code for the global paediatric and congenital cardiology and cardiac surgery communities, related disciplines, the healthcare industry, and governmental agencies, both electronically and in published form. The Definitions Working Group will write definitions for the terms in the International Paediatric and Congenital Cardiac Code, building on the previously published definitions from the Nomenclature Working Group. The Archiving Working Group, also known as The Congenital Heart Archiving Research Team, will link images and videos to the International Paediatric and Congenital Cardiac Code. The images and videos will be acquired from cardiac morphologic specimens and imaging modalities such as echocardiography, angiography, computerized axial tomography and magnetic resonance imaging, as well as intraoperative images and videos. Efforts are ongoing to expand the usage of The International Paediatric and Congenital Cardiac Code to other areas of global healthcare. Collaborative efforts are underway involving the leadership of The International Nomenclature Committee for Pediatric and Congenital Heart Disease and the representatives of the steering group responsible for the creation of the 11th revision of the International Classification of Diseases, administered by the World Health Organisation. Similar collaborative efforts are underway involving the leadership of The International Nomenclature Committee for Pediatric and Congenital Heart Disease and the International Health Terminology Standards Development Organisation, who are the owners of the Systematized Nomenclature of Medicine or "SNOMED". The International Paediatric and Congenital Cardiac Code was created by specialists in the field to name and classify paediatric and congenital cardiac disease and its treatment. It is a comprehensive code that can be freely downloaded from the internet (http://www.IPCCC.net) and is already in use worldwide, particularly for international comparisons of outcomes. The goal of this effort is to create strategies for stratification of risk and to improve healthcare for the individual patient. The collaboration with the World Health Organization, the International Health Terminology Standards Development Organisation, and the healthcare industry will lead to further enhancement of the International Code and to its more universal use.

  2. A Scandinavian experience of register collaboration: the Nordic Arthroplasty Register Association (NARA).

    PubMed

    Havelin, Leif I; Robertsson, Otto; Fenstad, Anne M; Overgaard, Søren; Garellick, Göran; Furnes, Ove

    2011-12-21

    The Nordic (Scandinavian) countries have had working arthroplasty registers for several years. However, the small numbers of inhabitants and the conformity within each country with respect to preferred prosthesis brands and techniques have limited register research. A collaboration called NARA (Nordic Arthroplasty Register Association) was started in 2007, resulting in a common database for Denmark, Norway, and Sweden with regard to hip replacements in 2008 and primary knee replacements in 2009. Finland joined the project in 2010. A code set was defined for the parameters that all registers had in common, and data were re-coded, within each national register, according to the common definitions. After de-identification of the patients, the anonymous data were merged into a common database. The first study based on this common database included 280,201 hip arthroplasties and the second, 151,814 knee arthroplasties. Kaplan-Meier and Cox multiple regression analyses, with adjustment for age, sex, and diagnosis, were used to calculate prosthesis survival, with any revision as the end point. In later studies, specific reasons for revision were also used as end points. We found differences among the countries concerning patient demographics, preferred surgical approaches, fixation methods, and prosthesis brands. Prosthesis survival was best in Sweden, where cement implant fixation was used more commonly than it was in the other countries. As the comparison of national results was one of the main initial aims of this collaboration, only parameters and data that all three registers could deliver were included in the database. Compared with each separate register, this combined register resulted in reduced numbers of parameters and details. In future collaborations of registers with a focus on comparing the performances of prostheses and articulations, we should probably include only the data needed specifically for the predetermined purposes, from registers that can deliver these data, rather than compiling all data from all registers that are willing to participate.
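
    The survival machinery described above, Kaplan-Meier curves with any revision as the end point and Cox regression adjusted for age, sex, and diagnosis, maps directly onto standard libraries. Below is a sketch using the Python lifelines package on invented data; the column names are illustrative, not the NARA schema.

      import pandas as pd
      from lifelines import KaplanMeierFitter, CoxPHFitter

      # Invented arthroplasty records: follow-up years, revision flag, covariates.
      df = pd.DataFrame({
          "years":   [1.2, 5.0, 3.4, 8.1, 2.2, 6.7, 4.5, 7.3, 0.9, 5.8],
          "revised": [0,   1,   0,   1,   0,   0,   1,   0,   0,   1],
          "age":     [71,  58,  66,  62,  75,  69,  55,  73,  80,  60],
          "male":    [0,   1,   1,   0,   0,   1,   1,   0,   1,   0],
      })

      km = KaplanMeierFitter().fit(df["years"], event_observed=df["revised"])
      print(km.survival_function_)   # prosthesis survival, any revision as end point

      # Cox regression; a small penalizer stabilizes the tiny invented sample.
      cox = CoxPHFitter(penalizer=0.1).fit(df, duration_col="years", event_col="revised")
      cox.print_summary()            # hazard ratios adjusted for age and sex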

  3. Study on Collaborative Object Manipulation in Virtual Environment

    NASA Astrophysics Data System (ADS)

    Mayangsari, Maria Niken; Kwon, Yong-Moo

    This paper presents a comparative study of networked collaboration performance under different degrees of immersion. In particular, the relationship between user collaboration performance and the degree of immersion provided by the system is addressed and compared on the basis of several experiments. The user tests on our system include several cases: 1) Comparison between non-haptics and haptics collaborative interaction over LAN, 2) Comparison between non-haptics and haptics collaborative interaction over Internet, and 3) Analysis of collaborative interaction between non-immersive and immersive display environments.

  4. Peer Interaction in Three Collaborative Learning Environments

    ERIC Educational Resources Information Center

    Staarman, Judith Kleine; Krol, Karen; Meijden, Henny van der

    2005-01-01

    The aim of the study was to gain insight into the occurrence of different types of peer interaction and particularly the types of interaction beneficial for learning in different collaborative learning environments. Based on theoretical notions related to collaborative learning and peer interaction, a coding scheme was developed to analyze the…

  5. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  6. Consensus coding sequence (CCDS) database: a standardized set of human and mouse protein-coding regions supported by expert curation

    PubMed Central

    Pujar, Shashikant; O’Leary, Nuala A; Farrell, Catherine M; Mudge, Jonathan M; Wallin, Craig; Diekhans, Mark; Barnes, If; Bennett, Ruth; Berry, Andrew E; Cox, Eric; Davidson, Claire; Goldfarb, Tamara; Gonzalez, Jose M; Hunt, Toby; Jackson, John; Joardar, Vinita; Kay, Mike P; Kodali, Vamsi K; McAndrews, Monica; McGarvey, Kelly M; Murphy, Michael; Rajput, Bhanu; Rangwala, Sanjida H; Riddick, Lillian D; Seal, Ruth L; Webb, David; Zhu, Sophia; Aken, Bronwen L; Bult, Carol J; Frankish, Adam; Pruitt, Kim D

    2018-01-01

    The Consensus Coding Sequence (CCDS) project provides a dataset of protein-coding regions that are identically annotated on the human and mouse reference genome assembly in genome annotations produced independently by NCBI and the Ensembl group at EMBL-EBI. This dataset is the product of an international collaboration that includes NCBI, Ensembl, HUGO Gene Nomenclature Committee, Mouse Genome Informatics and University of California, Santa Cruz. Identically annotated coding regions, which are generated using an automated pipeline and pass multiple quality assurance checks, are assigned a stable and tracked identifier (CCDS ID). Additionally, coordinated manual review by expert curators from the CCDS collaboration helps in maintaining the integrity and high quality of the dataset. The CCDS data are available through an interactive web page (https://www.ncbi.nlm.nih.gov/CCDS/CcdsBrowse.cgi) and an FTP site (ftp://ftp.ncbi.nlm.nih.gov/pub/CCDS/). In this paper, we outline the ongoing work, growth and stability of the CCDS dataset and provide updates on new collaboration members and new features added to the CCDS user interface. We also present expert curation scenarios, with specific examples highlighting the importance of an accurate reference genome assembly and the crucial role played by input from the research community. PMID:29126148
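
    The FTP area cited above can be queried programmatically. Below is a minimal sketch using Python's standard ftplib; only the host and path come from the record, and the names and formats of the files actually present should be checked against the site's own documentation:

```python
# Minimal sketch: list the public CCDS FTP area named in the abstract.
# Only the host and path are taken from the record; file names and
# layouts on the server are not guaranteed by this sketch.
from ftplib import FTP

with FTP("ftp.ncbi.nlm.nih.gov") as ftp:
    ftp.login()              # anonymous access
    ftp.cwd("pub/CCDS")      # directory given in the abstract's FTP URL
    for name in ftp.nlst():  # top-level entries
        print(name)
```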

  7. Understanding the Components of Quality Improvement Collaboratives: A Systematic Literature Review

    PubMed Central

    Nadeem, Erum; Olin, S Serene; Hill, Laura Campbell; Hoagwood, Kimberly Eaton; Horwitz, Sarah McCue

    2013-01-01

    Context In response to national efforts to improve quality of care, policymakers and health care leaders have increasingly turned to quality improvement collaboratives (QICs) as an efficient approach to improving provider practices and patient outcomes through the dissemination of evidence-based practices. This article presents findings from a systematic review of the literature on QICs, focusing on the identification of common components of QICs in health care and exploring, when possible, relations between QIC components and outcomes at the patient or provider level. Methods A systematic search of five major health care databases generated 294 unique articles, twenty-four of which met our criteria for inclusion in our final analysis. These articles pertained to either randomized controlled trials or quasi-experimental studies with comparison groups, and they reported the findings from twenty different studies of QICs in health care. We coded the articles to identify the components reported for each collaborative. Findings We found fourteen crosscutting components as common ingredients in health care QICs (e.g., in-person learning sessions, phone meetings, data reporting, leadership involvement, and training in QI methods). The collaboratives reported included, on average, six to seven of these components. The most common were in-person learning sessions, plan-do-study-act (PDSA) cycles, multidisciplinary QI teams, and data collection for QI. The outcomes data from these studies indicate the greatest impact of QICs at the provider level; patient-level findings were less robust. Conclusions Reporting on specific components of the collaborative was imprecise across articles, rendering it impossible to identify active QIC ingredients linked to improved care. Although QICs appear to have some promise in improving the process of care, there is great need for further controlled research examining the core components of these collaboratives related to patient- and provider-level outcomes. PMID:23758514

  8. Pretest aerosol code comparisons for LWR aerosol containment tests LA1 and LA2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wright, A.L.; Wilson, J.H.; Arwood, P.C.

    The Light-Water-Reactor (LWR) Aerosol Containment Experiments (LACE) are being performed in Richland, Washington, at the Hanford Engineering Development Laboratory (HEDL) under the leadership of an international project board and the Electric Power Research Institute. These tests have two objectives: (1) to investigate, at large scale, the inherent aerosol retention behavior in LWR containments under simulated severe accident conditions, and (2) to provide an experimental data base for validating aerosol behavior and thermal-hydraulic computer codes. Aerosol computer-code comparison activities are being coordinated at the Oak Ridge National Laboratory. For each of the six LACE tests, "pretest" calculations (for code-to-code comparisons) and "posttest" calculations (for code-to-test data comparisons) are being performed. The overall goals of the comparison effort are (1) to provide code users with experience in applying their codes to LWR accident-sequence conditions and (2) to evaluate and improve the code models.

  9. NATO Code of Best Practice for Command and Control Assessment (Code OTAN des meilleures pratiques pour l’evaluation du commandement et du controle)

    DTIC Science & Technology

    2004-01-01

    Only fragments of this record's abstract are recoverable from the source. They indicate that the collaboration C2 (command and control) metrics presented evolved out of work done by Evidence Based Research, Inc., and that the report's glossary includes entries such as OOTW (Operations Other Than War) and PESTLE (Political, Economic, Social, Technological, ...).

  10. Genomics dataset on unclassified published organism (patent US 7547531).

    PubMed

    Khan Shawan, Mohammad Mahfuz Ali; Hasan, Md Ashraful; Hossain, Md Mozammel; Hasan, Md Mahmudul; Parvin, Afroza; Akter, Salina; Uddin, Kazi Rasel; Banik, Subrata; Morshed, Mahbubul; Rahman, Md Nazibur; Rahman, S M Badier

    2016-12-01

    Nucleotide (DNA) sequence analysis provides important clues regarding the characteristics and taxonomic position of an organism, and is therefore crucial for learning about the hierarchical classification of that particular organism. This dataset (patent US 7547531) was chosen to make accessible the complex raw data buried in undisclosed DNA sequences, which may open doors for new collaborations. In this data, a total of 48 unidentified DNA sequences from patent US 7547531 were selected and their complete sequences were retrieved from the NCBI BioSample database. Quick response (QR) codes for those DNA sequences were constructed with the DNA BarID tool. QR codes are useful for the identification and comparison of isolates with other organisms. The AT/GC content of the DNA sequences was determined using the ENDMEMO GC Content Calculator, which indicates their stability at different temperatures. The highest GC content was observed in GP445188 (62.5%), which was followed by GP445198 (61.8%) and GP445189 (59.44%), while the lowest was in GP445178 (24.39%). In addition, the New England BioLabs (NEB) database was used to identify cleavage codes indicating 5', 3' and blunt ends, and enzyme codes indicating the methylation sites of the DNA sequences. These data will be helpful for the construction of the organisms' hierarchical classification, determination of their phylogenetic and taxonomic position and revelation of their molecular characteristics.
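
    The GC-content figures quoted above reduce to a simple base-count ratio. A minimal sketch of that calculation (the input sequence is a made-up placeholder, not one of the 48 patent sequences):

```python
# Minimal sketch of the GC-content calculation reported above.
# The input below is a made-up placeholder, not a patent sequence.
def gc_content(seq: str) -> float:
    """Return the percentage of G and C bases in a DNA sequence."""
    seq = seq.upper()
    return 100.0 * (seq.count("G") + seq.count("C")) / len(seq)

print(f"{gc_content('ATGCGCGTTAACCGGG'):.2f}%")  # 62.50% for this toy input
```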

  11. Transfer of Real-time Dynamic Radiation Environment Assimilation Model; Research to Operation

    NASA Astrophysics Data System (ADS)

    Cho, K. S. F.; Hwang, J.; Shin, D. K.; Kim, G. J.; Morley, S.; Henderson, M. G.; Friedel, R. H.; Reeves, G. D.

    2015-12-01

    The Real-time Dynamic Radiation Environment Assimilation Model (rtDREAM) was developed by LANL to nowcast energetic-electron flux in the radiation belt and to quantify potential risks of radiation damage to satellites. Assimilated data come from multiple sources, including LANL assets (GEO, GPS). To transfer the rtDREAM code from research to operations, LANL, KSWC, and NOAA established a Memorandum of Understanding (MOU) on collaboration among the three parties. Under this MOU, KSWC/RRA provides all the support for transitioning the research version of DREAM to operations. KASI is primarily responsible for providing all the interfaces between the current scientific output formats of the code and useful space weather products that can be used and accessed through the web. In the second phase, KASI will be responsible for performing the work needed to transform the Van Allen Probes beacon data into "DREAM ready" inputs. KASI will also provide the "operational" code framework and additional data preparation, model output, display and web page codes back to LANL and SWPC. KASI is already a NASA partnering ground station for the Van Allen Probes' space weather beacon data and can demonstrate the use and utility of these data through web-based comparison between rtDREAM and observations. NOAA has offered to take on some of the data processing tasks specific to the GOES data.

  12. Consensus coding sequence (CCDS) database: a standardized set of human and mouse protein-coding regions supported by expert curation.

    PubMed

    Pujar, Shashikant; O'Leary, Nuala A; Farrell, Catherine M; Loveland, Jane E; Mudge, Jonathan M; Wallin, Craig; Girón, Carlos G; Diekhans, Mark; Barnes, If; Bennett, Ruth; Berry, Andrew E; Cox, Eric; Davidson, Claire; Goldfarb, Tamara; Gonzalez, Jose M; Hunt, Toby; Jackson, John; Joardar, Vinita; Kay, Mike P; Kodali, Vamsi K; Martin, Fergal J; McAndrews, Monica; McGarvey, Kelly M; Murphy, Michael; Rajput, Bhanu; Rangwala, Sanjida H; Riddick, Lillian D; Seal, Ruth L; Suner, Marie-Marthe; Webb, David; Zhu, Sophia; Aken, Bronwen L; Bruford, Elspeth A; Bult, Carol J; Frankish, Adam; Murphy, Terence; Pruitt, Kim D

    2018-01-04

    The Consensus Coding Sequence (CCDS) project provides a dataset of protein-coding regions that are identically annotated on the human and mouse reference genome assembly in genome annotations produced independently by NCBI and the Ensembl group at EMBL-EBI. This dataset is the product of an international collaboration that includes NCBI, Ensembl, HUGO Gene Nomenclature Committee, Mouse Genome Informatics and University of California, Santa Cruz. Identically annotated coding regions, which are generated using an automated pipeline and pass multiple quality assurance checks, are assigned a stable and tracked identifier (CCDS ID). Additionally, coordinated manual review by expert curators from the CCDS collaboration helps in maintaining the integrity and high quality of the dataset. The CCDS data are available through an interactive web page (https://www.ncbi.nlm.nih.gov/CCDS/CcdsBrowse.cgi) and an FTP site (ftp://ftp.ncbi.nlm.nih.gov/pub/CCDS/). In this paper, we outline the ongoing work, growth and stability of the CCDS dataset and provide updates on new collaboration members and new features added to the CCDS user interface. We also present expert curation scenarios, with specific examples highlighting the importance of an accurate reference genome assembly and the crucial role played by input from the research community. Published by Oxford University Press on behalf of Nucleic Acids Research 2017.

  13. Analyzing Interactions by an IIS-Map-Based Method in Face-to-Face Collaborative Learning: An Empirical Study

    ERIC Educational Resources Information Center

    Zheng, Lanqin; Yang, Kaicheng; Huang, Ronghuai

    2012-01-01

    This study proposes a new method named the IIS-map-based method for analyzing interactions in face-to-face collaborative learning settings. This analysis method is conducted in three steps: firstly, drawing an initial IIS-map according to collaborative tasks; secondly, coding and segmenting information flows into information items of IIS; thirdly,…

  14. Programming (Tips) for Physicists & Engineers

    ScienceCinema

    Ozcan, Erkcan

    2018-02-19

    Programming for today's physicists and engineers. Work environment: today's astroparticle, accelerator experiments and information industry rely on large collaborations. Need more than ever: code sharing/reuse, code building--framework integration, documentation and good visualization, working remotely, not reinventing the wheel.

  15. Programming (Tips) for Physicists & Engineers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ozcan, Erkcan

    2010-07-13

    Programming for today's physicists and engineers. Work environment: today's astroparticle, accelerator experiments and information industry rely on large collaborations. Need more than ever: code sharing/reuse, code building--framework integration, documentation and good visualization, working remotely, not reinventing the wheel.

  16. The ADVANCE Code of Conduct for collaborative vaccine studies.

    PubMed

    Kurz, Xavier; Bauchau, Vincent; Mahy, Patrick; Glismann, Steffen; van der Aa, Lieke Maria; Simondon, François

    2017-04-04

    Lessons learnt from the 2009 (H1N1) flu pandemic highlighted factors limiting the capacity to collect European data on vaccine exposure, safety and effectiveness, including lack of rapid access to available data sources or expertise, difficulties in establishing efficient interactions between multiple parties, lack of confidence between private and public sectors, concerns about possible or actual conflicts of interest (or perceptions thereof) and inadequate funding mechanisms. The Innovative Medicines Initiative's Accelerated Development of VAccine benefit-risk Collaboration in Europe (ADVANCE) consortium was established to create an efficient and sustainable infrastructure for rapid and integrated monitoring of post-approval benefit-risk of vaccines, including a code of conduct and governance principles for collaborative studies. The development of the code of conduct was guided by three core and common values (best science, strengthening public health, transparency) and a review of existing guidance and relevant published articles. The ADVANCE Code of Conduct includes 45 recommendations in 10 topics (Scientific integrity, Scientific independence, Transparency, Conflicts of interest, Study protocol, Study report, Publication, Subject privacy, Sharing of study data, Research contract). Each topic includes a definition, a set of recommendations and a list of additional reading. The concept of the study team is introduced as a key component of the ADVANCE Code of Conduct with a core set of roles and responsibilities. It is hoped that adoption of the ADVANCE Code of Conduct by all partners involved in a study will facilitate and speed up its initiation, design, conduct and reporting. Adoption of the ADVANCE Code of Conduct should be stated in the study protocol, study report and publications, and journal editors are encouraged to use it as an indication that good principles of public health, science and transparency were followed throughout the study. Copyright © 2017. Published by Elsevier Ltd.

  17. FAST Model Calibration and Validation of the OC5-DeepCwind Floating Offshore Wind System Against Wave Tank Test Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian F; Robertson, Amy N; Jonkman, Jason

    During the course of the Offshore Code Comparison Collaboration, Continued, with Correlation (OC5) project, which focused on the validation of numerical methods through comparison against tank test data, the authors created a numerical FAST model of the 1:50-scale DeepCwind semisubmersible system that was tested at the Maritime Research Institute Netherlands ocean basin in 2013. This paper discusses several model calibration studies that were conducted to identify model adjustments that improve the agreement between the numerical simulations and the experimental test data. These calibration studies cover wind-field-specific parameters (coherence, turbulence), hydrodynamic and aerodynamic modeling approaches, as well as rotor model (blade-pitch and blade-mass imbalances) and tower model (structural tower damping coefficient) adjustments. These calibration studies were conducted based on relatively simple calibration load cases (wave only/wind only). The agreement between the final FAST model and experimental measurements is then assessed based on more-complex combined wind and wave validation cases.

  18. Collaboration and critical thinking in an online chemistry environment

    NASA Astrophysics Data System (ADS)

    Kershisnik, Elizabeth Irene

    The purpose of this dissertation was to examine collaboration and students' critical thinking and cognitive achievement within online chemistry courses. This quantitative study focused on the apparent lack of research relating collaboration and critical thinking in online science courses. Collaboration was determined using the small group collaboration model coding scheme, which examined student postings in asynchronous discussion forums for quantity, equality, and shareness. Critical thinking was measured using the chemistry concept reasoning test, the online self-diagnostic test, and also asynchronous student homework discussion postings that were coded using the community of inquiry cognitive presence indicators. Finally, cognitive achievement was determined using quiz scores and the students' final grades. Even though no significant findings were revealed in this exploratory quasi-experimental study, this research did add to the educational technology knowledge base since very few studies have investigated the chemistry discipline in an online environment. Continued research in this area is vital to understanding how critical thinking progresses, how it can be assessed, and what factors in the classroom, be it virtual or face-to-face, have the greatest effect on critical thinking.

  19. CBP Toolbox Version 3.0 “Beta Testing” Performance Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, III, F. G.

    2016-07-29

    One function of the Cementitious Barriers Partnership (CBP) is to assess available models of cement degradation and to assemble suitable models into a “Toolbox” that would be made available to members of the partnership, as well as the DOE Complex. To this end, SRNL and Vanderbilt University collaborated to develop an interface, using the GoldSim software, to the STADIUM® code developed by SIMCO Technologies, Inc. and to LeachXS/ORCHESTRA developed by the Energy research Centre of the Netherlands (ECN). Release of Version 3.0 of the CBP Toolbox is planned in the near future. As a part of this release, an increased level of quality assurance for the partner codes and the GoldSim interface has been developed. This report documents results from evaluation testing of the ability of CBP Toolbox 3.0 to perform simulations of concrete degradation applicable to performance assessment of waste disposal facilities. Simulations of the behavior of Savannah River Saltstone Vault 2 and Vault 1/4 concrete subject to sulfate attack and carbonation over a 500- to 1000-year time period were run using a new and upgraded version of the STADIUM® code and the version of LeachXS/ORCHESTRA released in Version 2.0 of the CBP Toolbox. Running both codes allowed comparison of results from two models which take very different approaches to simulating cement degradation. In addition, simulations of chloride attack on the two concretes were made using the STADIUM® code. The evaluation sought to demonstrate that: 1) the codes are capable of running extended realistic simulations in a reasonable amount of time; 2) the codes produce “reasonable” results (the code developers have provided validation test results as part of their code QA documentation); and 3) the two codes produce results that are consistent with one another. Results of the evaluation testing showed that the three criteria listed above were met by the CBP partner codes. Therefore, it is concluded that the codes can be used to support performance assessment. This conclusion takes into account the QA documentation produced for the partner codes and for the CBP Toolbox.

  20. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process; not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical solutions via the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development and advocate similar procedures for other scientific code applications.
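
    The "method of manufactured solutions" mentioned above can be shown compactly: choose an exact solution, derive the forcing term it implies, solve numerically, and confirm the error decays at the scheme's design order. A minimal sketch for a 1-D Poisson problem (an illustration of the technique, not one of Fluidity's actual tests):

```python
# Minimal sketch of code verification by the method of manufactured
# solutions: manufacture u(x) = sin(pi x), so -u'' = pi^2 sin(pi x),
# solve with second-order finite differences, and check the order.
# This 1-D Poisson setup is illustrative, not a Fluidity test case.
import numpy as np

def solve_poisson(n):
    """Solve -u'' = f on (0,1) with u(0) = u(1) = 0 on n interior nodes."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    f = np.pi**2 * np.sin(np.pi * x)  # manufactured source term
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    return x, np.linalg.solve(A, f)

errors = []
for n in (32, 64, 128):
    x, u = solve_poisson(n)
    errors.append(np.max(np.abs(u - np.sin(np.pi * x))))

# Successive error ratios should give an observed order near 2.0
print(np.log2(np.array(errors[:-1]) / np.array(errors[1:])))
```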

  1. The National Transport Code Collaboration Module Library

    NASA Astrophysics Data System (ADS)

    Kritz, A. H.; Bateman, G.; Kinsey, J.; Pankin, A.; Onjun, T.; Redd, A.; McCune, D.; Ludescher, C.; Pletzer, A.; Andre, R.; Zakharov, L.; Lodestro, L.; Pearlstein, L. D.; Jong, R.; Houlberg, W.; Strand, P.; Wiley, J.; Valanju, P.; John, H. St.; Waltz, R.; Mandrekas, J.; Mau, T. K.; Carlsson, J.; Braams, B.

    2004-12-01

    This paper reports on the progress in developing a library of code modules under the auspices of the National Transport Code Collaboration (NTCC). Code modules are high quality, fully documented software packages with a clearly defined interface. The modules provide a variety of functions, such as implementing numerical physics models; performing ancillary functions such as I/O or graphics; or providing tools for dealing with common issues in scientific programming such as portability of Fortran codes. Researchers in the plasma community submit code modules, and a review procedure is followed to ensure adherence to programming and documentation standards. The review process is designed to provide added confidence with regard to the use of the modules and to allow users and independent reviewers to validate the claims of the modules' authors. All modules include source code; clear instructions for compilation of binaries on a variety of target architectures; and test cases with well-documented input and output. All the NTCC modules and ancillary information, such as current standards and documentation, are available from the NTCC Module Library Website http://w3.pppl.gov/NTCC. The goal of the project is to develop a resource of value to builders of integrated modeling codes and to plasma physics researchers generally. Currently, there are more than 40 modules in the module library.

  2. Error-analysis and comparison to analytical models of numerical waveforms produced by the NRAR Collaboration

    NASA Astrophysics Data System (ADS)

    Hinder, Ian; Buonanno, Alessandra; Boyle, Michael; Etienne, Zachariah B.; Healy, James; Johnson-McDaniel, Nathan K.; Nagar, Alessandro; Nakano, Hiroyuki; Pan, Yi; Pfeiffer, Harald P.; Pürrer, Michael; Reisswig, Christian; Scheel, Mark A.; Schnetter, Erik; Sperhake, Ulrich; Szilágyi, Bela; Tichy, Wolfgang; Wardell, Barry; Zenginoğlu, Anıl; Alic, Daniela; Bernuzzi, Sebastiano; Bode, Tanja; Brügmann, Bernd; Buchman, Luisa T.; Campanelli, Manuela; Chu, Tony; Damour, Thibault; Grigsby, Jason D.; Hannam, Mark; Haas, Roland; Hemberger, Daniel A.; Husa, Sascha; Kidder, Lawrence E.; Laguna, Pablo; London, Lionel; Lovelace, Geoffrey; Lousto, Carlos O.; Marronetti, Pedro; Matzner, Richard A.; Mösta, Philipp; Mroué, Abdul; Müller, Doreen; Mundim, Bruno C.; Nerozzi, Andrea; Paschalidis, Vasileios; Pollney, Denis; Reifenberger, George; Rezzolla, Luciano; Shapiro, Stuart L.; Shoemaker, Deirdre; Taracchini, Andrea; Taylor, Nicholas W.; Teukolsky, Saul A.; Thierfelder, Marcus; Witek, Helvi; Zlochower, Yosef

    2013-01-01

    The Numerical-Relativity-Analytical-Relativity (NRAR) collaboration is a joint effort between members of the numerical relativity, analytical relativity and gravitational-wave data analysis communities. The goal of the NRAR collaboration is to produce numerical-relativity simulations of compact binaries and use them to develop accurate analytical templates for the LIGO/Virgo Collaboration to use in detecting gravitational-wave signals and extracting astrophysical information from them. We describe the results of the first stage of the NRAR project, which focused on producing an initial set of numerical waveforms from binary black holes with moderate mass ratios and spins, as well as one non-spinning binary configuration which has a mass ratio of 10. All of the numerical waveforms are analysed in a uniform and consistent manner, with numerical errors evaluated using an analysis code created by members of the NRAR collaboration. We compare previously calibrated, non-precessing analytical waveforms, notably the effective-one-body (EOB) and phenomenological template families, to the newly produced numerical waveforms. We find that when the binary's total mass is ~100-200M⊙, current EOB and phenomenological models of spinning, non-precessing binary waveforms have overlaps above 99% (for advanced LIGO) with all of the non-precessing-binary numerical waveforms with mass ratios ⩽4, when maximizing over binary parameters. This implies that the loss of event rate due to modelling error is below 3%. Moreover, the non-spinning EOB waveforms previously calibrated to five non-spinning waveforms with mass ratio smaller than 6 have overlaps above 99.7% with the numerical waveform with a mass ratio of 10, without even maximizing over the binary parameters.
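
    For reference, the "overlap" quoted above is, in the standard convention of this literature (not restated in the record), the noise-weighted inner product of two waveforms maximized over relative time and phase shifts:

$$
\langle h_1 | h_2 \rangle = 4 \operatorname{Re} \int_0^{\infty} \frac{\tilde{h}_1(f)\, \tilde{h}_2^{*}(f)}{S_n(f)}\, \mathrm{d}f,
\qquad
\mathcal{O}(h_1, h_2) = \max_{t_0,\, \phi_0} \frac{\langle h_1 | h_2 \rangle}{\sqrt{\langle h_1 | h_1 \rangle\, \langle h_2 | h_2 \rangle}},
$$

    where S_n(f) is the detector noise power spectral density. Since the expected event rate scales roughly as the cube of the overlap for small mismatches, an overlap above 99% corresponds to an event-rate loss below about 1 − 0.99³ ≈ 3%, the figure quoted above.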

  3. Monitoring the implementation of the WHO Global Code of Practice on the International Recruitment of Health Personnel.

    PubMed

    Siyam, Amani; Zurn, Pascal; Rø, Otto Christian; Gedik, Gulin; Ronquillo, Kenneth; Joan Co, Christine; Vaillancourt-Laflamme, Catherine; dela Rosa, Jennifer; Perfilieva, Galina; Dal Poz, Mario Roberto

    2013-11-01

    To present the findings of the first round of monitoring of the global implementation of the WHO Global Code of Practice on the International Recruitment of Health Personnel ("the Code"), a voluntary code adopted in 2010 by all 193 Member States of the World Health Organization (WHO). WHO requested that its Member States designate a national authority for facilitating information exchange on health personnel migration and the implementation of the Code. Each designated authority was then sent a cross-sectional survey with 15 questions on a range of topics pertaining to the 10 articles included in the Code. A national authority was designated by 85 countries. Only 56 countries reported on the status of Code implementation. Of these, 37 had taken steps towards implementing the Code, primarily by engaging relevant stakeholders. In 90% of countries, migrant health professionals reportedly enjoy the same legal rights and responsibilities as domestically trained health personnel. In the context of the Code, cooperation in the area of health workforce development goes beyond migration-related issues. An international comparative information base on health workforce mobility is needed but can only be developed through a collaborative, multi-partnered approach. Reporting on the implementation of the Code has been suboptimal in all but one WHO region. Greater collaboration among state and non-state actors is needed to raise awareness of the Code and reinforce its relevance as a potent framework for policy dialogue on ways to address the health workforce crisis.

  4. A comparison of theoretical and experimental pressure distributions for two advanced fighter wings

    NASA Technical Reports Server (NTRS)

    Haney, H. P.; Hicks, R. M.

    1981-01-01

    A comparison was made between experimental pressure distributions measured during testing of the Vought A-7 fighter and the theoretical predictions of four transonic potential flow codes. One isolated-wing code and three wing-body codes were used for the comparison. All comparisons are for transonic Mach numbers and include both attached and separated flows. In general, the wing-body codes gave better agreement with the experiment than did the isolated wing code but, because of the greater complexity of the geometry, were found to be considerably more expensive and less reliable.

  5. CNEA/ANL collaboration program to develop an optimized version of DART: validation and assessment by means of U₃Siₓ and U₃O₈-Al dispersed CNEA miniplate irradiation behavior.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solis, D.

    1998-10-16

    The DART code is based upon a thermomechanical model that can predict swelling, recrystallization, fuel-meat interdiffusion and other issues related to MTR dispersed FE behavior under irradiation. As part of a common effort to develop an optimized version of DART, a comparison between DART predictions and CNEA miniplate irradiation experimental data was made. The irradiation took place during 1981-82 for the U₃O₈ miniplates and 1985-86 for the U₃Siₓ miniplates at the Oak Ridge Research Reactor (ORR). The microphotographs were studied by means of the IMAWIN 3.0 Image Analysis Code and different fission gas bubble distributions were obtained. It was also possible to find and identify different morphologic zones. In both kinds of fuels, different phases were recognized, such as particle peripheral zones with evidence of Al-U reaction, internal recrystallized zones and bubbles. A very good agreement between code predictions and irradiation results was found. The few discrepancies are due to local, fabrication and irradiation uncertainties, such as the presence of the U₃Si phase in U₃Si₂ particles, and effective burnup.

  6. Preliminary Three-Dimensional Simulation of Sediment and Cesium Transport in the Ogi Dam Reservoir using FLESCOT – Task 6, Subtask 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Onishi, Yasuo; Kurikami, Hiroshi; Yokuda, Satoru T.

    2014-03-28

    After the accident at the Fukushima Daiichi Nuclear Power Plant in March 2011, the Japan Atomic Energy Agency and the Pacific Northwest National Laboratory initiated a collaborative project on environmental restoration. In October 2013, the collaborative team started a task of three-dimensional modeling of sediment and cesium transport in the Fukushima environment using the FLESCOT (Flow, Energy, Salinity, Sediment Contaminant Transport) code. As the first trial, we applied it to the Ogi Dam Reservoir that is one of the reservoirs in the Japan Atomic Energy Agency's (JAEA's) investigation project. Three simulation cases under the following different temperature conditions were studied: • incoming rivers and the Ogi Dam Reservoir have the same water temperature • incoming rivers have lower water temperature than that of the reservoir • incoming rivers have higher water temperature than that of the reservoir. The preliminary simulations suggest that seasonal temperature changes influence the sediment and cesium transport. The preliminary results showed the following: • Suspended sand, and cesium adsorbed by sand, coming into the reservoirs from upstream rivers is deposited near the reservoir entrance. • Suspended silt, and cesium adsorbed by silt, is deposited farther in the reservoir. • Suspended clay, and cesium adsorbed by clay, travels the farthest into the reservoir. With sufficient time, the dissolved cesium reaches the downstream end of the reservoir. This preliminary modeling also suggests the possibility of a suitable dam operation to control the cesium migration farther downstream from the dam. JAEA has been sampling in the Ogi Dam Reservoir, but these data were not yet available for the current model calibration and validation for this reservoir. Nonetheless these preliminary FLESCOT modeling results were qualitatively valid and confirmed the applicability of the FLESCOT code to the Ogi Dam Reservoir, and in general to other reservoirs in the Fukushima environment. The issues to be addressed in future are the following: • Validate the simulation results by comparison with the investigation data. • Confirm the applicability of the FLESCOT code to Fukushima coastal areas. • Increase computation speed by parallelizing the FLESCOT code.

  7. Improving accuracy of clinical coding in surgery: collaboration is key.

    PubMed

    Heywood, Nick A; Gill, Michael D; Charlwood, Natasha; Brindle, Rachel; Kirwan, Cliona C

    2016-08-01

    Clinical coding data provide the basis for Hospital Episode Statistics and Healthcare Resource Group codes. High accuracy of this information is required for payment by results, allocation of health and research resources, and public health data and planning. We sought to identify the level of accuracy of clinical coding in general surgical admissions across hospitals in the Northwest of England. Clinical coding departments identified a total of 208 emergency general surgical patients discharged between 1st March and 15th August 2013 from seven hospital trusts (median = 20, range = 16-60). Blinded re-coding was performed by a senior clinical coder and clinician, with results compared with the original coding outcome. Recorded codes were generated from OPCS-4 & ICD-10. Of all cases, 194 of 208 (93.3%) had at least one coding error and 9 of 208 (4.3%) had errors in both primary diagnosis and primary procedure. Errors were found in 64 of 208 (30.8%) of primary diagnoses and 30 of 137 (21.9%) of primary procedure codes. Median tariff using original codes was £1411.50 (range, £409-9138). Re-calculation using updated clinical codes showed a median tariff of £1387.50, P = 0.997 (range, £406-10,102). The most frequent reasons for incorrect coding were "coder error" and a requirement for "clinical interpretation of notes". Errors in clinical coding are multifactorial and have a significant impact on primary diagnosis, potentially affecting the accuracy of Hospital Episode Statistics data and in turn the allocation of health care resources and public health planning. As we move toward surgeon-specific outcomes, surgeons should increase collaboration with coding departments to ensure the system is robust. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Compression performance of HEVC and its format range and screen content coding extensions

    NASA Astrophysics Data System (ADS)

    Li, Bin; Xu, Jizheng; Sullivan, Gary J.

    2015-09-01

    This paper presents a comparison-based test of the objective compression performance of the High Efficiency Video Coding (HEVC) standard, its format range extensions (RExt), and its draft screen content coding extensions (SCC). The current dominant standard, H.264/MPEG-4 AVC, is used as an anchor reference in the comparison. The conditions used for the comparison tests were designed to reflect relevant application scenarios and to enable a fair comparison to the maximum extent feasible - i.e., using comparable quantization settings, reference frame buffering, intra refresh periods, rate-distortion optimization decision processing, etc. It is noted that such PSNR-based objective comparisons generally provide more conservative estimates of HEVC benefit than are found in subjective studies. The experimental results show that, when compared with H.264/MPEG-4 AVC, HEVC version 1 provides a bit rate savings for equal PSNR of about 23% for all-intra coding, 34% for random access coding, and 38% for low-delay coding. This is consistent with prior studies and the general characterization that HEVC can provide a bit rate savings of about 50% for equal subjective quality for most applications. The HEVC format range extensions provide a similar bit rate savings of about 13-25% for all-intra coding, 28-33% for random access coding, and 32-38% for low-delay coding at different bit rate ranges. For lossy coding of screen content, the HEVC screen content coding extensions achieve a bit rate savings of about 66%, 63%, and 61% for all-intra coding, random access coding, and low-delay coding, respectively. For lossless coding, the corresponding bit rate savings are about 40%, 33%, and 32%, respectively.
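
    Bit rate savings "for equal PSNR" of the kind reported above are conventionally computed as Bjøntegaard-delta (BD) rates. Below is a minimal sketch, assuming the standard cubic-fit BD-rate procedure rather than the paper's exact scripts; the rate/PSNR points in the usage example are made up for illustration:

```python
# Minimal sketch of the Bjontegaard-delta bit-rate (BD-rate) metric
# commonly used for "bit rate savings at equal PSNR" comparisons.
import numpy as np

def bd_rate(rate_anchor, psnr_anchor, rate_test, psnr_test):
    """Average bit-rate difference (%) of test vs anchor at equal PSNR.

    Fits a cubic to log10(bitrate) as a function of PSNR per codec and
    averages the gap over the overlapping PSNR range. A negative result
    means the test codec needs less bit rate for the same quality.
    """
    fit_a = np.polyfit(psnr_anchor, np.log10(rate_anchor), 3)
    fit_t = np.polyfit(psnr_test, np.log10(rate_test), 3)
    lo = max(min(psnr_anchor), min(psnr_test))
    hi = min(max(psnr_anchor), max(psnr_test))
    int_a, int_t = np.polyint(fit_a), np.polyint(fit_t)
    avg_a = (np.polyval(int_a, hi) - np.polyval(int_a, lo)) / (hi - lo)
    avg_t = (np.polyval(int_t, hi) - np.polyval(int_t, lo)) / (hi - lo)
    return (10.0 ** (avg_t - avg_a) - 1.0) * 100.0

# Illustrative (made-up) rate/PSNR points for an anchor and a test codec:
anchor_rate, anchor_psnr = [1000, 2000, 4000, 8000], [33.0, 36.0, 39.0, 42.0]
test_rate, test_psnr     = [ 620, 1250, 2500, 5100], [33.1, 36.2, 39.1, 42.0]
print(f"BD-rate: {bd_rate(anchor_rate, anchor_psnr, test_rate, test_psnr):.1f}%")
```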

  9. A Comparison of Collaborative and Traditional Instruction in Higher Education

    ERIC Educational Resources Information Center

    Gubera, Chip; Aruguete, Mara S.

    2013-01-01

    Although collaborative instructional techniques have become popular in college courses, it is unclear whether collaborative techniques can replace more traditional instructional methods. We examined the efficacy of collaborative courses (in-class, collaborative activities with no lectures) compared to traditional lecture courses (in-class,…

  10. Finnish upper secondary students' collaborative processes in learning statistics in a CSCL environment

    NASA Astrophysics Data System (ADS)

    Kaleva Oikarinen, Juho; Järvelä, Sanna; Kaasila, Raimo

    2014-04-01

    This design-based research project focuses on documenting statistical learning among 16-17-year-old Finnish upper secondary school students (N = 78) in a computer-supported collaborative learning (CSCL) environment. One novel value of this study is in reporting the shift from teacher-led mathematical teaching to autonomous small-group learning in statistics. The main aim of this study is to examine how student collaboration occurs in learning statistics in a CSCL environment. The data include material from videotaped classroom observations and the researcher's notes. In this paper, the inter-subjective phenomena of students' interactions in a CSCL environment are analysed by using a contact summary sheet (CSS). The development of the multi-dimensional coding procedure of the CSS instrument is presented. Aptly selected video episodes were transcribed and coded in terms of conversational acts, which were divided into non-task-related and task-related categories to depict students' levels of collaboration. The results show that collaborative learning (CL) can facilitate cohesion and responsibility and reduce students' feelings of detachment in our classless, periodic school system. The interactive .pdf material and collaboration in small groups enable statistical learning. It is concluded that CSCL is one possible method of promoting statistical teaching. CL using interactive materials seems to foster and facilitate statistical learning processes.

  11. Sharing the Code.

    ERIC Educational Resources Information Center

    Olsen, Florence

    2003-01-01

    Colleges and universities are beginning to consider collaborating on open-source-code projects as a way to meet critical software and computing needs. Points out the attractive features of noncommercial open-source software and describes some examples in use now, especially for the creation of Web infrastructure. (SLD)

  12. Delving into Teacher Collaboration: Untangling Problems and Solutions for Leadership

    ERIC Educational Resources Information Center

    Gates, Gordon; Robinson, Sharon

    2009-01-01

    This article offers description and interpretation for understanding the exercise of leadership in teacher collaboration. Data gathered in two urban high schools through observations and interviews were coded and categorized following Miles and Huberman's modified analytic induction technique. The analysis contributes to emerging theory on…

  13. Dual Coding, Reasoning and Fallacies.

    ERIC Educational Resources Information Center

    Hample, Dale

    1982-01-01

    Develops the theory that a fallacy is not a comparison of a rhetorical text to a set of definitions but a comparison of one person's cognition with another's. Reviews Paivio's dual coding theory, relates nonverbal coding to reasoning processes, and generates a limited fallacy theory based on dual coding theory. (PD)

  14. Grid Computing and Collaboration Technology in Support of Fusion Energy Sciences

    NASA Astrophysics Data System (ADS)

    Schissel, D. P.

    2004-11-01

    The SciDAC Initiative is creating a computational grid designed to advance scientific understanding in fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling, and allowing more efficient use of experimental facilities. The philosophy is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as easy-to-use, network-available services. Access to services is stressed rather than portability. Services share the same basic security infrastructure so that stakeholders can control their own resources, which helps ensure fair use of resources. The collaborative control room is being developed using the open-source Access Grid software that enables secure group-to-group collaboration with capabilities beyond teleconferencing, including application sharing and control. The ability to effectively integrate off-site scientists into a dynamic control room will be critical to the success of future international projects like ITER. Grid computing, the secure integration of computer systems over high-speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. The first grid computational service deployed was the transport code TRANSP and included tools for run preparation, submission, monitoring and management. This approach saves user sites from the laborious effort of maintaining a complex code while at the same time reducing the burden on developers by avoiding the support of a large number of heterogeneous installations. This tutorial will present the philosophy behind an advanced collaborative environment, give specific examples, and discuss its usage beyond FES.

  15. Supporting Interdisciplinary Collaboration Through Reusable Free Software. A Research Student Experience

    NASA Astrophysics Data System (ADS)

    Dimech, C.

    2013-12-01

    In this contribution, I present a critical evaluation of my experience as a research student conducting an interdisciplinary project that bridges the world of geoscience with that of astronomy. The major challenge consists of studying and modifying existing geophysical software to work both with synthetic solar data, not obtained by direct measurement but useful for testing and evaluation, and with data released from the HINODE satellite and the Solar Dynamics Observatory. I have been fortunate to collaborate closely with multiple geoscientists keen to share their software codes and help me understand their implementations so I can extend the methodology to solve problems in solar physics. Moreover, two additional experiences have helped me develop my research and collaborative skills: first, an opportunity to involve an undergraduate student, and second, my participation at the GNU Hackers Meeting in Paris. Three aspects have been identified that need particular attention to enhance the collective productivity of any group of individuals keen to extend existing codes to achieve further interdisciplinary goals. (1) The production of easily reusable code that users can study and modify even when large sets of computations are involved. (2) The transformation of solutions into tools that are 100% free software. (3) The harmonisation of collaborative interactions that effectively tackle the two aforementioned tasks. Each one will be discussed in detail during this session based on my experience as a research student.

  16. 75 FR 4689 - Electronic Tariff Filings

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-29

    ... collaborative process relies upon the use of metadata (or information) about the tariff filing, including such... code.\\5\\ Because the Commission is using the electronic metadata to establish statutory action dates... code, as well as accurately providing any other metadata. 6. Similarly, the Commission will be using...

  17. Comparative study on collaborative interaction in non-immersive and immersive systems

    NASA Astrophysics Data System (ADS)

    Shahab, Qonita M.; Kwon, Yong-Moo; Ko, Heedong; Mayangsari, Maria N.; Yamasaki, Shoko; Nishino, Hiroaki

    2007-09-01

    This research studies Virtual Reality simulation for collaborative interaction so that different people from different places can interact with one object concurrently. Our focus is the real-time handling of inputs from multiple users, where an object's behavior is determined by the combination of the multiple inputs. Issues addressed in this research are: 1) The effects of using haptics on a collaborative interaction, 2) The possibilities of collaboration between users from different environments. We conducted user tests on our system in several cases: 1) Comparison between non-haptics and haptics collaborative interaction over LAN, 2) Comparison between non-haptics and haptics collaborative interaction over Internet, and 3) Analysis of collaborative interaction between non-immersive and immersive display environments. The case studies are the interaction of users in two cases: collaborative authoring of a 3D model by two users, and collaborative haptic interaction by multiple users. In Virtual Dollhouse, users can observe the laws of physics while constructing a dollhouse using existing building blocks, under gravity effects. In Virtual Stretcher, multiple users can collaborate on moving a stretcher together while feeling each other's haptic motions.

  18. Monitoring the implementation of the WHO Global Code of Practice on the International Recruitment of Health Personnel

    PubMed Central

    Zurn, Pascal; Rø, Otto Christian; Gedik, Gulin; Ronquillo, Kenneth; Joan Co, Christine; Vaillancourt-Laflamme, Catherine; dela Rosa, Jennifer; Perfilieva, Galina; Dal Poz, Mario Roberto

    2013-01-01

    Objective: To present the findings of the first round of monitoring of the global implementation of the WHO Global Code of Practice on the International Recruitment of Health Personnel (“the Code”), a voluntary code adopted in 2010 by all 193 Member States of the World Health Organization (WHO). Methods: WHO requested that its Member States designate a national authority for facilitating information exchange on health personnel migration and the implementation of the Code. Each designated authority was then sent a cross-sectional survey with 15 questions on a range of topics pertaining to the 10 articles included in the Code. Findings: A national authority was designated by 85 countries. Only 56 countries reported on the status of Code implementation. Of these, 37 had taken steps towards implementing the Code, primarily by engaging relevant stakeholders. In 90% of countries, migrant health professionals reportedly enjoy the same legal rights and responsibilities as domestically trained health personnel. In the context of the Code, cooperation in the area of health workforce development goes beyond migration-related issues. An international comparative information base on health workforce mobility is needed but can only be developed through a collaborative, multi-partnered approach. Conclusion: Reporting on the implementation of the Code has been suboptimal in all but one WHO region. Greater collaboration among state and non-state actors is needed to raise awareness of the Code and reinforce its relevance as a potent framework for policy dialogue on ways to address the health workforce crisis. PMID:24347705

  19. Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.

    2013-12-01

    Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the community land model, CLM, part of the open-source community earth system model (CESM) for climate. In this presentation, the advantages and disadvantages of open source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. it's transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.

  20. Extending software repository hosting to code review and testing

    NASA Astrophysics Data System (ADS)

    Gonzalez Alvarez, A.; Aparicio Cotarelo, B.; Lossent, A.; Andersen, T.; Trzcinska, A.; Asbury, D.; Høimyr, N.; Meinhard, H.

    2015-12-01

    We will describe how CERN's services around Issue Tracking and Version Control have evolved, and what the plans for the future are. We will describe the services' main design, integration and structure, giving special attention to the new requirements from the community of users in terms of collaboration and integration tools, and how we address this challenge when defining new services based on GitLab for collaboration and code review, to replace our current Gitolite service, and on Jenkins for continuous integration. These new services complement the existing ones to create a new global "development tool stack" where each working group can place its particular development work-flow.

  1. OntoBrowser: a collaborative tool for curation of ontologies by subject matter experts.

    PubMed

    Ravagli, Carlo; Pognan, Francois; Marc, Philippe

    2017-01-01

    The lack of controlled terminology and ontology usage leads to incomplete search results and poor interoperability between databases. One of the major underlying challenges of data integration is curating data to adhere to controlled terminologies and/or ontologies. Finding subject matter experts with the time and skills required to perform data curation is often problematic. In addition, existing tools are not designed for continuous data integration and collaborative curation. This results in time-consuming curation workflows that often become unsustainable. The primary objective of OntoBrowser is to provide an easy-to-use online collaborative solution for subject matter experts to map reported terms to preferred ontology (or code list) terms and facilitate ontology evolution. Additional features include web service access to data, visualization of ontologies in hierarchical/graph format and a peer review/approval workflow with alerting. The source code is freely available under the Apache v2.0 license. Source code and installation instructions are available at http://opensource.nibr.com. This software is designed to run on a Java EE application server and store data in a relational database. philippe.marc@novartis.com. © The Author 2016. Published by Oxford University Press.

  2. OntoBrowser: a collaborative tool for curation of ontologies by subject matter experts

    PubMed Central

    Ravagli, Carlo; Pognan, Francois

    2017-01-01

    Summary: The lack of controlled terminology and ontology usage leads to incomplete search results and poor interoperability between databases. One of the major underlying challenges of data integration is curating data to adhere to controlled terminologies and/or ontologies. Finding subject matter experts with the time and skills required to perform data curation is often problematic. In addition, existing tools are not designed for continuous data integration and collaborative curation. This results in time-consuming curation workflows that often become unsustainable. The primary objective of OntoBrowser is to provide an easy-to-use online collaborative solution for subject matter experts to map reported terms to preferred ontology (or code list) terms and facilitate ontology evolution. Additional features include web service access to data, visualization of ontologies in hierarchical/graph format and a peer review/approval workflow with alerting. Availability and implementation: The source code is freely available under the Apache v2.0 license. Source code and installation instructions are available at http://opensource.nibr.com. This software is designed to run on a Java EE application server and store data in a relational database. Contact: philippe.marc@novartis.com PMID:27605099

  3. Two Tales of Time: Uncovering the Significance of Sequential Patterns among Contribution Types in Knowledge-Building Discourse

    ERIC Educational Resources Information Center

    Chen, Bodong; Resendes, Monica; Chai, Ching Sing; Hong, Huang-Yao

    2017-01-01

    As collaborative learning is actualized through evolving dialogues, temporality inevitably matters for the analysis of collaborative learning. This study attempts to uncover sequential patterns that distinguish "productive" threads of knowledge-building discourse. A database of Grade 1-6 knowledge-building discourse was first coded for…

  4. Graphing in Groups: Learning about Lines in a Collaborative Classroom Network Environment

    ERIC Educational Resources Information Center

    White, Tobin; Wallace, Matthew; Lai, Kevin

    2012-01-01

    This article presents a design experiment in which we explore new structures for classroom collaboration supported by a classroom network of handheld graphing calculators. We describe a design for small group investigations of linear functions and present findings from its implementation in three high school algebra classrooms. Our coding of the…

  5. Learning about the Genetic Code via Programming: Representing the Process of Translation.

    ERIC Educational Resources Information Center

    Ploger, Don

    1991-01-01

    This study examined the representations that a 16-year-old student made using the flexible computer system, "Boxer," in learning the genetic code. Results indicated that programming made it easier to build and explore flexible and useful representations and encouraged interdisciplinary collaboration between mathematics and biology…

  6. Researching "Race," Racism and Antiracism: The Development of an Ethical Code.

    ERIC Educational Resources Information Center

    Netto, Gina; Diniz, Fernando Almeida

    2001-01-01

    Black researchers collaborated to establish the Scottish Association of Black Researchers (SABRE) and developed "An Ethical Code for Researching Race, Racism and Anti-Racism in Scotland." Highlights the neglect of this issue by mainstream research, the importance of research on race in Scotland, and fundamental questions about the…

  7. A Coding Scheme to Analyse the Online Asynchronous Discussion Forums of University Students

    ERIC Educational Resources Information Center

    Biasutti, Michele

    2017-01-01

    The current study describes the development of a content analysis coding scheme to examine transcripts of online asynchronous discussion groups in higher education. The theoretical framework comprises the theories regarding knowledge construction in computer-supported collaborative learning (CSCL) based on a sociocultural perspective. The coding…

  8. Piloting a Collaborative Web-Based System for Testing ICD-11.

    PubMed

    Donada, Marc; Kostanjsek, Nenad; Della Mea, Vincenzo; Celik, Can; Jakob, Robert

    2017-01-01

    The 11th revision of the International Classification of Diseases (ICD-11), for the first time in ICD history, deployed web-based collaboration of experts and ICT tools. To ensure that ICD-11 is working well, it needs to be systematically field tested in different settings across the world. This will be done by means of a number of experiments. In order to support its implementation, a web-based system (ICDfit) has been designed and developed. The present paper illustrates the current prototype of the system and its technical testing. The system has been designed according to WHO requirements and implemented using PHP and MySQL. A preliminary technical test was then designed and run in January 2016, involving 8 users. They had to carry out double coding, that is, coding case summaries with both ICD-10 and ICD-11, and answer quick questions on the coding difficulty. The 8 users coded 632 cases each, spending an average of 163 seconds per case. While we found an issue in the mechanism used to record coding times, no further issues were found. The proposed system seems to be technically adequate for supporting future ICD-11 testing.
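
    For scale, those figures imply roughly 632 × 163 ≈ 103,000 seconds, i.e. about 28.6 hours of double coding per user over the course of the test.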

  9. Automated UMLS-Based Comparison of Medical Forms

    PubMed Central

    Dugas, Martin; Fritz, Fleur; Krumm, Rainer; Breil, Bernhard

    2013-01-01

    Medical forms are very heterogeneous: on a European scale there are thousands of data items in several hundred different systems. To enable data exchange for clinical care and research purposes there is a need to develop interoperable documentation systems with harmonized forms for data capture. A prerequisite in this harmonization process is comparison of forms. So far – to our knowledge – an automated method for comparison of medical forms is not available. A form contains a list of data items with corresponding medical concepts. An automatic comparison needs data types, item names and, especially, items with unique concept codes from medical terminologies. The scope of the proposed method is a comparison of these items by comparing their concept codes (coded in UMLS). Each data item is represented by item name, concept code and value domain. Two items are called identical if item name, concept code and value domain are the same. Two items are called matching if only concept code and value domain are the same. Two items are called similar if their concept codes are the same but the value domains are different. Based on these definitions, an open-source implementation for automated comparison of medical forms in ODM format with UMLS-based semantic annotations was developed. It is available as package compareODM from http://cran.r-project.org. To evaluate this method, it was applied to a set of 7 real medical forms with 285 data items from a large public ODM repository with forms for different medical purposes (research, quality management, routine care). Comparison results were visualized with grid images and dendrograms. Automated comparison of semantically annotated medical forms is feasible. Dendrograms provide a view of clusters of similar forms. The approach scales to large sets of real medical forms. PMID:23861827
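
    The identical/matching/similar rule above reduces to a small decision procedure over three fields per item. A minimal sketch follows (compareODM itself is an R package; this Python rendering, and the example concept code in it, are for illustration only):

      def compare_items(a, b):
          """a, b: dicts with 'name', 'concept_code' (UMLS CUI), 'value_domain'.
          Implements the identical/matching/similar definitions given above."""
          if a["concept_code"] != b["concept_code"]:
              return "different"
          if a["value_domain"] != b["value_domain"]:
              return "similar"    # same concept, different value domain
          if a["name"] != b["name"]:
              return "matching"   # same concept and value domain, name differs
          return "identical"

      x = {"name": "Body weight", "concept_code": "C0005910", "value_domain": "float"}
      y = {"name": "Weight",      "concept_code": "C0005910", "value_domain": "float"}
      print(compare_items(x, y))  # -> "matching"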

  10. Poisonings in the Nordic countries in 2007: a 5-year epidemiological follow-up.

    PubMed

    Andrew, Erik; Tellerup, Markus; Termälä, Anna-Mariia; Jacobsen, Peter; Gudjonsdottir, Gudborg A

    2012-03-01

    To map the mortality and morbidity of poisonings in Denmark, Finland, Iceland, Norway and Sweden in 2007 and undertake a comparison with a corresponding study in 2002. Morbidity was, as for 2002, defined as acute poisoning (ICD-10 codes, main and subsidiary diagnoses) treated in hospitals. The figures were extracted from the National Patient/Hospital Registers. Deaths recorded as acute poisoning (using corresponding ICD-10 codes) were collected from the National Cause of Death Registers. The annual mortality rate of acute poisonings per 100,000 inhabitants for 2007 was 22.4 in Finland, an important increase from 16.7 per 100,000 in 2002. The increase was mainly due to a change in the coding of alcohol, but also represented a slight increase in fatal alcohol intoxications per se. The poisoning death rate in the other Nordic countries varied between 8 and 13 per 100,000 and was at the same level as for 2002. The morbidity rates for 2007, between 158 and 285 per 100,000 inhabitants, represented a slight increase compared to the 2002 figures. The increase in the poisoning death rate for alcohol, and thus the total rate, in Finland in 2007 compared to 2002 has further increased the gap to the other Nordic countries. Poisoning morbidity rates in the Nordic countries are at the same level, but the variability shown indicates that more harmonization and collaboration is needed to increase data quality.

  11. Development of tools and techniques for momentum compression of fast rare isotopes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David J. Morrissey; Bradley M. Sherrill; Oleg Tarasov

    2010-11-21

    As part of our past research and development work, we have created and developed the LISE++ simulation code [Tar04, Tar08]. The LISE++ package was significantly extended with the addition of a Monte Carlo option that includes an option for calculating ion trajectories using a Taylor-series expansion up to fifth order, and with the implementation of the MOTER Monte Carlo code [Kow87] for ray tracing of the ions into the suite of LISE++ codes. The MOTER code was rewritten from FORTRAN into C++ and ported to the MS-Windows operating system. Extensive work went into the creation of a user-friendly interface for the code. An example of the graphical user interface created for the MOTER code is shown in the left panel of Figure 1, and the results of a typical calculation for the trajectories of particles that pass through the A1900 fragment separator are shown in the right panel. The MOTER code is presently included as part of the LISE++ package for downloading without restriction by the worldwide community. LISE++ was extensively developed and generalized to apply to any projectile fragment separator during the early phase of this grant. In addition to the inclusion of the MOTER code, other important additions to the LISE++ code made during FY08/FY09 are listed. LISE++ is distributed over the web (http://groups.nscl.msu.edu/lise) and is available without charge to anyone by anonymous download; thus, the number of individual users is not recorded. The number of 'hits' on the servers that provide the LISE++ code is shown in Figure 3 for the last eight calendar years (left panel) along with the country from the IP address (right panel). The data show an increase in web activity with the release of the new version of the program during the grant period and a worldwide impact. An important part of the proposed work carried out during FY07, FY08 and FY09 by a graduate student in the MSU Physics program was to benchmark the codes by comparison of detailed measurements to the LISE++ predictions. A large data set was obtained for fission fragments from the reaction of 238U ions at 81 MeV/u in a 92 mg/cm2 beryllium target with the A1900 projectile fragment separator. The data were analyzed and form the bulk of a Ph.D. dissertation that is nearing completion. The rich data set provides a number of benchmarks for the improved LISE++ code and only a few examples can be shown here. The primary information obtained from the measurements is the yield of the products as a function of mass, charge and momentum. Examples of the momentum distributions of individually identified fragments can be seen in Figures 2 and 4 along with comparisons to the predicted distributions. The agreement is remarkably good and indicates the general validity of the model of the nuclear reactions producing these fragments and of the higher-order transmission calculations in the LISE++ code. The momentum distributions were integrated to provide the cross sections for the individual isotopes. As shown in Figure 5, there is good agreement with the model predictions although the observed cross sections are a factor of five or so higher in this case. Other comparisons of measured production cross sections from abrasion-fission reactions have been published by our group working at the NSCL during this period [Fol09] and through our collaboration with Japanese researchers working at RIKEN with the BigRIPS separator [Ohn08, Ohn10]. The agreement of the model predictions with the data obtained with two different fragment separators is very good and indicates the usefulness of the new LISE++ code.

  12. FAST Model Calibration and Validation of the OC5- DeepCwind Floating Offshore Wind System Against Wave Tank Test Data: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian F; Robertson, Amy N; Jonkman, Jason

    During the course of the Offshore Code Comparison Collaboration, Continued, with Correlation (OC5) project, which focused on the validation of numerical methods through comparison against tank test data, the authors created a numerical FAST model of the 1:50-scale DeepCwind semisubmersible system that was tested at the Maritime Research Institute Netherlands ocean basin in 2013. This paper discusses several model calibration studies that were conducted to identify model adjustments that improve the agreement between the numerical simulations and the experimental test data. These calibration studies cover wind-field-specific parameters (coherence, turbulence), hydrodynamic and aerodynamic modeling approaches, as well as rotor model (blade-pitch and blade-mass imbalances) and tower model (structural tower damping coefficient) adjustments. These calibration studies were conducted based on relatively simple calibration load cases (wave only/wind only). The agreement between the final FAST model and experimental measurements is then assessed based on more-complex combined wind and wave validation cases.

  13. PlasmaPy: beginning a community developed Python package for plasma physics

    NASA Astrophysics Data System (ADS)

    Murphy, Nicholas A.; Huang, Yi-Min; PlasmaPy Collaboration

    2016-10-01

    In recent years, researchers in several disciplines have collaborated on community-developed open source Python packages such as Astropy, SunPy, and SpacePy. These packages provide core functionality, common frameworks for data analysis and visualization, and educational tools. We propose that our community begins the development of PlasmaPy: a new open source core Python package for plasma physics. PlasmaPy could include commonly used functions in plasma physics, easy-to-use plasma simulation codes, Grad-Shafranov solvers, eigenmode solvers, and tools to analyze both simulations and experiments. The development will include modern programming practices such as version control, embedding documentation in the code, unit tests, and avoiding premature optimization. We will describe early code development on PlasmaPy, and discuss plans moving forward. The success of PlasmaPy depends on active community involvement and a welcoming and inclusive environment, so anyone interested in joining this collaboration should contact the authors.
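
    As a flavor of the "commonly used functions" such a package could collect, the sketch below computes the electron plasma frequency from standard physical constants. The function name and signature are hypothetical, not an actual PlasmaPy API; the formula and constants themselves are standard:

      import math

      E_CHARGE = 1.602176634e-19      # elementary charge, C
      EPS0 = 8.8541878128e-12         # vacuum permittivity, F/m
      M_ELECTRON = 9.1093837015e-31   # electron mass, kg

      def electron_plasma_frequency(n_e):
          """Angular electron plasma frequency (rad/s) for density n_e in m^-3:
          omega_pe = sqrt(n_e * e^2 / (eps0 * m_e))."""
          return math.sqrt(n_e * E_CHARGE**2 / (EPS0 * M_ELECTRON))

      # ~1.8e11 rad/s (~28 GHz) for a typical laboratory density of 1e19 m^-3
      print(electron_plasma_frequency(1e19))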

  14. Collaborative Simulation Grid: Multiscale Quantum-Mechanical/Classical Atomistic Simulations on Distributed PC Clusters in the US and Japan

    NASA Technical Reports Server (NTRS)

    Kikuchi, Hideaki; Kalia, Rajiv; Nakano, Aiichiro; Vashishta, Priya; Iyetomi, Hiroshi; Ogata, Shuji; Kouno, Takahisa; Shimojo, Fuyuki; Tsuruta, Kanji; Saini, Subhash

    2002-01-01

    A multidisciplinary, collaborative simulation has been performed on a Grid of geographically distributed PC clusters. The multiscale simulation approach seamlessly combines i) atomistic simulation based on the molecular dynamics (MD) method and ii) quantum mechanical (QM) calculation based on the density functional theory (DFT), so that accurate but less scalable computations are performed only where they are needed. The multiscale MD/QM simulation code has been Grid-enabled using i) a modular, additive hybridization scheme, ii) multiple QM clustering, and iii) computation/communication overlapping. The Gridified MD/QM simulation code has been used to study environmental effects of water molecules on fracture in silicon. A preliminary run of the code has achieved a parallel efficiency of 94% on 25 PCs distributed over 3 PC clusters in the US and Japan, and a larger test involving 154 processors on 5 distributed PC clusters is in progress.
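
    Taking the usual definition of parallel efficiency as speedup divided by processor count, the reported 94% on 25 PCs corresponds to a speedup of roughly 0.94 × 25 ≈ 23.5 relative to a single machine, notable given that the clusters were split between the US and Japan.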

  15. Comparison of SAND-II and FERRET

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wootan, D.W.; Schmittroth, F.

    1981-01-01

    A comparison was made of the advantages and disadvantages of two codes, SAND-II and FERRET, for determining the neutron flux spectrum and uncertainty from experimental dosimeter measurements as anticipated in the FFTF Reactor Characterization Program. This comparison involved an examination of the methodology and the operational performance of each code. The merits of each code were identified with respect to theoretical basis, directness of method, solution uniqueness, subjective influences, and sensitivity to various input parameters.

  16. Links between Characteristics of Collaborative Peer Video Analysis Events and Literacy Teachers' Outcomes

    ERIC Educational Resources Information Center

    Arya, Poonam; Christ, Tanya; Chiu, Ming

    2015-01-01

    This study examined how characteristics of Collaborative Peer Video Analysis (CPVA) events are related to teachers' pedagogical outcomes. Data included 39 transcribed literacy video events, in which 14 in-service teachers engaged in discussions of their video clips. Emergent coding and Statistical Discourse Analysis were used to analyze the data.…

  17. The Effectiveness of Using Procedural Scaffoldings in a Paper-Plus-Smartphone Collaborative Learning Context

    ERIC Educational Resources Information Center

    Huang, Hui-Wen; Wu, Chih-Wei; Chen, Nian-Shing

    2012-01-01

    The purpose of this study was to evaluate the effectiveness of using procedural scaffoldings in fostering students' group discourse levels and learning outcomes in a paper-plus-smartphone collaborative learning context. All participants used built-in camera smartphones to learn new knowledge by scanning Quick Response (QR) codes, a type of…

  18. Stakeholder perceptions of collaboration for managing nature-based recreation in a coastal protected area in Alaska

    Treesearch

    Emily F. Pomeranz; Mark D. Needham; Linda E. Kruger

    2013-01-01

    Voluntary codes of conduct and best management practices are increasingly popular methods for addressing impacts of recreation and tourism in protected areas. In southeast Alaska, for example, a collaborative stakeholder process has been used for creating, implementing, and managing the voluntary Wilderness Best Management Practices (WBMP) for the Tracy Arm-Fords...

  19. Tracing Ideologies of Learning in Group Talk and Their Impediments to Collaboration

    ERIC Educational Resources Information Center

    Anderson, Kate T.; Weninger, Csilla

    2012-01-01

    In this paper we examine the complex relationship between dynamics of group talk and students' ideologies of learning. Through an interactional analysis and thematic coding of group talk, this study details barriers to collaboration in a digital storytelling workshop with primary-aged youth in Singapore. Drawing on 25 h of video-recorded data, we…

  20. Astrophysics Source Code Library -- Now even better!

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Schmidt, Judy; Berriman, Bruce; DuPrie, Kimberly; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2015-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. Indexed by ADS, it now contains nearly 1,000 codes and with recent major changes, is better than ever! The resource has a new infrastructure that offers greater flexibility and functionality for users, including an easier submission process, better browsing, one-click author search, and an RSS feeder for news. The new database structure is easier to maintain and offers new possibilities for collaboration. Come see what we've done!

  1. Using Inspections to Improve the Quality of Product Documentation and Code.

    ERIC Educational Resources Information Center

    Zuchero, John

    1995-01-01

    Describes how, by adapting software inspections to assess documentation and code, technical writers can collaborate with development personnel, editors, and customers to dramatically improve both the quality of documentation and the very process of inspecting that documentation. Notes that the five steps involved in the inspection process are:…

  2. Orientation to Language Code and Actions in Group Work

    ERIC Educational Resources Information Center

    Aline, David; Hosoda, Yuri

    2009-01-01

    This conversation analytic study reveals how learners themselves, as speakers and listeners, demonstrate their own orientation to language code and actions on a moment by moment basis during collaborative tasks in English as a foreign language classrooms. The excerpts presented in this article were drawn from 23 hours of audio- and video-recorded…

  3. Constructing a classification of hypersensitivity/allergic diseases for ICD-11 by crowdsourcing the allergist community.

    PubMed

    Tanno, L K; Calderon, M A; Goldberg, B J; Gayraud, J; Bircher, A J; Casale, T; Li, J; Sanchez-Borges, M; Rosenwasser, L J; Pawankar, R; Papadopoulos, N G; Demoly, P

    2015-06-01

    The global allergy community strongly believes that the 11th revision of the International Classification of Diseases (ICD-11) offers a unique opportunity to improve the classification and coding of hypersensitivity/allergic diseases via inclusion of a specific chapter dedicated to this disease area to facilitate epidemiological studies, as well as to evaluate the true size of the allergy epidemic. In this context, an international collaboration has decided to revise the classification of hypersensitivity/allergic diseases and to validate it for ICD-11 by crowdsourcing the allergist community. After careful comparison between the ICD-10 and ICD-11 beta phase linearization codes, we identified gaps and trade-offs allowing us to construct a classification proposal, which was sent to the European Academy of Allergy and Clinical Immunology (EAACI) sections, interest groups, executive committee as well as the World Allergy Organization (WAO), and American Academy of Allergy Asthma and Immunology (AAAAI) leaderships. The crowdsourcing process produced comments from 50 of 171 members contacted by e-mail. The classification proposal has also been discussed at face-to-face meetings with experts of EAACI sections and interest groups and presented in a number of business meetings during the 2014 EAACI annual congress in Copenhagen. As a result, a high-level complex structure of classification for hypersensitivity/allergic diseases has been constructed. The model proposed has been presented to the WHO groups in charge of the ICD revision. The international collaboration of allergy experts appreciates bilateral discussion and aims to get endorsement of their proposals for the final ICD-11. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  4. The effect of playing a science center-based mobile game: Affective outcomes and gender differences

    NASA Astrophysics Data System (ADS)

    Atwood-Blaine, Dana

    Situated in a hands-on science center, The Great STEM Caper was a collaborative mobile game built on the ARIS platform that was designed to engage 5th-9th grade players in NGSS science and engineering practices while they interacted with various exhibits. Same-gender partners sharing one iPad would search for QR codes placed at specific exhibits; scanning a code within the game would launch a challenge for that exhibit. The primary hypothesis was that in-game victories would be equivalent to "mastery experiences" as described by Bandura (1997) and would result in increased science self-efficacy. Gender differences in gameplay behaviors and perceptions were also studied. The study included two groups, one that played the game during their visit and one that explored the science center in the traditional way. The Motivation to Learn Science Questionnaire was administered to participants in both groups both before and after their visit to the science center. Participants wore head-mounted GoPro cameras to record their interactions within the physical and social environment. No differences in affective outcomes were found between the game and comparison groups or between boys and girls in the game group. The MLSQ was unable to measure any significant change in science self-efficacy, interest and enjoyment of science, or overall motivation to learn science in either group. However, girls outperformed boys on every measure of game achievement. Lazzaro's (2004) four types of fun were found to be a good fit for describing the gender differences in game perceptions and behaviors. Girls tended to enjoy hard fun and collaborative people fun while boys enjoyed easy fun and competitive people fun. While boys associated game achievement with enjoyment and victory, girls perceived their game achievement as difficult, rather than enjoyable or victorious.

  5. Towards actionable international comparisons of health system performance: expert revision of the OECD framework and quality indicators.

    PubMed

    Carinci, F; Van Gool, K; Mainz, J; Veillard, J; Pichora, E C; Januel, J M; Arispe, I; Kim, S M; Klazinga, N S

    2015-04-01

    To review and update the conceptual framework, indicator content and research priorities of the Organisation for Economic Cooperation and Development's (OECD) Health Care Quality Indicators (HCQI) project, after a decade of collaborative work. A structured assessment was carried out using a modified Delphi approach, followed by a consensus meeting, to assess the suite of HCQI for international comparisons, agree on revisions to the original framework and set priorities for research and development. International group of countries participating to OECD projects. Members of the OECD HCQI expert group. A reference matrix, based on a revised performance framework, was used to map and assess all seventy HCQI routinely calculated by the OECD expert group. A total of 21 indicators were agreed to be excluded, due to the following concerns: (i) relevance, (ii) international comparability, particularly where heterogeneous coding practices might induce bias, (iii) feasibility, when the number of countries able to report was limited and the added value did not justify sustained effort and (iv) actionability, for indicators that were unlikely to improve on the basis of targeted policy interventions. The revised OECD framework for HCQI represents a new milestone of a long-standing international collaboration among a group of countries committed to building common ground for performance measurement. The expert group believes that the continuation of this work is paramount to provide decision makers with a validated toolbox to directly act on quality improvement strategies. © The Author 2015. Published by Oxford University Press in association with the International Society for Quality in Health Care; all rights reserved.

  6. A Comparison of the Efficacy of Individual and Collaborative Music Learning in Ensemble Rehearsals

    ERIC Educational Resources Information Center

    Brandler, Brian J.; Peynircioglu, Zehra F.

    2015-01-01

    Collaboration is essential in learning ensemble music. It is unclear, however, whether an individual benefits more from collaborative or individual rehearsal in the initial stages of such learning. In nonmusical domains, the effect of collaboration has been mixed, sometimes enhancing and sometimes inhibiting an individual's learning process. In…

  7. Center for Extended Magnetohydrodynamics Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramos, Jesus

    This researcher participated in the DOE-funded Center for Extended Magnetohydrodynamics Modeling (CEMM), a multi-institutional collaboration led by the Princeton Plasma Physics Laboratory with Dr. Stephen Jardin as the overall Principal Investigator. This project developed advanced simulation tools to study the non-linear macroscopic dynamics of magnetically confined plasmas. The collaborative effort focused on the development of two large numerical simulation codes, M3D-C1 and NIMROD, and their application to a wide variety of problems. Dr. Ramos was responsible for theoretical aspects of the project, deriving consistent sets of model equations applicable to weakly collisional plasmas and devising test problems for verification of the numerical codes. This activity was funded for twelve years.

  8. Description of Transport Codes for Space Radiation Shielding

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Wilson, John W.; Cucinotta, Francis A.

    2011-01-01

    This slide presentation describes transport codes and their use for studying and designing space radiation shielding. When combined with risk projection models, radiation transport codes serve as the main tool for studying radiation and designing shielding. There are three criteria for assessing the accuracy of transport codes: (1) ground-based studies with defined beams and material layouts, (2) inter-comparison of transport code results for matched boundary conditions and (3) comparisons to flight measurements. By these three criteria, NASA's HZETRN/QMSFRG shows a very high degree of accuracy.

  9. Comparison of laser anemometer measurements and theory in an annular turbine cascade with experimental accuracy determined by parameter estimation

    NASA Technical Reports Server (NTRS)

    Goldman, L. J.; Seasholtz, R. G.

    1982-01-01

    Experimental measurements of the velocity components in the blade-to-blade (axial-tangential) plane were obtained within an axial-flow turbine stator passage and were compared with calculations from three turbomachinery computer programs. The theoretical results were calculated from a quasi-three-dimensional inviscid code, a three-dimensional inviscid code, and a three-dimensional viscous code. Parameter estimation techniques and a particle dynamics calculation were used to assess the accuracy of the laser measurements, which provides a rational basis for comparison of the experimental and theoretical results. The general agreement of the experimental data with the results from the two inviscid computer codes indicates the usefulness of these calculation procedures for turbomachinery blading. The comparison with the viscous code, while generally reasonable, was not as good as for the inviscid codes.

  10. In-Depth Analysis of Simulation Engine Codes for Comparison with DOE's Roof Savings Calculator and Measured Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    New, Joshua Ryan; Levinson, Ronnen; Huang, Yu

    The Roof Savings Calculator (RSC) was developed through collaborations among Oak Ridge National Laboratory (ORNL), White Box Technologies, Lawrence Berkeley National Laboratory (LBNL), and the Environmental Protection Agency in the context of a California Energy Commission Public Interest Energy Research project to make cool-color roofing materials a market reality. The RSC website and a simulation engine validated against demonstration homes were developed to replace the liberal DOE Cool Roof Calculator and the conservative EPA Energy Star Roofing Calculator, which reported different roof savings estimates. A preliminary analysis arrived at a tentative explanation for why RSC results differed from previous LBNL studies and provided guidance for future analysis in the comparison of four simulation programs (doe2attic, DOE-2.1E, EnergyPlus, and MicroPas), including heat exchange between the attic surfaces (principally the roof and ceiling) and the resulting heat flows through the ceiling to the building below. The results were consolidated in an ORNL technical report, ORNL/TM-2013/501. This report is an in-depth inter-comparison of four programs with detailed measured data from an experimental facility operated by ORNL in South Carolina in which different segments of the attic had different roof and attic systems.

  11. Comparison of Transport Codes, HZETRN, HETC and FLUKA, Using 1977 GCR Solar Minimum Spectra

    NASA Technical Reports Server (NTRS)

    Heinbockel, John H.; Slaba, Tony C.; Tripathi, Ram K.; Blattnig, Steve R.; Norbury, John W.; Badavi, Francis F.; Townsend, Lawrence W.; Handler, Thomas; Gabriel, Tony A.; Pinsky, Lawrence S.

    2009-01-01

    The HZETRN deterministic radiation transport code is one of several tools developed to analyze the effects of harmful galactic cosmic rays (GCR) and solar particle events (SPE) on mission planning, astronaut shielding and instrumentation. This paper is a comparison study involving the two Monte Carlo transport codes, HETC-HEDS and FLUKA, and the deterministic transport code, HZETRN. Each code is used to transport ions from the 1977 solar minimum GCR spectrum impinging upon a 20 g/cm2 Aluminum slab followed by a 30 g/cm2 water slab. This research is part of a systematic effort of verification and validation to quantify the accuracy of HZETRN and determine areas where it can be improved. Comparisons of dose and dose equivalent values at various depths in the water slab are presented in this report. This is followed by a comparison of the proton fluxes, and the forward, backward and total neutron fluxes at various depths in the water slab. Comparisons of the secondary light ion 2H, 3H, 3He and 4He fluxes are also examined.

  12. Complex collaborative problem-solving processes in mission control.

    PubMed

    Fiore, Stephen M; Wiltshire, Travis J; Oglesby, James M; O'Keefe, William S; Salas, Eduardo

    2014-04-01

    NASA's Mission Control Center (MCC) is responsible for control of the International Space Station (ISS), which includes responding to problems that obstruct the functioning of the ISS and that may pose a threat to the health and well-being of the flight crew. These problems are often complex, requiring individuals, teams, and multiteam systems, to work collaboratively. Research is warranted to examine individual and collaborative problem-solving processes in this context. Specifically, focus is placed on how Mission Control personnel-each with their own skills and responsibilities-exchange information to gain a shared understanding of the problem. The Macrocognition in Teams Model describes the processes that individuals and teams undertake in order to solve problems and may be applicable to Mission Control teams. Semistructured interviews centering on a recent complex problem were conducted with seven MCC professionals. In order to assess collaborative problem-solving processes in MCC with those predicted by the Macrocognition in Teams Model, a coding scheme was developed to analyze the interview transcriptions. Findings are supported with excerpts from participant transcriptions and suggest that team knowledge-building processes accounted for approximately 50% of all coded data and are essential for successful collaborative problem solving in mission control. Support for the internalized and externalized team knowledge was also found (19% and 20%, respectively). The Macrocognition in Teams Model was shown to be a useful depiction of collaborative problem solving in mission control and further research with this as a guiding framework is warranted.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Viktor K. Decyk

    The UCLA work on this grant was to design and help implement an object-oriented version of the GTC code, which is written in Fortran90. The GTC code is the main global gyrokinetic code used in this project, and over the years multiple, incompatible versions have evolved. The reason for this effort is to allow multiple authors to work together on GTC and to simplify future enhancements to GTC. The effort was designed to proceed incrementally. Initially, an upper layer of classes (derived types and methods) was implemented which called the original GTC code 'under the hood.' The derived types pointed to data in the original GTC code, and the methods called the original GTC subroutines. The original GTC code was modified only very slightly. This allowed one to define (and refine) a set of classes which described the important features of the GTC code in a new, more abstract way, with a minimum of implementation. Furthermore, classes could be added one at a time, and at the end of each day the code continued to work correctly. This work was done in close collaboration with Y. Nishimura from UC Irvine and Stefan Ethier from PPPL. Ten classes were ultimately defined and implemented: gyrokinetic and drift kinetic particles, scalar and vector fields, a mesh, jacobian, FLR, equilibrium, interpolation, and particle species descriptors. In the second stage of this development, some of the scaffolding was removed. The constructors in the class objects now allocated the data, and the array data in the original GTC code was removed. This isolated the components and allowed multiple instantiations of the objects to be created, in particular multiple ion species. Again, the work was done incrementally, one class at a time, so that the code was always working properly. This work was done in close collaboration with Y. Nishimura and W. Zhang from UC Irvine and Stefan Ethier from PPPL. The third stage of this work was to integrate the capabilities of the various versions of the GTC code into one flexible and extensible version. To do this, we developed a methodology to implement Design Patterns in Fortran90. Design Patterns are abstract solutions to generic programming problems which allow one to handle increased complexity. This work was done in collaboration with Henry Gardner, a computer scientist (and former plasma physicist) from the Australian National University. As an example, the Strategy Pattern is being used in GTC to support multiple solvers. This new code is currently being used in the study of energetic particles. A document describing the evolution of the GTC code to this new object-oriented version is available to users of GTC.
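
    The Strategy Pattern mentioned above lets the solver be chosen at run time without touching the code that uses it. GTC's implementation is in Fortran90; the sketch below is only an analogy in Python, with hypothetical names rather than GTC's actual types:

      class FieldSolver:
          """Strategy interface: anything exposing solve(rhs) can be plugged in."""
          def solve(self, rhs):
              raise NotImplementedError

      class IterativeSolver(FieldSolver):
          def solve(self, rhs):
              return f"iterative solution of {rhs}"

      class DirectSolver(FieldSolver):
          def solve(self, rhs):
              return f"direct solution of {rhs}"

      class Simulation:
          def __init__(self, solver):
              self.solver = solver  # strategy chosen at run time

          def step(self):
              # the simulation never needs to know which solver it holds
              return self.solver.solve("Poisson rhs")

      print(Simulation(IterativeSolver()).step())
      print(Simulation(DirectSolver()).step())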

  14. Astrophysics Source Code Library Enhancements

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.

    2015-09-01

    The Astrophysics Source Code Library (ASCL)1 is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.

  15. Comparison of Collaboration and Performance in Groups of Learners Assembled Randomly or Based on Learners' Topic Preferences

    ERIC Educational Resources Information Center

    Cela, Karina L.; Sicilia, Miguel Ángel; Sánchez, Salvador

    2015-01-01

    Teachers and instructional designers frequently incorporate collaborative learning approaches into their e-learning environments. A key factor of collaborative learning that may affect learner outcomes is whether the collaborative groups are assigned project topics randomly or based on a shared interest in the topic. This is a particularly…

  16. National Centers for Environmental Prediction

    Science.gov Websites

    EMC > NOAH > Implementation Schedule

  17. National Centers for Environmental Prediction

    Science.gov Websites

    EMC > GEFS > Implementation Schedule

  18. "Let's Set Up Some Subgoals": Understanding Human-Pedagogical Agent Collaborations and Their Implications for Learning and Prompt and Feedback Compliance

    ERIC Educational Resources Information Center

    Harley, Jason M.; Taub, Michelle; Azevedo, Roger; Bouchet, Francois

    2018-01-01

    Research on collaborative learning between humans and virtual pedagogical agents represents a necessary extension to recent research on the conceptual, theoretical, methodological, analytical, and educational issues behind co- and socially-shared regulated learning between humans. This study presents a novel coding framework that was developed and…

  19. Teacher Collaborative Inquiry as a Professional Development Intervention: Benefits and Challenges

    ERIC Educational Resources Information Center

    Deni, Ann Rosnida Md.; Malakolunthu, Suseela

    2013-01-01

    The paper reports on a collaborative learning project coded as the teacher inquiry community that was carried out over a year in a private higher education institution to improve the professional capability of language-based subject teachers. Nine teachers completed the project all of whom were females and shared work experience of 2-29 years. Six…

  20. Designing Collaborative Developmental Standards by Refactoring of the Earth Science Models, Libraries, Workflows and Frameworks.

    NASA Astrophysics Data System (ADS)

    Mirvis, E.; Iredell, M.

    2015-12-01

    The operational (OPS) NOAA National Centers for Environmental Prediction (NCEP) suite traditionally consists of a large set of multi-scale HPC models, workflows, scripts, tools and utilities that depend heavily on a variety of additional components. Namely, the suite utilizes a unique collection of 20+ in-house developed shared libraries (NCEPLIBS), specific versions of third-party libraries (such as netcdf, HDF, ESMF, jasper, xml, etc.), and an HPC workflow tool within a dedicated (sometimes even vendor-customized) homogeneous HPC system environment. This domain- and site-specific environment, combined with NCEP's product-driven, large-scale real-time data operations, complicates NCEP collaborative development tremendously by reducing the chances of replicating the OPS environment anywhere else. The mission of NOAA/NCEP's Environmental Modeling Center (EMC) is to develop and improve numerical weather, climate, hydrological and ocean prediction through partnership with the research community. Recognizing these difficulties, EMC has lately taken an innovative approach to improving the flexibility of the HPC environment by building the elements of, and a foundation for, an NCEP OPS functionally equivalent environment (FEE), which can also be used to ease external interface constructs. Aiming to reduce the turnaround time of community code enhancements via the Research-to-Operations (R2O) cycle, EMC developed and deployed several project subset standards that have already paved the road to NCEP OPS implementation standards. In this topic we will discuss the EMC FEE for O2R requirements and approaches in collaborative standardization, including NCEPLIBS FEE and model code version control paired with the models' derived customized HPC modules and FEE footprints. We will share NCEP/EMC experience and potential in the refactoring of EMC development processes and legacy codes, and in securing model source code quality standards by using a combination of the Eclipse IDE integrated with reverse engineering tools/APIs. We will also report on collaborative efforts in the restructuring of the NOAA Environmental Modeling System (NEMS), the multi-model coupling framework, and on transitioning FEE verification methodology.

  1. JASMIN: Japanese-American study of muon interactions and neutron detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakashima, Hiroshi (JAEA, Ibaraki); Mokhov, N.V.

    Experimental studies of shielding and radiation effects at Fermi National Accelerator Laboratory (FNAL) have been carried out under collaboration between FNAL and Japan, aiming at benchmarking of simulation codes and study of irradiation effects for upgrade and design of new high-energy accelerator facilities. The purposes of this collaboration are (1) acquisition of shielding data in a proton beam energy domain above 100GeV; (2) further evaluation of predictive accuracy of the PHITS and MARS codes; (3) modification of physics models and data in these codes if needed; (4) establishment of irradiation field for radiation effect tests; and (5) development of a code module for improved description of radiation effects. A series of experiments has been performed at the Pbar target station and NuMI facility, using irradiation of targets with 120 GeV protons for antiproton and neutrino production, as well as the M-test beam line (M-test) for measuring nuclear data and detector responses. Various nuclear and shielding data have been measured by activation methods with chemical separation techniques as well as by other detectors such as a Bonner ball counter. Analyses with the experimental data are in progress for benchmarking the PHITS and MARS15 codes. In this presentation recent activities and results are reviewed.

  2. Chromaticity calculations and code comparisons for x-ray lithography source XLS and SXLS rings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsa, Z.

    1988-06-16

    This note presents the chromaticity calculations and code comparison results for the (x-ray lithography source) XLS (Chasman-Green, XUV Cosy lattice) and (2-magnet 4T) SXLS lattices, with the standard beam optics codes, including the programs SYNCH88.5, MAD6, PATRICIA88.4, PATPET88.2, DIMAD, BETA, and MARYLIE. This analysis is a part of our ongoing accelerator physics code studies. 4 figs., 10 tabs.

  3. Comparison of two computer codes for crack growth analysis: NASCRAC Versus NASA/FLAGRO

    NASA Technical Reports Server (NTRS)

    Stallworth, R.; Meyers, C. A.; Stinson, H. C.

    1989-01-01

    Results are presented from the comparison study of two computer codes for crack growth analysis, NASCRAC and NASA/FLAGRO. The two computer codes gave compatible, conservative results when the part-through-crack analysis solutions were analyzed against experimental test data. Results showed good correlation between the codes for the through-crack-at-a-lug solution, for which NASA/FLAGRO gave the most conservative results.

  4. Accumulating pyramid spatial-spectral collaborative coding divergence for hyperspectral anomaly detection

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Zou, Huanxin; Zhou, Shilin

    2016-03-01

    Detection of anomalous targets of various sizes in hyperspectral data has received a lot of attention in reconnaissance and surveillance applications. Many anomaly detectors have been proposed in the literature. However, current methods are susceptible to anomalies in the processing window range and often make critical assumptions about the distribution of the background data. Motivated by the fact that anomaly pixels are often distinctive from their local background, in this letter we propose a novel hyperspectral anomaly detection framework for real-time remote sensing applications. The proposed framework consists of four major components: sparse feature learning, pyramid grid window selection, joint spatial-spectral collaborative coding and multi-level divergence fusion. It exploits the collaborative representation difference in the feature space to locate potential anomalies and is totally unsupervised, without any prior assumptions. Experimental results on airborne recorded hyperspectral data demonstrate that the proposed method is adaptive to anomalies over a large range of sizes and is well suited for parallel processing.
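
    The core intuition, that a pixel poorly reconstructed from its local background is a candidate anomaly, can be sketched with a standard ridge-regularized collaborative-representation score. This is a simplified illustration of the general technique, not the authors' exact formulation; the window handling and parameters below are placeholders:

      import numpy as np

      def cr_anomaly_score(y, X, lam=1e-2):
          """y: (bands,) test pixel; X: (bands, n) local background pixels.
          Reconstruct y from the columns of X and return the residual norm."""
          n = X.shape[1]
          beta = np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ y)
          return np.linalg.norm(y - X @ beta)

      rng = np.random.default_rng(0)
      background = rng.normal(size=(50, 30))            # 50 bands, 30 neighbors
      normal_pixel = background @ rng.normal(size=30) / 30
      anomaly_pixel = rng.normal(loc=3.0, size=50)      # spectrally distinct
      print(cr_anomaly_score(normal_pixel, background))   # small residual
      print(cr_anomaly_score(anomaly_pixel, background))  # much larger residual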

  5. Computerized Dental Comparison: A Critical Review of Dental Coding and Ranking Algorithms Used in Victim Identification.

    PubMed

    Adams, Bradley J; Aschheim, Kenneth W

    2016-01-01

    Comparison of antemortem and postmortem dental records is a leading method of victim identification, especially for incidents involving a large number of decedents. This process may be expedited with computer software that provides a ranked list of best possible matches. This study provides a comparison of the most commonly used conventional coding and sorting algorithms used in the United States (WinID3) with a simplified coding format that utilizes an optimized sorting algorithm. The simplified system consists of seven basic codes and utilizes an optimized algorithm based largely on the percentage of matches. To perform this research, a large reference database of approximately 50,000 antemortem and postmortem records was created. For most disaster scenarios, the proposed simplified codes, paired with the optimized algorithm, performed better than WinID3 which uses more complex codes. The detailed coding system does show better performance with extremely large numbers of records and/or significant body fragmentation. © 2015 American Academy of Forensic Sciences.
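
    The ranking idea, scoring each antemortem record by the percentage of tooth codes that agree with the postmortem record, can be sketched in a few lines. The seven-letter code set and the records below are hypothetical placeholders, not the published scheme:

      CODES = {"V", "M", "C", "R", "B", "I", "U"}  # e.g. virgin, missing, crown...

      def match_percentage(postmortem, antemortem):
          """Both records map tooth number -> code. Return the percentage of
          agreeing codes over teeth charted in both records."""
          shared = set(postmortem) & set(antemortem)
          if not shared:
              return 0.0
          hits = sum(postmortem[t] == antemortem[t] for t in shared)
          return 100.0 * hits / len(shared)

      pm = {8: "V", 9: "C", 14: "M", 19: "R"}
      am_candidates = {
          "record_A": {8: "V", 9: "C", 14: "M", 19: "V"},
          "record_B": {8: "M", 9: "V", 14: "V", 19: "C"},
      }
      ranked = sorted(am_candidates,
                      key=lambda r: match_percentage(pm, am_candidates[r]),
                      reverse=True)
      print(ranked)  # record_A ranks first (75% match vs 0%)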

  6. Code-to-Code Comparison, and Material Response Modeling of Stardust and MSL using PATO and FIAT

    NASA Technical Reports Server (NTRS)

    Omidy, Ali D.; Panerai, Francesco; Martin, Alexandre; Lachaud, Jean R.; Cozmuta, Ioana; Mansour, Nagi N.

    2015-01-01

    This report provides a code-to-code comparison between PATO, a recently developed high-fidelity material response code, and FIAT, NASA's legacy code for ablation response modeling. The goal is to demonstrate that FIAT and PATO generate the same results when using the same models. Test cases of increasing complexity are used, from both arc-jet testing and flight experiments. When using the exact same physical models, material properties and boundary conditions, the two codes give results that agree within 2%. The minor discrepancy is attributed to the inclusion of the gas-phase heat capacity (cp) in the energy equation in PATO, and not in FIAT.
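
    A minimal sketch of the kind of check behind such a code-to-code comparison: compare two predicted histories point by point and report the maximum relative difference against a 2% band. The arrays below are made-up placeholders, not actual PATO or FIAT output:

      import numpy as np

      def max_relative_difference(a, b):
          """Maximum |a-b|/|b| over all points, guarding against divide-by-zero."""
          a, b = np.asarray(a, float), np.asarray(b, float)
          return np.max(np.abs(a - b) / np.maximum(np.abs(b), 1e-30))

      fiat_temp = np.array([300.0, 540.0, 905.0, 1210.0])  # placeholder K
      pato_temp = np.array([300.0, 548.0, 912.0, 1198.0])  # placeholder K
      diff = max_relative_difference(pato_temp, fiat_temp)
      print(f"max relative difference: {diff:.2%}")
      assert diff <= 0.02  # within the ~2% agreement reported above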

  7. [Medical Applications of the PHITS Code I: Recent Improvements and Biological Dose Estimation Model].

    PubMed

    Sato, Tatsuhiko; Furuta, Takuya; Hashimoto, Shintaro; Kuga, Naoya

    2015-01-01

    PHITS is a general-purpose Monte Carlo particle transport simulation code developed through the collaboration of several institutes, mainly in Japan. It can analyze the motion of nearly all radiations over wide energy ranges in three-dimensional matter. It has been used for various applications including medical physics. This paper reviews the recent improvements of the code, together with the biological dose estimation method developed on the basis of the microdosimetric function implemented in PHITS.

  8. Fully-Implicit Navier-Stokes (FIN-S)

    NASA Technical Reports Server (NTRS)

    Kirk, Benjamin S.

    2010-01-01

    FIN-S is a SUPG finite element code for flow problems under active development at NASA Lyndon B. Johnson Space Center and within PECOS: a) the code is built on top of the libMesh parallel, adaptive finite element library; b) the initial implementation of the code targeted supersonic/hypersonic laminar calorically perfect gas flows and conjugate heat transfer; c) the code was first extended to thermochemical nonequilibrium about 9 months ago; d) the technologies in FIN-S have been enhanced through a strongly collaborative research effort with Sandia National Labs.

  9. Improving Students' Summary Writing Ability through Collaboration: A Comparison between Online Wiki Group and Conventional Face-To-Face Group

    ERIC Educational Resources Information Center

    Wichadee, Saovapa

    2013-01-01

    Wikis, as one of the Web 2.0 social networking tools, have been increasingly integrated into second language (L2) instruction to promote collaborative writing. The current study examined and compared summary writing abilities between students learning by wiki-based collaboration and students learning by traditional face-to-face collaboration.…

  10. Combinatorial neural codes from a mathematical coding theory perspective.

    PubMed

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
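
    The error-correction notion used above can be illustrated with a toy binary code and nearest-codeword (minimum Hamming distance) decoding. The codewords below are arbitrary examples with minimum distance 3 (so one flipped bit is correctable), not receptive field codes from the paper:

      def hamming(u, v):
          """Number of positions at which two equal-length strings differ."""
          return sum(a != b for a, b in zip(u, v))

      CODE = ["00000", "11100", "00111", "11011"]  # toy code, min distance 3

      def decode(word):
          """Return the codeword closest in Hamming distance to the noisy word."""
          return min(CODE, key=lambda c: hamming(c, word))

      print(decode("11000"))  # one bit flipped in "11100" -> "11100" recovered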

  11. Planned Comparisons as Better Alternatives to ANOVA Omnibus Tests.

    ERIC Educational Resources Information Center

    Benton, Roberta L.

    Analyses of data are presented to illustrate the advantages of using a priori or planned comparisons rather than omnibus analysis of variance (ANOVA) tests followed by post hoc or a posteriori testing. The two types of planned comparisons considered are planned orthogonal non-trend coding contrasts and orthogonal polynomial or trend contrast coding.…
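
    For concreteness, the standard orthogonal polynomial (trend) contrasts for four ordered groups look as follows; the sketch verifies their pairwise orthogonality and applies them to made-up group means (the numbers are illustrative only, not data from the report):

      import numpy as np

      # Linear, quadratic and cubic trend contrasts for k = 4 groups
      linear    = np.array([-3.0, -1.0,  1.0, 3.0])
      quadratic = np.array([ 1.0, -1.0, -1.0, 1.0])
      cubic     = np.array([-1.0,  3.0, -3.0, 1.0])

      # Orthogonality: every pairwise dot product is zero
      for a, b in [(linear, quadratic), (linear, cubic), (quadratic, cubic)]:
          assert np.dot(a, b) == 0

      means = np.array([10.0, 12.0, 15.0, 19.0])  # hypothetical group means
      print("linear trend estimate:", np.dot(linear, means))       # 30.0
      print("quadratic trend estimate:", np.dot(quadratic, means)) # 2.0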

  12. IAEA coordinated research project on thermal-hydraulics of Supercritical Water-Cooled Reactors (SCWRs)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamada, K.; Aksan, S. N.

    The Supercritical Water-Cooled Reactor (SCWR) is an innovative water-cooled reactor concept which uses supercritical-pressure water as the reactor coolant. It has been attracting the interest of many researchers in various countries, mainly due to its benefits of high thermal efficiency and simple primary systems, resulting in low capital cost. In 2008 the IAEA started a Coordinated Research Project (CRP) on Thermal-Hydraulics of SCWRs as a forum to foster the exchange of technical information and international collaboration in research and development. This paper summarizes the activities and current status of the CRP, as well as major progress achieved to date. At present, 15 institutions closely collaborate in several tasks. Some organizations have been conducting thermal-hydraulics experiments and analysing the data, and others have been participating in code-to-test and/or code-to-code benchmark exercises. The expected outputs of the CRP are also discussed. Finally, the paper introduces several IAEA activities relating to or arising from the CRP. (authors)

  13. MOCCA code for star cluster simulation: comparison with optical observations using COCOA

    NASA Astrophysics Data System (ADS)

    Askar, Abbas; Giersz, Mirek; Pych, Wojciech; Olech, Arkadiusz; Hypki, Arkadiusz

    2016-02-01

    We introduce and present preliminary results from COCOA (Cluster simulatiOn Comparison with ObservAtions) code for a star cluster after 12 Gyr of evolution simulated using the MOCCA code. The COCOA code is being developed to quickly compare results of numerical simulations of star clusters with observational data. We use COCOA to obtain parameters of the projected cluster model. For comparison, a FITS file of the projected cluster was provided to observers so that they could use their observational methods and techniques to obtain cluster parameters. The results show that the similarity of cluster parameters obtained through numerical simulations and observations depends significantly on the quality of observational data and photometric accuracy.

  14. Preclinical Mouse Models of Neurofibromatosis

    DTIC Science & Technology

    2004-10-01

    collaborated closely and have shared expertise and reagents extensively. This NF Consortium is a member of the Mouse Models of Human Cancer Consortium...of the National Cancer Institute and is participating fully in the activities of the group. The current award will support these collaborative...studies through 2005. Subject terms: neurofibromatosis, cancer, mouse models.

  15. One Size Does Not Fit All: A System Development Perspective

    DTIC Science & Technology

    2013-09-01

    study seeks an understanding of the nature and characteristics of failed IT projects. These failures...are in the context of a plethora of resources made available to the Coast Guard to ensure the success of its IT projects. This study is important...features are as follows: 1. Collaboration: Agile methods are highly collaborative inside and outside the development group. 2. Code review:

  16. MCNP and GADRAS Comparisons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klasky, Marc Louis; Myers, Steven Charles; James, Michael R.

    To facilitate the timely execution of System Threat Reviews (STRs) for DNDO, and also to develop a methodology for performing STRs, LANL performed comparisons of several radiation transport codes (MCNP, GADRAS, and Gamma-Designer) that have been previously utilized to compute radiation signatures. While each of these codes has strengths, it is of paramount interest to determine the limitations of each of the respective codes and also to identify the most time-efficient means by which to produce computational results, given the large number of parametric cases that are anticipated in performing STRs. These comparisons serve to identify regions of applicability for each code and provide estimates of uncertainty that may be anticipated. Furthermore, while performing these comparisons, the sensitivity of the results to modeling assumptions was also examined. These investigations serve to enable the creation of the LANL methodology for performing STRs. Given the wide variety of radiation test sources, scenarios, and detectors, LANL calculated comparisons of the following parameters: decay data, multiplicity, device (n,γ) leakages, and radiation transport through representative scenes and shielding. This investigation was performed to understand potential limitations of utilizing specific codes for different aspects of the STR challenges.

  17. International Collaborations on Engineered Barrier Systems: Brief Overview of SKB-EBS Activities.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jove-Colon, Carlos F.

    2015-10-01

    Research collaborations with international partners on the behavior and performance of engineered barrier systems (EBS) are an important aspect of the DOE-NE Used Fuel Disposition Campaign strategy in the evaluation of disposal design concepts. These international partnerships are a cost-effective way of engaging in key R&D activities with common goals, resulting in effective scientific knowledge exchanges and thus enhancing existing and future research programs in the USA. This report provides a brief description of the activities covered by the Swedish Nuclear Fuel and Waste Management Company (SKB) EBS Task Force (TF) (referred to hereafter as SKB EBS TF) and potential future directions for engagement of the DOE-NE UFDC program in relevant R&D activities. Emphasis is given to SKB EBS TF activities that are still ongoing and aligned with the UFDC R&D program. These include utilization of data collected in the bentonite rock interaction experiment (BRIE) and data sets from benchmark experiments produced by the chemistry or “C” part of the SKB EBS TF. Potential applications of information generated by this program include comparisons/tests between model and data (e.g., reactive diffusion), development and implementation of coupled-process models (e.g., HM), and code/model benchmarking.

  18. Calculation of inviscid flow over shuttle-like vehicles at high angles of attack and comparisons with experimental data

    NASA Technical Reports Server (NTRS)

    Weilmuenster, K. J.; Hamilton, H. H., II

    1983-01-01

    A computer code HALIS, designed to compute the three-dimensional flow about shuttle-like configurations at angles of attack greater than 25 deg, is described. Results from HALIS are compared where possible with an existing flow-field code; such comparisons show excellent agreement. HALIS results are also compared with experimental pressure distributions on shuttle models over a wide range of angles of attack, again with excellent agreement. It is demonstrated that the HALIS code can incorporate equilibrium air chemistry in flow-field computations.

  19. How social stigma sustains the HIV treatment gap for MSM in Mpumalanga, South Africa.

    PubMed

    Maleke, Kabelo; Daniels, Joseph; Lane, Tim; Struthers, Helen; McIntyre, James; Coates, Thomas

    2017-11-01

    There are gaps in HIV care for men who have sex with men (MSM) in African settings, and HIV social stigma plays a significant role in sustaining these gaps. We conducted a three-year research project with 49 HIV-positive MSM in two districts in Mpumalanga Province, South Africa, to understand the factors that inform HIV care-seeking behaviors. Semi-structured focus group discussions and interviews were conducted in IsiZulu, SiSwati, and some code-switching into English; these were audio-recorded, transcribed, and translated into English. We used a constant comparison approach to analyze these data. HIV social stigma centered on gossip, which sustained self-diagnosis and delayed clinical care, with participants deciding to use traditional healers to mitigate the impact of gossip on their lives. More collaboration models are needed between traditional healers and health professionals to support the global goals for HIV testing and treatment.

  20. Comparison of professional values of Taiwanese and United States nursing students.

    PubMed

    Alfred, Danita; Yarbrough, Susan; Martin, Pam; Mink, Janice; Lin, Yu-Hua; Wang, Liching S

    2013-12-01

    Globalization is a part of modern life. Sharing a common set of professional nursing values is critical in this global environment. The purpose of this research was to examine the professional values of nursing students from two distinct cultural perspectives. Nurse educators in Taiwan partnered with nurse educators in the United States to compare professional values of their respective graduating nursing students. The American Nurses Association Code of Ethics served as the philosophical framework for this examination. The convenience sample comprised 94 Taiwanese students and 168 US students. Both groups reported high scores on an overall measure of values. They did differ substantially on the relative importance of individual items related to advocacy, competence, education, self-evaluation, professional advancement, and professional associations. Global implications for the collaborative practice of nurses from different cultures working together can be improved by first recognizing and then attending to these differences in value priorities.

  1. Video watermarking for mobile phone applications

    NASA Astrophysics Data System (ADS)

    Mitrea, M.; Duta, S.; Petrescu, M.; Preteux, F.

    2005-08-01

    Nowadays, alongside the traditional voice signal, music, video, and 3D characters tend to become common data to be run, stored and/or processed on mobile phones. Hence, protecting the related intellectual property rights also becomes a crucial issue. The video sequences involved in such applications are generally coded at very low bit rates. The present paper starts by presenting an accurate statistical investigation of such video as well as of a very dangerous attack (the StirMark attack). The obtained results are turned into practice when adapting a spread spectrum watermarking method to such applications. The informed watermarking approach was also considered: an outstanding method belonging to this paradigm has been adapted and re-evaluated under the low-rate video constraint. The experiments were conducted in collaboration with the SFR mobile services provider in France. They also allow a comparison between the spread spectrum and informed embedding techniques.

  2. Validation of the NCC Code for Staged Transverse Injection and Computations for a RBCC Combustor

    NASA Technical Reports Server (NTRS)

    Ajmani, Kumud; Liu, Nan-Suey

    2005-01-01

    The NCC code was validated for a case involving staged transverse injection into Mach 2 flow behind a rearward-facing step, with comparisons made against experimental data and solutions from the FPVortex code. The NCC code was then used to perform computations to study fuel-air mixing for the combustor of a candidate rocket-based combined cycle engine geometry. Comparisons with a one-dimensional analysis and a three-dimensional code (VULCAN) were performed to assess the qualitative and quantitative performance of the NCC solver.

  3. The MIMIC Code Repository: enabling reproducibility in critical care research.

    PubMed

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records.
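
    As a flavor of the kind of reproducible extraction the repository standardizes, here is a hedged sketch of a shared SQL query applied to a local MIMIC-III instance; the table and column names follow the public MIMIC-III schema, while the connection details are placeholders.

      # Hedged sketch: one shared SQL concept extraction against MIMIC-III.
      import psycopg2  # assumes a local PostgreSQL MIMIC-III build

      conn = psycopg2.connect(dbname="mimic", user="postgres", host="localhost")
      with conn, conn.cursor() as cur:
          cur.execute("""
              SELECT first_careunit, COUNT(*)
              FROM mimiciii.icustays
              GROUP BY first_careunit
          """)
          for unit, count in cur.fetchall():
              print(unit, count)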

  4. Comparisons of time explicit hybrid kinetic-fluid code Architect for Plasma Wakefield Acceleration with a full PIC code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Massimo, F., E-mail: francesco.massimo@ensta-paristech.fr; Dipartimento SBAI, Università di Roma “La Sapienza“, Via A. Scarpa 14, 00161 Roma; Atzeni, S.

    Architect, a time-explicit hybrid code designed to perform quick simulations for electron-driven plasma wakefield acceleration, is described. In order to obtain beam quality acceptable for applications, control of the beam-plasma dynamics is necessary. Particle in Cell (PIC) codes represent the state-of-the-art technique to investigate the underlying physics and possible experimental scenarios; however, PIC codes demand heavy computational resources. The Architect code substantially reduces the need for computational resources by using a hybrid approach: relativistic electron bunches are treated kinetically as in a PIC code and the background plasma as a fluid. Cylindrical symmetry is assumed for the solution of the electromagnetic fields and fluid equations. In this paper both the underlying algorithms as well as a comparison with a fully three-dimensional particle in cell code are reported. The comparison highlights the good agreement between the two models up to the weakly non-linear regimes. In highly non-linear regimes the two models only disagree in a localized region, where the plasma electrons expelled by the bunch close up at the end of the first plasma oscillation.

  5. OC5 Project Phase Ib: Validation of hydrodynamic loading on a fixed, flexible cylinder for offshore wind applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.

    This paper summarizes the findings from Phase Ib of the Offshore Code Comparison, Collaboration, Continued with Correlation (OC5) project. OC5 is a project run under the International Energy Agency (IEA) Wind Research Task 30, and is focused on validating the tools used for modelling offshore wind systems through the comparison of simulated responses of select offshore wind systems (and components) to physical test data. For Phase Ib of the project, simulated hydrodynamic loads on a flexible cylinder fixed to a sloped bed were validated against test measurements made in the shallow water basin at the Danish Hydraulic Institute (DHI) with support from the Technical University of Denmark (DTU). The first phase of OC5 examined two simple cylinder structures (Phase Ia and Ib) to focus on validation of hydrodynamic models used in the various tools before moving on to more complex offshore wind systems and the associated coupled physics. As a result, verification and validation activities such as these lead to improvement of offshore wind modelling tools, which will enable the development of more innovative and cost-effective offshore wind designs.

  6. Ocean power technology design optimization

    DOE PAGES

    van Rij, Jennifer; Yu, Yi -Hsiang; Edwards, Kathleen; ...

    2017-07-18

    For this study, the National Renewable Energy Laboratory and Ocean Power Technologies (OPT) conducted a collaborative code validation and design optimization study for OPT's PowerBuoy wave energy converter (WEC). NREL utilized WEC-Sim, an open-source WEC simulator, to compare four design variations of OPT's PowerBuoy. As an input to the WEC-Sim models, viscous drag coefficients for the PowerBuoy floats were first evaluated using computational fluid dynamics. The resulting WEC-Sim PowerBuoy models were then validated with experimental power output and fatigue load data provided by OPT. The validated WEC-Sim models were then used to simulate the power performance and loads for operational conditions, extreme conditions, and directional waves, for each of the four PowerBuoy design variations, assuming the wave environment of Humboldt Bay, California. Finally, ratios of power-to-weight, power-to-fatigue-load, power-to-maximum-extreme-load, power-to-water-plane-area, and power-to-wetted-surface-area were used to make a final comparison of the potential PowerBuoy WEC designs. The design comparison methodologies developed and presented in this study are applicable to other WEC devices and may be useful as a framework for future WEC design development projects.
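
    A minimal sketch of the final ranking step described above, with placeholder numbers rather than OPT data:

      # Placeholder numbers: rank WEC design variants by power-normalized metrics.
      designs = {
          "variant_A": {"power_kW": 30.0, "mass_t": 55.0, "fatigue_kN": 900.0},
          "variant_B": {"power_kW": 34.0, "mass_t": 68.0, "fatigue_kN": 1150.0},
      }
      for name, d in designs.items():
          print(name,
                f"power/weight = {d['power_kW'] / d['mass_t']:.2f} kW/t,",
                f"power/fatigue-load = {d['power_kW'] / d['fatigue_kN']:.3f} kW/kN")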

  7. OC5 Project Phase Ib: Validation of hydrodynamic loading on a fixed, flexible cylinder for offshore wind applications

    DOE PAGES

    Robertson, Amy N.; Wendt, Fabian; Jonkman, Jason M.; ...

    2016-10-13

    This paper summarizes the findings from Phase Ib of the Offshore Code Comparison, Collaboration, Continued with Correlation (OC5) project. OC5 is a project run under the International Energy Agency (IEA) Wind Research Task 30, and is focused on validating the tools used for modelling offshore wind systems through the comparison of simulated responses of select offshore wind systems (and components) to physical test data. For Phase Ib of the project, simulated hydrodynamic loads on a flexible cylinder fixed to a sloped bed were validated against test measurements made in the shallow water basin at the Danish Hydraulic Institute (DHI) with support from the Technical University of Denmark (DTU). The first phase of OC5 examined two simple cylinder structures (Phase Ia and Ib) to focus on validation of hydrodynamic models used in the various tools before moving on to more complex offshore wind systems and the associated coupled physics. As a result, verification and validation activities such as these lead to improvement of offshore wind modelling tools, which will enable the development of more innovative and cost-effective offshore wind designs.

  8. Ocean power technology design optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    van Rij, Jennifer; Yu, Yi -Hsiang; Edwards, Kathleen

    For this study, the National Renewable Energy Laboratory and Ocean Power Technologies (OPT) conducted a collaborative code validation and design optimization study for OPT's PowerBuoy wave energy converter (WEC). NREL utilized WEC-Sim, an open-source WEC simulator, to compare four design variations of OPT's PowerBuoy. As an input to the WEC-Sim models, viscous drag coefficients for the PowerBuoy floats were first evaluated using computational fluid dynamics. The resulting WEC-Sim PowerBuoy models were then validated with experimental power output and fatigue load data provided by OPT. The validated WEC-Sim models were then used to simulate the power performance and loads for operational conditions, extreme conditions, and directional waves, for each of the four PowerBuoy design variations, assuming the wave environment of Humboldt Bay, California. Finally, ratios of power-to-weight, power-to-fatigue-load, power-to-maximum-extreme-load, power-to-water-plane-area, and power-to-wetted-surface-area were used to make a final comparison of the potential PowerBuoy WEC designs. The design comparison methodologies developed and presented in this study are applicable to other WEC devices and may be useful as a framework for future WEC design development projects.

  9. Wind turbine design codes: A comparison of the structural response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buhl, M.L. Jr.; Wright, A.D.; Pierce, K.G.

    2000-03-01

    The National Wind Technology Center (NWTC) of the National Renewable Energy Laboratory is continuing a comparison of several computer codes used in the design and analysis of wind turbines. The second part of this comparison determined how well the programs predict the structural response of wind turbines. In this paper, the authors compare the structural response for four programs: ADAMS, BLADED, FAST_AD, and YawDyn. ADAMS is a commercial, multibody-dynamics code from Mechanical Dynamics, Inc. BLADED is a commercial, performance and structural-response code from Garrad Hassan and Partners Limited. FAST_AD is a structural-response code developed by Oregon State University and the University of Utah for the NWTC. YawDyn is a structural-response code developed by the University of Utah for the NWTC. ADAMS, FAST_AD, and YawDyn use the University of Utah's AeroDyn subroutine package for calculating aerodynamic forces. Although errors were found in all the codes during this study, once they were fixed, the codes agreed surprisingly well for most of the cases and configurations that were evaluated. One unresolved discrepancy between BLADED and the AeroDyn-based codes was when there was blade and/or teeter motion in addition to a large yaw error.

  10. Evaluation of social interaction, task management, and trust among dental hygiene students in a collaborative learning environment.

    PubMed

    Saylor, Catherine D; Keselyak, Nancy T; Simmer-Beck, Melanie; Tira, Daniel

    2011-02-01

    The purpose of this study was to evaluate the impact of collaborative learning on the development of social interaction, task management, and trust in dental hygiene students. These three traits were assessed with the Teamwork Assessment Scale in two different learning environments (traditional lecture/lab and a collaborative learning environment). A convenience sample of fifty-six entry-level dental hygiene students taking an introductory/preclinic course at two metropolitan-area dental hygiene programs provided comparable experimental and control groups. Factor scores were computed for the three traits, and comparisons were conducted using the Ryan-Einot-Gabriel-Welsch multiple comparison procedure among specific cell comparisons generated from a two-factor repeated measures ANOVA. The results indicate that the collaborative learning environment influenced dental hygiene students positively regarding the traits of social interaction, task management, and trust. However, comparing dental hygiene students to undergraduate students overall indicates that dental hygiene students already possess somewhat higher levels of these traits. Future studies on active learning strategies should examine factors such as student achievement and explore other possible active learning methodologies.

  11. Stakeholder perceptions of indicators of tourism use and codes of conduct in a coastal protected area in Alaska

    Treesearch

    Emily F. Pomeranz; Mark D. Needham; Linda E. Kruger

    2013-01-01

    This article focuses on a collaborative approach for addressing impacts of watercraft-based tourism in Tracy Arm-Fords Terror Wilderness, Alaska. This approach is the Wilderness Best Management Practices (WBMP) and involves codes of conduct for managing use in this area. This article examines use-related indicators that stakeholders prioritize for inclusion in the WBMP...

  12. Argumentation-Based Collaborative Inquiry in Science through Representational Work: Impact on Primary Students' Representational Fluency

    ERIC Educational Resources Information Center

    Nichols, Kim; Gillies, Robyn; Hedberg, John

    2016-01-01

    This study explored the impact of argumentation-promoting collaborative inquiry and representational work in science on primary students' representational fluency. Two hundred sixty-six year 6 students received instruction on natural disasters with a focus on collaborative inquiry. Students in the Comparison condition received only this…

  13. Comparison of 1:1 and 1:m CSCL Environment for Collaborative Concept Mapping

    ERIC Educational Resources Information Center

    Lin, C.-P.; Wong, L.-H.; Shao, Y.-J.

    2012-01-01

    This paper reports an investigation into the effects of collaborative concept mapping in a digital learning environment, in terms of students' overall learning gains, knowledge retention, quality of student artefacts (the collaboratively created concept maps), interactive patterns, and learning perceptions. Sixty-four 12-year-old students from two…

  14. An Urban Public School and University Collaboration: What Makes a PDS?

    ERIC Educational Resources Information Center

    Sosin, Adrienne; Parham, Ann

    This paper describes the status and development of a school/university partnership from the point of view of the participants. Descriptions of the paths collaboration has taken, anecdotal recall, and reflections about working toward a collaborative relationship support comparisons of this relationship with the Professional Development School (PDS)…

  15. Open NASA Earth Exchange (OpenNEX): Strategies for enabling cross organization collaboration in the earth sciences

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Ganguly, S.; Nemani, R. R.; Votava, P.; Wang, W.; Lee, T. J.; Dungan, J. L.

    2014-12-01

    Sharing community-valued codes, intermediary datasets and results from individual efforts with others that are not in a direct funded collaboration can be a challenge. Cross-organization collaboration is often impeded by infrastructure security constraints, rigid financial controls, bureaucracy, workforce nationalities, etc., which can force groups to work in a segmented fashion and/or through awkward and suboptimal web services. We show how a focused community may come together and share modeling and analysis codes, computing configurations, scientific results, knowledge and expertise on a public cloud platform; diverse groups of researchers working together at "arm's length". Through the OpenNEX experimental workshop, users can view short technical "how-to" videos and explore encapsulated working environments. Workshop participants can easily instantiate Amazon Machine Images (AMI) or launch full cluster and data processing configurations within minutes. Enabling users to instantiate computing environments from configuration templates on large public cloud infrastructures, such as Amazon Web Services, may provide a mechanism for groups to easily use each other's work and collaborate indirectly. Moreover, using the public cloud for this workshop allowed a single group to host a large read-only data archive, making datasets of interest to the community widely available on the public cloud, enabling other groups to directly connect to the data and reducing the costs of the collaborative work by freeing individual groups from redundantly retrieving, integrating or financing the storage of the datasets of interest.
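
    A hedged sketch of the launch mechanism described above, using the standard boto3 EC2 API; the region, AMI id, and instance type are placeholders, not actual OpenNEX resources.

      # Hedged sketch: launch a preconfigured workshop AMI with boto3.
      import boto3

      ec2 = boto3.client("ec2", region_name="us-west-2")   # placeholder region
      resp = ec2.run_instances(
          ImageId="ami-0123456789abcdef0",   # hypothetical OpenNEX workshop AMI
          InstanceType="m3.xlarge",          # placeholder instance type
          MinCount=1,
          MaxCount=1,
      )
      print(resp["Instances"][0]["InstanceId"])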

  16. Cooperation on Improved Isotopic Identification and Analysis Software for Portable, Electrically Cooled High-Resolution Gamma Spectrometry Systems Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dreyer, Jonathan G.; Wang, Tzu-Fang; Vo, Duc T.

    Under a 2006 agreement between the Department of Energy (DOE) of the United States of America and the Institut de Radioprotection et de Sûreté Nucléaire (IRSN) of France, the National Nuclear Security Administration (NNSA) within DOE and IRSN initiated a collaboration to improve isotopic identification and analysis of nuclear material [i.e., plutonium (Pu) and uranium (U)]. The specific aim of the collaborative project was to develop new versions of two types of isotopic identification and analysis software: (1) the fixed-energy response-function analysis for multiple energies (FRAM) codes and (2) multi-group analysis (MGA) codes. The project is entitled Action Sheet 4 – Cooperation on Improved Isotopic Identification and Analysis Software for Portable, Electrically Cooled, High-Resolution Gamma Spectrometry Systems (Action Sheet 4). FRAM and MGA/U235HI are software codes used to analyze isotopic ratios of U and Pu. FRAM is an application that uses parameter sets for the analysis of U or Pu. MGA and U235HI are two separate applications that analyze Pu or U, respectively. They have traditionally been used by safeguards practitioners to analyze gamma spectra acquired with high-resolution gamma spectrometry (HRGS) systems that are cooled by liquid nitrogen. However, it was discovered that these analysis programs were not as accurate when used on spectra acquired with a newer generation of more portable, electrically cooled HRGS (ECHRGS) systems. In response to this need, DOE/NNSA and IRSN collaborated to update the FRAM and U235HI codes to improve their performance with newer ECHRGS systems. Lawrence Livermore National Laboratory (LLNL) and Los Alamos National Laboratory (LANL) performed this work for DOE/NNSA.

  17. Brain-CODE: A Secure Neuroinformatics Platform for Management, Federation, Sharing and Analysis of Multi-Dimensional Neuroscience Data.

    PubMed

    Vaccarino, Anthony L; Dharsee, Moyez; Strother, Stephen; Aldridge, Don; Arnott, Stephen R; Behan, Brendan; Dafnas, Costas; Dong, Fan; Edgecombe, Kenneth; El-Badrawi, Rachad; El-Emam, Khaled; Gee, Tom; Evans, Susan G; Javadi, Mojib; Jeanson, Francis; Lefaivre, Shannon; Lutz, Kristen; MacPhee, F Chris; Mikkelsen, Jordan; Mikkelsen, Tom; Mirotchnick, Nicholas; Schmah, Tanya; Studzinski, Christa M; Stuss, Donald T; Theriault, Elizabeth; Evans, Kenneth R

    2018-01-01

    Historically, research databases have existed in isolation with no practical avenue for sharing or pooling medical data into high dimensional datasets that can be efficiently compared across databases. To address this challenge, the Ontario Brain Institute's "Brain-CODE" is a large-scale neuroinformatics platform designed to support the collection, storage, federation, sharing and analysis of different data types across several brain disorders, as a means to understand common underlying causes of brain dysfunction and develop novel approaches to treatment. By providing researchers access to aggregated datasets that they otherwise could not obtain independently, Brain-CODE incentivizes data sharing and collaboration and facilitates analyses both within and across disorders and across a wide array of data types, including clinical, neuroimaging and molecular. The Brain-CODE system architecture provides the technical capabilities to support (1) consolidated data management to securely capture, monitor and curate data, (2) privacy and security best-practices, and (3) interoperable and extensible systems that support harmonization, integration, and query across diverse data modalities and linkages to external data sources. Brain-CODE currently supports collaborative research networks focused on various brain conditions, including neurodevelopmental disorders, cerebral palsy, neurodegenerative diseases, epilepsy and mood disorders. These programs are generating large volumes of data that are integrated within Brain-CODE to support scientific inquiry and analytics across multiple brain disorders and modalities. By providing access to very large datasets on patients with different brain disorders and enabling linkages to provincial, national and international databases, Brain-CODE will help to generate new hypotheses about the biological bases of brain disorders, and ultimately promote new discoveries to improve patient care.

  18. Comparison of Measured and Block Structured Simulations for the F-16XL Aircraft

    NASA Technical Reports Server (NTRS)

    Boelens, O. J.; Badcock, K. J.; Elmilgui, A.; Abdol-Hamid, K. S.; Massey, S. J.

    2008-01-01

    This article presents a comparison of the predictions of three RANS codes for flight conditions of the F-16XL aircraft which feature vortical flow. The three codes, ENSOLV, PMB and PAB3D, solve on structured multi-block grids. Flight data for comparison were available in the form of surface pressures, skin friction, boundary layer data and photographs of tufts. The three codes provided predictions which were consistent with expectations based on the turbulence modelling used: two k-type two-equation models, one of them with vortex corrections, and an Algebraic Stress Model. The agreement with flight data was good, with the exception of the outer wing primary vortex strength. The confidence in the application of the CFD codes to complex fighter configurations increased significantly through this study.

  19. SeisCode: A seismological software repository for discovery and collaboration

    NASA Astrophysics Data System (ADS)

    Trabant, C.; Reyes, C. G.; Clark, A.; Karstens, R.

    2012-12-01

    SeisCode is a community repository for software used in seismological and related fields. The repository is intended to increase discoverability of such software and to provide a long-term home for software projects. Other places exist where seismological software may be found, but none meet the requirements necessary for an always-current, easy-to-search, well-documented, and citable resource for projects. Organizations such as IRIS, ORFEUS, and the USGS have websites with lists of available or contributed seismological software. Since these lists are often not maintained by the authors themselves, the documentation often consists of a sentence or paragraph, and the available software may be outdated. Repositories such as GoogleCode and SourceForge, which are directly maintained by the authors, provide version control and issue tracking but do not provide a unified way of locating geophysical software scattered in and among countless unrelated projects. Additionally, projects are hosted at language-specific sites such as Mathworks and PyPI, in FTP directories, and in websites strewn across the Web. Search engines are only partially effective discovery tools, as the desired software is often hidden deep within the results. SeisCode provides software authors a place to present their software, codes, scripts, tutorials, and examples to the seismological community. Authors can choose their own level of involvement. At one end of the spectrum, the author might simply create a web page that points to an existing site. At the other extreme, an author may choose to leverage the many tools provided by SeisCode, such as a source code management tool with integrated issue tracking, forums, news feeds, downloads, wikis, and more. For software development projects with multiple authors, SeisCode can also be used as a central site for collaboration. SeisCode provides the community with an easy way to discover software, while providing authors a way to build a community around their software packages. IRIS invites the seismological community to browse and to submit projects to https://seiscode.iris.washington.edu/

  20. Jet and electromagnetic tomography (JET) of extreme phases of matter in heavy-ion collisions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heinz, Ulrich

    2015-08-31

    The Ohio State University (OSU) group contributed three major products to the deliverables of the JET Collaboration: 1. The code package iEBE-VISHNU for modeling the dynamical evolution of the soft medium created in relativistic heavy-ion collisions, from its creation all the way to final freeze-out, using a hybrid approach that interfaces a free-streaming partonic pre-equilibrium stage with a (2+1)-dimensional viscous relativistic fluid dynamical stage for the quark-gluon plasma (QGP) phase and the microscopic hadron cascade UrQMD for the hadronic rescattering and freeze-out stage. Except for UrQMD, all dynamical evolution components and interfaces were developed at OSU and tested and implemented in collaboration with the Duke University group. 2. An electromagnetic radiation module for the calculation of thermal photon emission from the QGP and hadron resonance gas stages of a heavy-ion collision, with emission rates that have been corrected for viscous effects in the expanding medium consistent with the bulk evolution. The electromagnetic radiation module was developed under OSU leadership in collaboration with the McGill group and has been integrated in the iEBE-VISHNU code package. 3. An interface between the Monte Carlo jet shower evolution and hadronization codes developed by the Wayne State University (WSU), McGill and Texas A&M groups and the iEBE-VISHNU bulk evolution code, for performing jet quenching and jet shape modification studies in a realistically modeled evolving medium that was tuned to measured soft hadron data. Building on work performed at OSU for the theoretical framework used to describe the interaction of jets with the medium, initial work on the jet shower Monte Carlo was started at OSU and moved to WSU when OSU Visiting Assistant Professor Abhijit Majumder accepted a tenure-track faculty position at WSU in September 2011. The jet-hydro interface was developed at OSU and WSU and tested and implemented in collaboration with the McGill, Texas A&M, and LBNL groups.

  1. The trellis complexity of convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Lin, W.

    1995-01-01

    It has long been known that convolutional codes have a natural, regular trellis structure that facilitates the implementation of Viterbi's algorithm. It has gradually become apparent that linear block codes also have a natural, though not in general a regular, 'minimal' trellis structure, which allows them to be decoded with a Viterbi-like algorithm. In both cases, the complexity of the Viterbi decoding algorithm can be accurately estimated by the number of trellis edges per encoded bit. It would, therefore, appear that we are in a good position to make a fair comparison of the Viterbi decoding complexity of block and convolutional codes. Unfortunately, however, this comparison is somewhat muddled by the fact that some convolutional codes, the punctured convolutional codes, are known to have trellis representations that are significantly less complex than the conventional trellis. In other words, the conventional trellis representation for a convolutional code may not be the minimal trellis representation. Thus, ironically, at present we seem to know more about the minimal trellis representation for block than for convolutional codes. In this article, we provide a remedy, by developing a theory of minimal trellises for convolutional codes. (A similar theory has recently been given by Sidorenko and Zyablov). This allows us to make a direct performance-complexity comparison for block and convolutional codes. A by-product of our work is an algorithm for choosing, from among all generator matrices for a given convolutional code, what we call a trellis-minimal generator matrix, from which the minimal trellis for the code can be directly constructed. Another by-product is that, in the new theory, punctured convolutional codes no longer appear as a special class, but simply as high-rate convolutional codes whose trellis complexity is unexpectedly small.
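
    To make the complexity measure concrete: for the conventional trellis of an (n, k) convolutional code with nu memory elements, each trellis section has 2^nu states with 2^k branches per state and emits n code bits, so the edges-per-encoded-bit count has a simple closed form; the minimal trellises studied in this record can only do better. A small sketch:

      # Edges per encoded bit for the conventional trellis of an (n, k) code
      # with nu memory elements; minimal trellises can only reduce this count.
      def conventional_edges_per_encoded_bit(n: int, k: int, nu: int) -> float:
          edges_per_section = (2 ** nu) * (2 ** k)   # states x branches per state
          return edges_per_section / n               # each section emits n code bits

      # Example: the standard rate-1/2, 64-state (nu = 6) convolutional code.
      print(conventional_edges_per_encoded_bit(n=2, k=1, nu=6))   # -> 64.0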

  2. The Comparison of Solitary and Collaborative Modes of Game-Based Learning on Students' Science Learning and Motivation

    ERIC Educational Resources Information Center

    Chen, Ching-Huei; Wang, Kuan-Chieh; Lin, Yu-Hsuan

    2015-01-01

    In this study, we investigated and compared solitary and collaborative modes of game-based learning in promoting students' science learning and motivation. A total of fifty seventh grade students participated in this study. The results showed that students who played in a solitary or collaborative mode demonstrated improvement in learning…

  3. Vocabulary Learning in Collaborative Tasks: A Comparison of Pair and Small Group Work

    ERIC Educational Resources Information Center

    Dobao, Ana Fernández

    2014-01-01

    This study examined the opportunities that pair and small group interaction offer for collaborative dialogue and second language (L2) vocabulary learning. It compared the performance of the same collaborative writing task by learners working in groups of four (n = 60) and in pairs (n = 50), focusing on the occurrence of lexical language-related…

  4. A numerical similarity approach for using retired Current Procedural Terminology (CPT) codes for electronic phenotyping in the Scalable Collaborative Infrastructure for a Learning Health System (SCILHS).

    PubMed

    Klann, Jeffrey G; Phillips, Lori C; Turchin, Alexander; Weiler, Sarah; Mandl, Kenneth D; Murphy, Shawn N

    2015-12-11

    Interoperable phenotyping algorithms, needed to identify patient cohorts meeting eligibility criteria for observational studies or clinical trials, require medical data in a consistent structured, coded format. Data heterogeneity limits such algorithms' applicability. Existing approaches are often not widely interoperable, or have low sensitivity due to reliance on the lowest common denominator (ICD-9 diagnoses). In the Scalable Collaborative Infrastructure for a Learning Healthcare System (SCILHS) we endeavor to use the widely available Current Procedural Terminology (CPT) procedure codes with ICD-9. Unfortunately, CPT changes drastically year-to-year - codes are retired/replaced. Longitudinal analysis requires grouping retired and current codes. BioPortal provides a navigable CPT hierarchy, which we imported into the Informatics for Integrating Biology and the Bedside (i2b2) data warehouse and analytics platform. However, this hierarchy does not include retired codes. We compared BioPortal's 2014AA CPT hierarchy with Partners Healthcare's SCILHS datamart, comprising three million patients' data over 15 years. 573 CPT codes were not present in 2014AA (6.5 million occurrences). No existing terminology provided hierarchical linkages for these missing codes, so we developed a method that automatically places missing codes in the most specific "grouper" category, using the numerical similarity of CPT codes. Two informaticians reviewed the results. We incorporated the final table into our i2b2 SCILHS/PCORnet ontology, deployed it at seven sites, and performed a gap analysis and an evaluation against several phenotyping algorithms. The reviewers found the method placed the code correctly with 97 % precision when considering only miscategorizations ("correctness precision") and 52 % precision using a gold standard of optimal placement ("optimality precision"). High correctness precision meant that codes were placed in a reasonable hierarchical position that a reviewer can quickly validate. Lower optimality precision meant that codes were not often placed in the optimal hierarchical subfolder. The seven sites encountered few occurrences of codes outside our ontology, 93 % of which comprised just four codes. Our hierarchical approach correctly grouped retired and non-retired codes in most cases and extended the temporal reach of several important phenotyping algorithms. We developed a simple, easily validated, automated method to place retired CPT codes into the BioPortal CPT hierarchy. This complements existing hierarchical terminologies, which do not include retired codes. The approach's utility is confirmed by the high correctness precision and successful grouping of retired with non-retired codes.
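
    A hedged sketch of the placement idea (our illustration, not the authors' code): a retired code is assigned to the most specific numeric range that contains it, falling back to the numerically nearest range otherwise. The grouper ranges below are invented.

      # Hedged sketch: place a retired CPT code in the most specific numeric
      # range ("grouper") that contains it; ranges below are invented.
      def place_code(code, groupers):
          containing = [g for g, r in groupers.items() if code in r]
          if containing:
              return min(containing, key=lambda g: len(groupers[g]))  # most specific
          nearest = lambda r: min(abs(code - r.start), abs(code - (r.stop - 1)))
          return min(groupers, key=lambda g: nearest(groupers[g]))    # fallback

      groupers = {"Surgery/Musculoskeletal": range(20000, 30000),    # hypothetical
                  "Surgery/Spine":           range(22000, 23000)}
      print(place_code(22840, groupers))   # -> Surgery/Spine (smallest match)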

  5. An Efficient Method for Verifying Gyrokinetic Microstability Codes

    NASA Astrophysics Data System (ADS)

    Bravenec, R.; Candy, J.; Dorland, W.; Holland, C.

    2009-11-01

    Benchmarks for gyrokinetic microstability codes can be developed through successful "apples-to-apples" comparisons among them. Unlike previous efforts, we perform the comparisons for actual discharges, rendering the verification efforts relevant to existing experiments and future devices (ITER). The process requires i) assembling the experimental analyses at multiple times, radii, discharges, and devices, ii) creating the input files ensuring that the input parameters are faithfully translated code-to-code, iii) running the codes, and iv) comparing the results, all in an organized fashion. The purpose of this work is to automate this process as much as possible: At present, a Python routine is used to generate and organize GYRO input files from TRANSP or ONETWO analyses. Another routine translates the GYRO input files into GS2 input files. (Translation software for other codes has not yet been written.) Other Python codes submit the multiple GYRO and GS2 jobs, organize the results, and collect them into a table suitable for plotting. (These separate Python routines could easily be consolidated.) An example of the process -- a linear comparison between GYRO and GS2 for a DIII-D discharge at multiple radii -- will be presented.
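
    A minimal sketch of the translation step, under the simplifying assumption (ours, not the authors') that both codes accept flat name = value input files differing mainly in parameter names; the name map is illustrative, not the actual GYRO/GS2 dictionary.

      # Hedged sketch: map recognized GYRO-style "name = value" lines to GS2
      # names and flag the rest for hand review. The map is illustrative only.
      NAME_MAP = {"SAFETY_FACTOR": "qinp",   # hypothetical name pairs
                  "SHEAR": "shat",
                  "RADIUS": "rhoc"}

      def translate(gyro_text):
          out = []
          for line in gyro_text.splitlines():
              if "=" not in line:
                  out.append(line)
                  continue
              name, value = (s.strip() for s in line.split("=", 1))
              out.append(f"{NAME_MAP[name]} = {value}" if name in NAME_MAP
                         else f"! UNMAPPED: {name} = {value}")
          return "\n".join(out)

      print(translate("SAFETY_FACTOR = 1.4\nSHEAR = 0.8\nNU_EI = 0.1"))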

  6. Comparison of DAC and MONACO DSMC Codes with Flat Plate Simulation

    NASA Technical Reports Server (NTRS)

    Padilla, Jose F.

    2010-01-01

    Various implementations of the direct simulation Monte Carlo (DSMC) method exist in academia, government and industry. By comparing implementations, deficiencies and merits of each can be discovered. This document reports comparisons between DSMC Analysis Code (DAC) and MONACO. DAC is NASA's standard DSMC production code and MONACO is a research DSMC code developed in academia. These codes have various differences; in particular, they employ distinct computational grid definitions. In this study, DAC and MONACO are compared by having each simulate a blunted flat plate wind tunnel test, using an identical volume mesh. Simulation expense and DSMC metrics are compared. In addition, flow results are compared with available laboratory data. Overall, this study revealed that both codes, excluding grid adaptation, performed similarly. For parallel processing, DAC was generally more efficient. As expected, code accuracy was mainly dependent on physical models employed.

  7. Off-Range Beaked Whale Studies (ORBS): Baseline Data and Tagging Development for Northern Bottlenose Whales (Hyperoodon ampulatus) off Jan Mayen, Norway

    DTIC Science & Technology

    2015-09-30

    02.003’N, 07°01.981’W). To be recovered in 2016; ranging code #08D1, releasing code #0803, in collaboration with Rune Hansen of the University of...the animal with PTT 134760 was tracked moving all the way south to the Azores Archipelago. Figure courtesy of Rune Hansen. Objective 4: conduct

  8. Whole Device Modeling of Compact Tori: Stability and Transport Modeling of C-2W

    NASA Astrophysics Data System (ADS)

    Dettrick, Sean; Fulton, Daniel; Lau, Calvin; Lin, Zhihong; Ceccherini, Francesco; Galeotti, Laura; Gupta, Sangeeta; Onofri, Marco; Tajima, Toshiki; TAE Team

    2017-10-01

    Recent experimental evidence from the C-2U FRC experiment shows that the confinement of energy improves with inverse collisionality, similar to other high beta toroidal devices, NSTX and MAST. This motivated the construction of a new FRC experiment, C-2W, to study the energy confinement scaling at higher electron temperature. Tri Alpha Energy is working towards catalysing a community-wide collaboration to develop a Whole Device Model (WDM) of Compact Tori. One application of the WDM is the study of stability and transport properties of C-2W using two particle-in-cell codes, ANC and FPIC. These codes can be used to find new stable operating points, and to make predictions of the turbulent transport at those points. They will be used in collaboration with the C-2W experimental program to validate the codes against C-2W, mitigate experimental risk inherent in the exploration of new parameter regimes, accelerate the optimization of experimental operating scenarios, and to find operating points for future FRC reactor designs.

  9. FY2017 Updates to the SAS4A/SASSYS-1 Safety Analysis Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fanning, T. H.

    The SAS4A/SASSYS-1 safety analysis software is used to perform deterministic analysis of anticipated events as well as design-basis and beyond-design-basis accidents for advanced fast reactors. It plays a central role in the analysis of U.S. DOE conceptual designs, proposed test and demonstration reactors, and in domestic and international collaborations. This report summarizes the code development activities that have taken place during FY2017. Extensions to the void and cladding reactivity feedback models have been implemented, and Control System capabilities have been improved through a new virtual data acquisition system for plant state variables and an additional Block Signal for a variable lag compensator to represent reactivity feedback for novel shutdown devices. Current code development and maintenance needs are also summarized in three key areas: software quality assurance, modeling improvements, and maintenance of related tools. With ongoing support, SAS4A/SASSYS-1 can continue to fulfill its growing role in fast reactor safety analysis and help solidify DOE’s leadership role in fast reactor safety both domestically and in international collaborations.

  10. Comparisons of 'Identical' Simulations by the Eulerian Gyrokinetic Codes GS2 and GYRO

    NASA Astrophysics Data System (ADS)

    Bravenec, R. V.; Ross, D. W.; Candy, J.; Dorland, W.; McKee, G. R.

    2003-10-01

    A major goal of the fusion program is to be able to predict tokamak transport from first-principles theory. To this end, the Eulerian gyrokinetic code GS2 was developed years ago and continues to be improved [1]. Recently, the Eulerian code GYRO was developed [2]. These codes are not subject to the statistical noise inherent to particle-in-cell (PIC) codes, and have been very successful in treating electromagnetic fluctuations. GS2 is fully spectral in the radial coordinate while GYRO uses finite-differences and "banded" spectral schemes. To gain confidence in nonlinear simulations of experiment with these codes, "apples-to-apples" comparisons (identical profile inputs, flux-tube geometry, two species, etc.) are first performed. We report on a series of linear and nonlinear comparisons (with overall agreement) including kinetic electrons, collisions, and shaped flux surfaces. We also compare nonlinear simulations of a DIII-D discharge to measurements of not only the fluxes but also the turbulence parameters. [1] F. Jenko, et al., Phys. Plasmas 7, 1904 (2000) and refs. therein. [2] J. Candy, J. Comput. Phys. 186, 545 (2003).

  11. FDA adverse Event Problem Codes: standardizing the classification of device and patient problems associated with medical device use.

    PubMed

    Reed, Terrie L; Kaufman-Rivi, Diana

    2010-01-01

    The broad array of medical devices and the potential for device failures, malfunctions, and other adverse events associated with each device create a challenge for public health device surveillance programs. Coding reported events by type of device problem provides one method for identifying a potential signal of a larger device issue. The Food and Drug Administration's (FDA) Center for Devices and Radiological Health (CDRH) Event Problem Codes used to report adverse events previously lacked a structured set of controls for code development and maintenance. Over time this led to inconsistent, ambiguous, and duplicative concepts being added to the code set on an ad-hoc basis. Recognizing the limitation of its coding system, the FDA set out to update the system to improve its usefulness within FDA and as a basis of a global standard to identify important patient and device outcomes throughout the medical community. In 2004, FDA and the National Cancer Institute (NCI) signed a Memorandum of Understanding (MOU) whereby NCI agreed to provide terminology development and maintenance services to all FDA Centers. Under this MOU, CDRH's Office of Surveillance and Biometrics (OSB) convened a cross-Center workgroup and collaborated with staff at NCI Enterprise Vocabulary Service (EVS) to streamline the Patient and Device Problem Codes and integrate them into the NCI Thesaurus and Meta-Thesaurus. This initiative included many enhancements to the Event Problem Codes aimed at improving code selection as well as improving adverse event report analysis. LIMITATIONS & RECOMMENDATIONS: Staff resources, database concerns, and limited collaboration with external groups in the initial phases of the project are discussed. Adverse events associated with medical device use can be better understood when they are reported using a consistent and well-defined code set. This FDA initiative was an attempt to improve the structure and add control mechanisms to an existing code set, improve analysis tools that will better identify device safety trends, and improve the ability to prevent or mitigate effects of adverse events associated with medical device use.

  12. A validation of LTRAN2 with high frequency extensions by comparisons with experimental measurements of unsteady transonic flows

    NASA Technical Reports Server (NTRS)

    Hessenius, K. A.; Goorjian, P. M.

    1981-01-01

    A high-frequency extension of the unsteady, transonic code LTRAN2 was created and is evaluated by comparisons with experimental results. The experimental test case is a NACA 64A010 airfoil in pitching motion at a Mach number of 0.8 over a range of reduced frequencies. Comparisons indicate that the modified code is an improvement over the original LTRAN2 and provides closer agreement with experimental lift and moment coefficients. A discussion of the code modifications, which involve the addition of high-frequency terms to the boundary conditions of the numerical algorithm, is included.

  13. Comparison of two- and three-dimensional flow computations with laser anemometer measurements in a transonic compressor rotor

    NASA Technical Reports Server (NTRS)

    Chima, R. V.; Strazisar, A. J.

    1982-01-01

    Two- and three-dimensional inviscid solutions for the flow in a transonic axial compressor rotor at design speed are compared with probe and laser anemometer measurements at near-stall and maximum-flow operating points. Experimental details of the laser anemometer system and computational details of the two-dimensional axisymmetric code and three-dimensional Euler code are described. Comparisons are made between relative Mach number and flow angle contours, shock location, and shock strength. A procedure for using an efficient axisymmetric code to generate downstream pressure input for computationally expensive Euler codes is discussed. A film supplement shows the calculations for the two operating points with the time-marching Euler code.

  14. Assessment and Mitigation of Radiation, EMP, Debris & Shrapnel Impacts at Megajoule-Class Laser Facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eder, D C; Anderson, R W; Bailey, D S

    2009-10-05

    The generation of neutron/gamma radiation, electromagnetic pulses (EMP), debris and shrapnel at megajoule-class laser facilities (NIF and LMJ) impacts experiments conducted at these facilities. The complex 3D numerical codes used to assess these impacts range from an established code that required minor modifications (MCNP - calculates neutron and gamma radiation levels in complex geometries), through a code that required significant modifications to treat new phenomena (EMSolve - calculates EMP from electrons escaping from laser targets), to a new code, ALE-AMR, that is being developed through a joint collaboration between LLNL, CEA, and UC (UCSD, UCLA, and LBL) for debris and shrapnel modelling.

  15. There is no MacWilliams identity for convolutional codes. [transmission gain comparison

    NASA Technical Reports Server (NTRS)

    Shearer, J. B.; Mceliece, R. J.

    1977-01-01

    An example is provided of two convolutional codes that have the same transmission gain but whose dual codes do not. This shows that no analog of the MacWilliams identity for block codes can exist relating the transmission gains of a convolutional code and its dual.

  16. Error control techniques for satellite and space communications

    NASA Technical Reports Server (NTRS)

    Costello, Daniel J., Jr.

    1991-01-01

    Shannon's capacity bound shows that coding can achieve large reductions in the required signal-to-noise ratio per information bit, E_b/N_0 (where E_b is the energy per bit and N_0/2 is the double-sided noise density), in comparison to uncoded schemes. For bandwidth efficiencies of 2 bit/sym or greater, these improvements were obtained through the use of Trellis Coded Modulation and Block Coded Modulation. A method of obtaining these high efficiencies using multidimensional Multiple Phase Shift Keying (MPSK) and Quadrature Amplitude Modulation (QAM) signal sets with trellis coding is described. These schemes have advantages in decoding speed, phase transparency, and coding gain in comparison to other trellis coding schemes. Finally, a general parity check equation for rotationally invariant trellis codes is introduced, from which non-linear codes for two-dimensional MPSK and QAM signal sets are found. These codes are fully transparent to all rotations of the signal set.
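
    A worked check of the Shannon limit invoked above: for spectral efficiency eta (bits per symbol), capacity requires E_b/N_0 >= (2^eta - 1)/eta, so at 2 bit/sym the limit is about 1.76 dB.

      # Shannon limit on Eb/N0 as a function of spectral efficiency eta.
      import math

      def min_ebn0_db(eta):
          return 10 * math.log10((2 ** eta - 1) / eta)

      for eta in (1, 2, 3, 4):
          print(f"eta = {eta} bit/sym: Eb/N0 >= {min_ebn0_db(eta):5.2f} dB")
      # eta = 2 gives about 1.76 dB; uncoded QPSK needs roughly 9.6 dB at a
      # bit error rate of 1e-5, which is the gap coded modulation pursues.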

  17. PFLOTRAN-RepoTREND Source Term Comparison Summary.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frederick, Jennifer M.

    Code inter-comparison studies are useful exercises to verify and benchmark independently developed software to ensure proper function, especially when the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment. This summary describes the results of the first portion of the code inter-comparison between PFLOTRAN and RepoTREND, which compares the radionuclide source term used in a typical performance assessment.

  18. Collaborative Data Mining

    NASA Astrophysics Data System (ADS)

    Moyle, Steve

    Collaborative Data Mining is a setting where the Data Mining effort is distributed to multiple collaborating agents - human or software. The objective of the collaborative Data Mining effort is to produce solutions to the tackled Data Mining problem that are better, by some metric, than the solutions that would have been achieved by individual, non-collaborating agents. The solutions require evaluation, comparison, and approaches for combination. Collaboration requires communication, and implies some form of community. The human form of collaboration is a social task. Organizing communities in an effective manner is non-trivial and often requires well-defined roles and processes. Data Mining, too, benefits from a standard process. This chapter explores the standard Data Mining process CRISP-DM utilized in a collaborative setting.

  19. A Stigmergy Approach for Open Source Software Developer Community Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Xiaohui; Beaver, Justin M; Potok, Thomas E

    2009-01-01

    The stigmergy collaboration approach provides a hypothesized explanation of how online groups work together. In this research, we presented a stigmergy approach for building an agent-based open source software (OSS) developer community collaboration simulation. We used groups of actors who collaborate on OSS projects as our frame of reference and investigated how the choices actors make in contributing their work to the projects determine the global status of the whole OSS projects. In our simulation, the forum posts and project codes served as the digital pheromone, and a modified Pierre-Paul Grassé pheromone model is used for computing developer agent behavior selection probability.
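
    A minimal sketch of a pheromone-driven behaviour-selection rule of the kind described, assuming the classic stigmergic form p_i ∝ τ_i^α; the constants and update rule are illustrative, not the paper's actual Grassé model:

    ```python
    import random

    def choose_project(pheromone, alpha=1.0):
        """Pick a project index with probability proportional to
        (pheromone level) ** alpha -- a classic stigmergic choice rule."""
        weights = [tau ** alpha for tau in pheromone]
        return random.choices(range(len(pheromone)), weights=weights)[0]

    def update_pheromone(pheromone, chosen, deposit=1.0, evaporation=0.1):
        """Evaporate every trail, then deposit on the chosen project;
        here a forum post or commit plays the role of the deposit."""
        return [tau * (1.0 - evaporation) + (deposit if i == chosen else 0.0)
                for i, tau in enumerate(pheromone)]

    # toy run: three OSS projects with equal initial attractiveness
    tau = [1.0, 1.0, 1.0]
    for _ in range(200):
        tau = update_pheromone(tau, choose_project(tau))
    print(tau)  # activity typically concentrates on one project
    ```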

  20. Enabling a Scientific Cloud Marketplace: VGL (Invited)

    NASA Astrophysics Data System (ADS)

    Fraser, R.; Woodcock, R.; Wyborn, L. A.; Vote, J.; Rankine, T.; Cox, S. J.

    2013-12-01

    The Virtual Geophysics Laboratory (VGL) provides a flexible, web based environment where researchers can browse data and use a variety of scientific software packaged into tool kits that run in the Cloud. Both data and tool kits are published by multiple researchers and registered with the VGL infrastructure forming a data and application marketplace. The VGL provides the basic work flow of Discovery and Access to the disparate data sources and a Library for tool kits and scripting to drive the scientific codes. Computation is then performed on the Research or Commercial Clouds. Provenance information is collected throughout the work flow and can be published alongside the results allowing for experiment comparison and sharing with other researchers. VGL's "mix and match" approach to data, computational resources and scientific codes, enables a dynamic approach to scientific collaboration. VGL allows scientists to publish their specific contribution, be it data, code, compute or work flow, knowing the VGL framework will provide other components needed for a complete application. Other scientists can choose the pieces that suit them best to assemble an experiment. The coarse grain workflow of the VGL framework combined with the flexibility of the scripting library and computational toolkits allows for significant customisation and sharing amongst the community. The VGL utilises the cloud computational and storage resources from the Australian academic research cloud provided by the NeCTAR initiative and a large variety of data accessible from national and state agencies via the Spatial Information Services Stack (SISS - http://siss.auscope.org). VGL v1.2 screenshot - http://vgl.auscope.org

  1. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  2. Digital video technologies and their network requirements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. P. Tsang; H. Y. Chen; J. M. Brandt

    1999-11-01

    Coded digital video signals are considered to be one of the most difficult data types to transport due to their real-time requirements and high bit rate variability. In this study, the authors discuss the coding mechanisms incorporated by the major compression standards bodies, i.e., JPEG and MPEG, as well as more advanced coding mechanisms such as wavelet and fractal techniques. The relationship between the applications which use these coding schemes and their network requirements is the major focus of this study. Specifically, the authors relate network latency, channel transmission reliability, random access speed, buffering and network bandwidth with the various coding techniques as a function of the applications which use them. Such applications include High-Definition Television, Video Conferencing, Computer-Supported Collaborative Work (CSCW), and Medical Imaging.

  3. International collaboration in medical radiation science.

    PubMed

    Denham, Gary; Allen, Carla; Platt, Jane

    2016-06-01

    International collaboration is recognised for enhancing the ability to approach complex problems from a variety of perspectives, increasing development of a wider range of research skills and techniques and improving publication and acceptance rates. The aim of this paper is to describe the current status of international collaboration in medical radiation science and compare this to other allied health occupations. This study utilised a content analysis approach where co-authorship of a journal article was used as a proxy for research collaboration and the papers were assigned to countries based on the corporate address given in the by-line of the publication. A convenience sampling method was employed: articles published in the professional medical radiation science journals of the countries represented within our research team - Australia, the United Kingdom (UK) and the United States of America (USA) - were sampled. Physiotherapy, speech pathology, occupational therapy and nursing were chosen for comparison. Rates of international collaboration in medical radiation science journals from Australia, the UK and the USA have steadily increased over the 3-year period sampled. Medical radiation science demonstrated lower average rates of international collaboration than the other allied health occupations sampled. The average rate of international collaboration in nursing was far below that of the allied health occupations sampled. Overall, the UK had the highest average rate of international collaboration, followed by Australia, with the USA the lowest. Overall, medical radiation science is lagging in international collaboration in comparison to other allied health fields.

  4. Validation of the analytical methods in the LWR code BOXER for gadolinium-loaded fuel pins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paratte, J.M.; Arkuszewski, J.J.; Kamboj, B.K.

    1990-01-01

    Due to the very high absorption occurring in gadolinium-loaded fuel pins, calculations of lattices with such pins present are a demanding test of the analysis methods in light water reactor (LWR) cell and assembly codes. Considerable effort has, therefore, been devoted to the validation of code methods for gadolinia fuel. The goal of the work reported in this paper is to check the analysis methods in the LWR cell/assembly code BOXER and its associated cross-section processing code ETOBOX, by comparison of BOXER results with those from a very accurate Monte Carlo calculation for a gadolinium benchmark problem. Initial results of such a comparison have been previously reported. However, the Monte Carlo calculations, done with the MCNP code, were performed at Los Alamos National Laboratory using ENDF/B-V data, while the BOXER calculations were performed at the Paul Scherrer Institute (PSI) using JEF-1 nuclear data. This difference in the basic nuclear data used for the two calculations, caused by the restricted nature of these evaluated data files, led to associated uncertainties in a comparison of the results for methods validation. In the joint investigations at the Georgia Institute of Technology and PSI, such uncertainty in this comparison was eliminated by using ENDF/B-V data for BOXER calculations at Georgia Tech.

  5. City Reach Code Technical Support Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Athalye, Rahul A.; Chen, Yan; Zhang, Jian

    This report describes and analyzes a set of energy efficiency measures that will save 20% energy over ASHRAE Standard 90.1-2013. The measures will be used to formulate a Reach Code for cities aiming to go beyond national model energy codes. A coalition of U.S. cities together with other stakeholders wanted to facilitate the development of voluntary guidelines and standards that can be implemented in stages at the city level to improve building energy efficiency. The coalition's efforts are being supported by the U.S. Department of Energy via Pacific Northwest National Laboratory (PNNL) and in collaboration with the New Buildings Institute.

  6. Flowgen: Flowchart-based documentation for C++ codes

    NASA Astrophysics Data System (ADS)

    Kosower, David A.; Lopez-Villarejo, J. J.

    2015-11-01

    We present the Flowgen tool, which generates flowcharts from annotated C++ source code. The tool generates a set of interconnected high-level UML activity diagrams, one for each function or method in the C++ sources. It provides a simple and visual overview of complex implementations of numerical algorithms. Flowgen is complementary to the widely-used Doxygen documentation tool. The ultimate aim is to render complex C++ computer codes accessible, and to enhance collaboration between programmers and algorithm or science specialists. We describe the tool and a proof-of-concept application to the VINCIA plug-in for simulating collisions at CERN's Large Hadron Collider.

  7. NPS-NRL-Rice-UIUC Collaboration on Navy Atmosphere-Ocean Coupled Models on Many-Core Computer Architectures Annual Report

    DTIC Science & Technology

    2015-09-30

    DISTRIBUTION STATEMENT A: Distribution approved for public release; distribution is unlimited. NPS-NRL-Rice-UIUC Collaboration on Navy Atmosphere... portability. There is still a gap in the OCCA support for Fortran programmers who do not have accelerator experience. Activities at Rice/Virginia Tech are... for automated data movement and for kernel optimization using source code analysis and run-time detective work. In this quarter the Rice/Virginia

  8. Problem-Solving Phase Transitions During Team Collaboration.

    PubMed

    Wiltshire, Travis J; Butner, Jonathan E; Fiore, Stephen M

    2018-01-01

    Multiple theories of problem-solving hypothesize that there are distinct qualitative phases exhibited during effective problem-solving. However, limited research has attempted to identify when transitions between phases occur. We integrate theory on collaborative problem-solving (CPS) with dynamical systems theory suggesting that when a system is undergoing a phase transition it should exhibit a peak in entropy and that entropy levels should also relate to team performance. Communications from 40 teams that collaborated on a complex problem were coded for occurrence of problem-solving processes. We applied a sliding window entropy technique to each team's communications and specified criteria for (a) identifying data points that qualify as peaks and (b) determining which peaks were robust. We used multilevel modeling, and provide a qualitative example, to evaluate whether phases exhibit distinct distributions of communication processes. We also tested whether there was a relationship between entropy values at transition points and CPS performance. We found that a proportion of entropy peaks was robust and that the relative occurrence of communication codes varied significantly across phases. Peaks in entropy thus corresponded to qualitative shifts in teams' CPS communications, providing empirical evidence that teams exhibit phase transitions during CPS. Also, lower average levels of entropy at the phase transition points predicted better CPS performance. We specify future directions to improve understanding of phase transitions during CPS, and collaborative cognition, more broadly. Copyright © 2017 Cognitive Science Society, Inc.
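
    A minimal sketch of a sliding-window entropy computation of the kind described; window size, step, and the toy code sequence are illustrative, and the authors' peak-detection criteria are not reproduced:

    ```python
    import math
    from collections import Counter

    def shannon_entropy(window):
        """Shannon entropy (bits) of the communication-code distribution
        within one window."""
        counts = Counter(window)
        n = len(window)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    def sliding_entropy(codes, window=20, step=1):
        """Entropy over a sliding window; peaks are candidate
        phase-transition points."""
        return [shannon_entropy(codes[i:i + window])
                for i in range(0, len(codes) - window + 1, step)]

    # toy sequence: a shift from planning codes to executing codes
    msgs = ["plan"] * 30 + ["plan", "exec"] * 10 + ["exec"] * 30
    series = sliding_entropy(msgs)
    print(max(series))  # entropy peaks in the mixed, transitional region
    ```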

  9. The influence of power dynamics and trust on multidisciplinary collaboration: a qualitative case study of type 2 diabetes mellitus

    PubMed Central

    2012-01-01

    Background: Ongoing care for chronic conditions such as diabetes is best provided by a range of health professionals working together. There are challenges in achieving this where collaboration crosses organisational and sector boundaries. The aim of this article is to explore the influence of power dynamics and trust on collaboration between health professionals involved in the management of diabetes and their impact on patient experiences. Methods: A qualitative case study conducted in a rural city in Australia. Forty five health service providers from nineteen organisations (including fee-for-service practices and block funded public sector services) and eight patients from two services were purposively recruited. Data was collected through semi-structured interviews that were audio-taped and transcribed. A thematic analysis approach was used using a two-level coding scheme and cross-case comparisons. Results: Three themes emerged in relation to power dynamics between health professionals: their use of power to protect their autonomy, power dynamics between private and public sector providers, and reducing their dependency on other health professionals to maintain their power. Despite the intention of government policies to support more shared decision-making, there is little evidence that this is happening. The major trust themes related to role perceptions, demonstrated competence, and the importance of good communication for the development of trust over time. The interaction between trust and role perceptions went beyond understanding each other's roles and professional identity. The level of trust related to the acceptance of each other's roles. The delivery of primary and community-based health services that crosses organisational boundaries adds a layer of complexity to interprofessional relationships. The roles of and role boundaries between and within professional groups and services are changing. The uncertainty and vulnerability associated with these changes has affected the level of trust and mistrust. Conclusions: Collaboration across organisational boundaries remains challenging. Power dynamics and trust affect the strategic choices made by each health professional about whether to collaborate, with whom, and to what level. These decisions directly influenced patient experiences. Unlike the difficulties in shifting the balance of power in interprofessional relationships, trust and respect can be fostered through a mix of interventions aimed at building personal relationships and establishing agreed rules that govern collaborative care and that are perceived as fair. PMID:22413897

  10. The influence of power dynamics and trust on multidisciplinary collaboration: a qualitative case study of type 2 diabetes mellitus.

    PubMed

    McDonald, Julie; Jayasuriya, Rohan; Harris, Mark Fort

    2012-03-13

    Ongoing care for chronic conditions such as diabetes is best provided by a range of health professionals working together. There are challenges in achieving this where collaboration crosses organisational and sector boundaries. The aim of this article is to explore the influence of power dynamics and trust on collaboration between health professionals involved in the management of diabetes and their impact on patient experiences. A qualitative case study conducted in a rural city in Australia. Forty five health service providers from nineteen organisations (including fee-for-service practices and block funded public sector services) and eight patients from two services were purposively recruited. Data was collected through semi-structured interviews that were audio-taped and transcribed. A thematic analysis approach was used using a two-level coding scheme and cross-case comparisons. Three themes emerged in relation to power dynamics between health professionals: their use of power to protect their autonomy, power dynamics between private and public sector providers, and reducing their dependency on other health professionals to maintain their power. Despite the intention of government policies to support more shared decision-making, there is little evidence that this is happening. The major trust themes related to role perceptions, demonstrated competence, and the importance of good communication for the development of trust over time. The interaction between trust and role perceptions went beyond understanding each other's roles and professional identity. The level of trust related to the acceptance of each other's roles. The delivery of primary and community-based health services that crosses organisational boundaries adds a layer of complexity to interprofessional relationships. The roles of and role boundaries between and within professional groups and services are changing. The uncertainty and vulnerability associated with these changes has affected the level of trust and mistrust. Collaboration across organisational boundaries remains challenging. Power dynamics and trust affect the strategic choices made by each health professional about whether to collaborate, with whom, and to what level. These decisions directly influenced patient experiences. Unlike the difficulties in shifting the balance of power in interprofessional relationships, trust and respect can be fostered through a mix of interventions aimed at building personal relationships and establishing agreed rules that govern collaborative care and that are perceived as fair.

  11. The Advanced Software Development and Commercialization Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallopoulos, E.; Canfield, T.R.; Minkoff, M.

    1990-09-01

    This is the first of a series of reports pertaining to progress in the Advanced Software Development and Commercialization Project, a joint collaborative effort between the Center for Supercomputing Research and Development of the University of Illinois and the Computing and Telecommunications Division of Argonne National Laboratory. The purpose of this work is to apply techniques of parallel computing that were pioneered by University of Illinois researchers to mature computational fluid dynamics (CFD) and structural dynamics (SD) computer codes developed at Argonne. The collaboration in this project will bring this unique combination of expertise to bear, for the first time, on industrially important problems. By so doing, it will expose the strengths and weaknesses of existing techniques for parallelizing programs and will identify those problems that need to be solved in order to enable widespread production use of parallel computers. Secondly, the increased efficiency of the CFD and SD codes themselves will enable the simulation of larger, more accurate engineering models that involve fluid and structural dynamics. In order to realize the above two goals, we are considering two production codes that have been developed at ANL and are widely used by both industry and universities. These are COMMIX and WHAMS-3D. The first is a computational fluid dynamics code that is used for both nuclear reactor design and safety and as a design tool for the casting industry. The second is a three-dimensional structural dynamics code used in nuclear reactor safety as well as crashworthiness studies. These codes are currently available for both sequential and vector computers only. Our main goal is to port and optimize these two codes on shared memory multiprocessors. In so doing, we shall establish a process that can be followed in optimizing other sequential or vector engineering codes for parallel processors.

  12. Identification and Analysis of Critical Gaps in Nuclear Fuel Cycle Codes Required by the SINEMA Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adrian Miron; Joshua Valentine; John Christenson

    2009-10-01

    The current state of the art in nuclear fuel cycle (NFC) modeling is an eclectic mixture of codes with various levels of applicability, flexibility, and availability. In support of the advanced fuel cycle systems analyses, especially those by the Advanced Fuel Cycle Initiative (AFCI), the University of Cincinnati in collaboration with Idaho State University carried out a detailed review of the existing codes describing various aspects of the nuclear fuel cycle and identified the research and development needs required for a comprehensive model of the global nuclear energy infrastructure and the associated nuclear fuel cycles. Relevant information obtained on the NFC codes was compiled into a relational database that allows easy access to various codes' properties. Additionally, the research analyzed the gaps in the NFC computer codes with respect to their potential integration into programs that perform comprehensive NFC analysis.

  13. DRA/NASA/ONERA Collaboration on Icing Research. Part 2; Prediction of Airfoil Ice Accretion

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Gent, R. W.; Guffond, Didier

    1997-01-01

    This report presents results from a joint study by DRA, NASA, and ONERA for the purpose of comparing, improving, and validating the aircraft icing computer codes developed by each agency. These codes are of three kinds: (1) water droplet trajectory prediction, (2) ice accretion modeling, and (3) transient electrothermal deicer analysis. In this joint study, the agencies compared their code predictions with each other and with experimental results. These comparison exercises were published in three technical reports, each with joint authorship. DRA published and had first authorship of Part 1 - Droplet Trajectory Calculations, NASA of Part 2 - Ice Accretion Prediction, and ONERA of Part 3 - Electrothermal Deicer Analysis. The results cover work done during the period from August 1986 to late 1991. As a result, all of the information in this report is dated. Where necessary, current information is provided to show the direction of current research. In this present report on ice accretion, each agency predicted ice shapes on two-dimensional airfoils under icing conditions for which experimental ice shapes were available. In general, all three codes did a reasonable job of predicting the measured ice shapes. For any given experimental condition, one of the three codes predicted the general ice features (i.e., shape, impingement limits, mass of ice) somewhat better than did the other two. However, no single code consistently did better than the other two over the full range of conditions examined, which included rime, mixed, and glaze ice conditions. In several of the cases, DRA showed that the user's knowledge of icing can significantly improve the accuracy of the code prediction. Rime ice predictions were reasonably accurate and consistent among the codes, because droplets freeze on impact and the freezing model is simple. Glaze ice predictions were less accurate and less consistent among the codes, because the freezing model is more complex and is critically dependent upon unsubstantiated heat transfer and surface roughness models. Thus, heat transfer prediction methods used in the codes became the subject for a separate study in this report to compare predicted heat transfer coefficients with a limited experimental database of heat transfer coefficients for cylinders with simulated glaze and rime ice shapes. The codes did a good job of predicting heat transfer coefficients near the stagnation region of the ice shapes. But in the region of the ice horns, all three codes predicted heat transfer coefficients considerably higher than the measured values. An important conclusion of this study is that further research is needed to understand the finer details of the glaze ice accretion process and to develop improved glaze ice accretion models.

  14. Comparison of Effectiveness of Collaborative Learning Methods and Traditional Methods in Physics Classes at Northern Maine Technical College.

    ERIC Educational Resources Information Center

    Overlock, Terrence H., Sr.

    To determine the effect of collaborative learning methods on the success rate of physics students at Northern Maine Technical College (NMTC), a study was undertaken to compare the mean final exam scores of students in a physics course taught by traditional lecture/lab methods to those of a group taught by collaborative techniques. The…

  15. Louder than words: power and conflict in interprofessional education articles, 1954–2013

    PubMed Central

    Paradis, Elise; Whitehead, Cynthia R

    2015-01-01

    Context: Interprofessional education (IPE) aspires to enable collaborative practice. Current IPE offerings, although rapidly proliferating, lack evidence of efficacy and theoretical grounding. Objectives: Our research aimed to explore the historical emergence of the field of IPE and to analyse the positioning of this academic field of inquiry. In particular, we sought to investigate the extent to which power and conflict – elements central to interprofessional care – figure in the IPE literature. Methods: We used a combination of deductive and inductive automated coding and manual coding to explore the contents of 2191 articles in the IPE literature published between 1954 and 2013. Inductive coding focused on the presence and use of the sociological (rather than statistical) version of power, which refers to hierarchies and asymmetries among the professions. Articles found to be centrally about power were then analysed using content analysis. Results: Publications on IPE have grown exponentially in the past decade. Deductive coding of identified articles showed an emphasis on students, learning, programmes and practice. Automated inductive coding of titles and abstracts identified 129 articles potentially about power, but manual coding found that only six articles put power and conflict at the centre. Content analysis of these six articles revealed that two provided tentative explorations of power dynamics, one skirted around this issue, and three explicitly theorised and integrated power and conflict. Conclusions: The lack of attention to power and conflict in the IPE literature suggests that many educators do not foreground these issues. Education programmes are expected to transform individuals into effective collaborators, without heed to structural, organisational and institutional factors. In so doing, current constructions of IPE veil the problems that IPE attempts to solve. PMID:25800300

  16. Brain-CODE: A Secure Neuroinformatics Platform for Management, Federation, Sharing and Analysis of Multi-Dimensional Neuroscience Data

    PubMed Central

    Vaccarino, Anthony L.; Dharsee, Moyez; Strother, Stephen; Aldridge, Don; Arnott, Stephen R.; Behan, Brendan; Dafnas, Costas; Dong, Fan; Edgecombe, Kenneth; El-Badrawi, Rachad; El-Emam, Khaled; Gee, Tom; Evans, Susan G.; Javadi, Mojib; Jeanson, Francis; Lefaivre, Shannon; Lutz, Kristen; MacPhee, F. Chris; Mikkelsen, Jordan; Mikkelsen, Tom; Mirotchnick, Nicholas; Schmah, Tanya; Studzinski, Christa M.; Stuss, Donald T.; Theriault, Elizabeth; Evans, Kenneth R.

    2018-01-01

    Historically, research databases have existed in isolation with no practical avenue for sharing or pooling medical data into high dimensional datasets that can be efficiently compared across databases. To address this challenge, the Ontario Brain Institute’s “Brain-CODE” is a large-scale neuroinformatics platform designed to support the collection, storage, federation, sharing and analysis of different data types across several brain disorders, as a means to understand common underlying causes of brain dysfunction and develop novel approaches to treatment. By providing researchers access to aggregated datasets that they otherwise could not obtain independently, Brain-CODE incentivizes data sharing and collaboration and facilitates analyses both within and across disorders and across a wide array of data types, including clinical, neuroimaging and molecular. The Brain-CODE system architecture provides the technical capabilities to support (1) consolidated data management to securely capture, monitor and curate data, (2) privacy and security best-practices, and (3) interoperable and extensible systems that support harmonization, integration, and query across diverse data modalities and linkages to external data sources. Brain-CODE currently supports collaborative research networks focused on various brain conditions, including neurodevelopmental disorders, cerebral palsy, neurodegenerative diseases, epilepsy and mood disorders. These programs are generating large volumes of data that are integrated within Brain-CODE to support scientific inquiry and analytics across multiple brain disorders and modalities. By providing access to very large datasets on patients with different brain disorders and enabling linkages to provincial, national and international databases, Brain-CODE will help to generate new hypotheses about the biological bases of brain disorders, and ultimately promote new discoveries to improve patient care. PMID:29875648

  17. Louder than words: power and conflict in interprofessional education articles, 1954-2013.

    PubMed

    Paradis, Elise; Whitehead, Cynthia R

    2015-04-01

    Interprofessional education (IPE) aspires to enable collaborative practice. Current IPE offerings, although rapidly proliferating, lack evidence of efficacy and theoretical grounding. Our research aimed to explore the historical emergence of the field of IPE and to analyse the positioning of this academic field of inquiry. In particular, we sought to investigate the extent to which power and conflict - elements central to interprofessional care - figure in the IPE literature. We used a combination of deductive and inductive automated coding and manual coding to explore the contents of 2191 articles in the IPE literature published between 1954 and 2013. Inductive coding focused on the presence and use of the sociological (rather than statistical) version of power, which refers to hierarchies and asymmetries among the professions. Articles found to be centrally about power were then analysed using content analysis. Publications on IPE have grown exponentially in the past decade. Deductive coding of identified articles showed an emphasis on students, learning, programmes and practice. Automated inductive coding of titles and abstracts identified 129 articles potentially about power, but manual coding found that only six articles put power and conflict at the centre. Content analysis of these six articles revealed that two provided tentative explorations of power dynamics, one skirted around this issue, and three explicitly theorised and integrated power and conflict. The lack of attention to power and conflict in the IPE literature suggests that many educators do not foreground these issues. Education programmes are expected to transform individuals into effective collaborators, without heed to structural, organisational and institutional factors. In so doing, current constructions of IPE veil the problems that IPE attempts to solve. © 2015 The Authors Medical Education Published by John Wiley & Sons Ltd.

  18. Preparing healthcare students who participate in interprofessional education for interprofessional collaboration: A constructivist grounded theory study protocol.

    PubMed

    Bianchi, Monica; Bagnasco, Annamaria; Aleo, Giuseppe; Catania, Gianluca; Zanini, Milko Patrick; Timmins, Fiona; Carnevale, Franco; Sasso, Loredana

    2018-05-01

    This article presents a qualitative research protocol to explore and understand the interprofessional collaboration (IPC) preparation process implemented by clinical tutors and students of different professions involved in interprofessional education (IPE). Many studies have shown that IPE initiatives improve students' understanding of the roles and responsibilities of other professionals. This improves students' attitudes towards other professions, facilitating mutual respect and IPC. However, there is limited information about how students are prepared to work collaboratively within interprofessional teams. This is a constructivist grounded theory (GT) study, which will involve data collection through in-depth semi-structured interviews (with 9-15 students and 6-9 clinical tutors), participant observations, and the analysis of documentation. After analysing, coding, integrating, and comparing the data, a second round of interviews could be conducted if necessary to explore any particularly interesting aspects or clarify any issues. This will then be followed by focused and theoretical coding. Qualitative data analysis will be conducted with the support of NVivo 10 software (Victoria, Australia). A better conceptual understanding will help to determine whether IPE experiences have contributed to the acquisition of competencies considered important for IPC, and whether they have facilitated the development of teamwork attitudes.

  19. Manifestations of metacognitive activity during the collaborative planning of chemistry practical investigations

    NASA Astrophysics Data System (ADS)

    Mathabathe, Kgadi Clarrie; Potgieter, Marietjie

    2017-07-01

    This paper elaborates a process followed to characterise manifestations of cognitive regulation during the collaborative planning of chemistry practical investigations. Metacognitive activity was defined as the demonstration of planning, monitoring, control and evaluation of cognitive activities by students while carrying out the chemistry task. Inherent in collaborative learning is the social aspect of metacognition, which in this study was evidenced in social cognitive regulation (notably of intra- and interpersonal metacognitive regulations) as groups of students went about planning their practical investigations. Discussions of two of the learning groups (n = 4; n = 3) as they planned the extended practical investigation were recorded, transcribed and analysed for indicators of any inherent metacognitive activity. The process of characterising the manifestations of metacognition resulted in the development of a coding system which specifies not only the regulatory strategies at play but the type of regulation (self or other), the area of regulation (cognition, task performance or behaviour) as well as the depth of regulatory contributions (high or low). The fine-grained coding system allowed for a finer theoretical elucidation of the social nature of metacognition. The implications of this study for metacognition and chemistry education research are highlighted.

  20. Team Software Development for Aerothermodynamic and Aerodynamic Analysis and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, N.; Atkins, H. L.; Bibb, K. L.; Biedron, R. T.; Carpenter, M. H.; Gnoffo, P. A.; Hammond, D. P.; Jones, W. T.; Kleb, W. L.; Lee-Rausch, E. M.

    2003-01-01

    A collaborative approach to software development is described. The approach employs the agile development techniques: project retrospectives, Scrum status meetings, and elements of Extreme Programming to efficiently develop a cohesive and extensible software suite. The software product under development is a fluid dynamics simulator for performing aerodynamic and aerothermodynamic analysis and design. The functionality of the software product is achieved both through the merging, with substantial rewrite, of separate legacy codes and the authorship of new routines. Examples of rapid implementation of new functionality demonstrate the benefits obtained with this agile software development process. The appendix contains a discussion of coding issues encountered while porting legacy Fortran 77 code to Fortran 95, software design principles, and a Fortran 95 coding standard.

  1. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    NASA Astrophysics Data System (ADS)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.; Porter, D.; O'Neill, B. J.; Nolting, C.; Edmon, P.; Donnert, J. M. F.; Jones, T. W.

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.
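
    WOMBAT itself is Fortran, but the passive-target, one-sided MPI-RMA pattern it relies on can be sketched with mpi4py; this illustrates the communication style generally and is not WOMBAT's actual implementation (a simplified demo that ignores some memory-model subtleties):

    ```python
    # run with e.g.: mpiexec -n 4 python rma_demo.py
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    N = 8
    # each rank exposes a window of N doubles for one-sided access
    win = MPI.Win.Allocate(N * MPI.DOUBLE.Get_size(), comm=comm)
    local = np.frombuffer(win.tomemory(), dtype="d")
    local[:] = rank

    # one-sided update: write our rank into the right neighbour's window,
    # with no matching receive posted on the neighbour's side
    neighbour = (rank + 1) % size
    payload = np.full(N, float(rank))
    win.Lock(neighbour, MPI.LOCK_SHARED)
    win.Put(payload, neighbour)
    win.Unlock(neighbour)  # completes the transfer at the target

    comm.Barrier()  # crude global synchronisation for the demo
    print(rank, local[0])  # each rank now holds its left neighbour's value
    win.Free()
    ```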

  2. Collaborative Research and Development (CR&D) III Task Order 0090: Image Processing Framework: From Acquisition and Analysis to Archival Storage

    DTIC Science & Technology

    2013-05-01

    contract or a PhD dissertation typically are a "proof-of-concept" code base that can only read a single set of inputs and are not designed ... AFRL-RX-WP-TR-2013-0210 COLLABORATIVE RESEARCH AND DEVELOPMENT (CR&D) III Task Order 0090: Image Processing Framework: From... public release; distribution unlimited. See additional restrictions described on inside pages. STINFO COPY AIR FORCE RESEARCH LABORATORY

  3. Threshold quantum cryptography

    NASA Astrophysics Data System (ADS)

    Tokunaga, Yuuki; Okamoto, Tatsuaki; Imoto, Nobuyuki

    2005-01-01

    We present the concept of threshold collaborative unitary transformation, or threshold quantum cryptography, which is a kind of quantum version of threshold cryptography. In threshold quantum cryptography, classical shared secrets are distributed to several parties, and a subset of them, whose number is greater than a threshold, collaborates to compute a quantum cryptographic function while keeping each share secret within each party. The shared secrets are reusable if no cheating is detected. As a concrete example of this concept, we show a distributed protocol (with threshold) of conjugate coding.

  4. OHD/HL - National Weather Hydrology Laboratory

    Science.gov Websites


  5. Colour cyclic code for Brillouin distributed sensors

    NASA Astrophysics Data System (ADS)

    Le Floch, Sébastien; Sauser, Florian; Llera, Miguel; Rochat, Etienne

    2015-09-01

    For the first time, a colour cyclic coding (CCC) is theoretically and experimentally demonstrated for Brillouin optical time-domain analysis (BOTDA) distributed sensors. Compared to traditional intensity-modulated cyclic codes, the code presents an additional gain of √2 while keeping the same number of sequences as for a colour coding. A comparison with a standard BOTDA sensor is realized and validates the theoretical coding gain.
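
    As a hedged aid to the quoted numbers: taking the usual SNR gain of an intensity-modulated cyclic (Simplex) code of length L in BOTDA to be (L+1)/(2√L) (an assumption of this note, not stated in the abstract), the additional factor of √2 reported for the colour cyclic code would give:

    ```latex
    % assumed standard cyclic (Simplex) coding gain for code length L
    G_{\mathrm{cyclic}} = \frac{L + 1}{2\sqrt{L}}
    % with the additional \sqrt{2} reported for the colour cyclic code
    G_{\mathrm{CCC}} = \sqrt{2}\,\frac{L + 1}{2\sqrt{L}}
    ```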

  6. HEPLIB '91: International users meeting on the support and environments of high energy physics computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnstad, H.

    The purpose of this meeting is to discuss the current and future HEP computing support and environments from the perspective of new horizons in accelerator, physics, and computing technologies. Topics of interest to the Meeting include (but are not limited to): the forming of the HEPLIB world user group for High Energy Physics computing; mandate, desirables, coordination, organization, funding; user experience, international collaboration; the roles of national labs, universities, and industry; range of software, Monte Carlo, mathematics, physics, interactive analysis, text processors, editors, graphics, database systems, code management tools; program libraries, frequency of updates, distribution; distributed and interactive computing, database systems, user interface, UNIX operating systems, networking, compilers, Xlib, X-Graphics; documentation, updates, availability, distribution; code management in large collaborations, keeping track of program versions; and quality assurance, testing, conventions, standards.

  8. Empirical analysis on the human dynamics of blogging behavior on GitHub

    NASA Astrophysics Data System (ADS)

    Yan, Deng-Cheng; Wei, Zong-Wen; Han, Xiao-Pu; Wang, Bing-Hong

    2017-01-01

    GitHub is a social collaborative coding platform on which software developers not only collaborate on code but also share knowledge through blogs using GitHub Pages. In this article, we analyze the blogging behavior of software developers on GitHub Pages. The results show that both the commit number and the inter-event time of two consecutive blogging actions follow heavy-tailed distributions. We further observe a significant variety of activity among individual developers, and a strongly positive correlation between the activity and the power-law exponent of the inter-event time distribution. We also find a difference between user behaviors on GitHub Pages and on other online systems, which is driven by the diversity of users and the length of contents. In addition, our result shows an obvious difference between the majority of developers and elite developers in their burstiness property.
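
    A common estimator for the exponent of such heavy-tailed inter-event time distributions is the continuous maximum-likelihood (Hill-type) estimator; a minimal sketch follows, with a hypothetical x_min and sample values, and without claiming this is the estimator the authors used:

    ```python
    import math

    def powerlaw_mle(samples, x_min):
        """Maximum-likelihood estimate of alpha for P(x) ~ x**(-alpha),
        x >= x_min (continuous approximation)."""
        tail = [x for x in samples if x >= x_min]
        n = len(tail)
        return 1.0 + n / sum(math.log(x / x_min) for x in tail)

    # hypothetical inter-event times between blogging actions, in hours
    inter_event_times = [0.5, 1.2, 2.0, 3.5, 7.0, 12.0, 30.0, 80.0, 200.0]
    print(powerlaw_mle(inter_event_times, x_min=0.5))
    ```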

  9. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, F.; Flach, G.; BROWN, K.

    2013-06-01

    This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing of standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) Improved graphical display of model results. 2) Improved error analysis and reporting. 3) Increase in the default maximum model mesh size from 301 to 501 nodes. 4) The ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  10. Two dimension MDW OCDMA code cross-correlation for reduction of phase induced intensity noise

    NASA Astrophysics Data System (ADS)

    Ahmed, Israa Sh.; Aljunid, Syed A.; Nordin, Junita M.; Dulaimi, Layth A. Khalil Al; Matem, Rima

    2017-11-01

    In this paper, we first review the 2-D MDW code cross-correlation equations and table, which are improved significantly by using code correlation properties. These codes can be used in synchronous optical CDMA systems for multiple-access interference cancellation and maximal suppression of the phase-induced intensity noise (PIIN). Low Psr is due to the reduction of interference noise induced by the 2-D MDW code PIIN suppression. A high data rate increases the BER, requires high effective power, and severely deteriorates system performance. The 2-D W/T MDW code has excellent system performance, where the value of PIIN is suppressed as low as possible at the optimum Psr with a high data bit rate. The 2-D MDW code shows better tolerance to PIIN in comparison to others, with enhanced system performance. We prove by numerical analysis that PIIN is maximally suppressed by the MDW code through the minimizing property of cross-correlation, in comparison to the 2-D PDC and 2-D MQC OCDMA code schemes.

  11. Comparison of ORSAT and SCARAB Reentry Analysis Tools for a Generic Satellite Test Case

    NASA Technical Reports Server (NTRS)

    Kelley, Robert L.; Hill, Nicole M.; Rochelle, W. C.; Johnson, Nicholas L.; Lips, T.

    2010-01-01

    Reentry analysis is essential to understanding the consequences of the full life cycle of a spacecraft. Since reentry is a key factor in spacecraft development, NASA and ESA have separately developed tools to assess the survivability of objects during reentry. Criteria such as debris casualty area and impact energy are particularly important to understanding the risks posed to people on Earth. Therefore, NASA and ESA have undertaken a series of comparison studies of their respective reentry codes for verification and improvements in accuracy. The NASA Object Reentry Survival Analysis Tool (ORSAT) and the ESA Spacecraft Atmospheric Reentry and Aerothermal Breakup (SCARAB) reentry analysis tools serve as standard codes for reentry survivability assessment of satellites. These programs predict whether an object will demise during reentry and calculate the debris casualty area of objects determined to survive, establishing the reentry risk posed to the Earth's population by surviving debris. A series of test cases have been studied for comparison; the most recent uses "Testsat," a conceptual satellite composed of generic parts, defined to use numerous simple shapes and various materials for a better comparison of the predictions of these two codes. This study improves on the others in the series through increased consistency in modeling techniques and variables. The overall comparison demonstrated that the two codes arrive at similar results: most objects modeled showed close agreement between the two codes, and where the difference was significant, the variance could be explained as a case of semantics in the model definitions. This paper presents the main results of ORSAT and SCARAB for the Testsat case and discusses the sources of any discovered differences. Discussion of the results of previous comparisons is made for a summary of differences between the codes and lessons learned from this series of tests.
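
    For reference, the debris casualty area mentioned as a key criterion is conventionally built from the surviving fragment areas A_i padded by an effective human cross-section (the 0.6 m figure below reflects the usual reentry-risk convention and is stated here from memory, not from this paper):

    ```latex
    % conventional total debris casualty area over surviving fragments i
    A_c \;=\; \sum_i \left( 0.6\,\mathrm{m} + \sqrt{A_i} \right)^{2}
    ```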

  12. Assessing collaborative computing: development of the Collaborative-Computing Observation Instrument (C-COI)

    NASA Astrophysics Data System (ADS)

    Israel, Maya; Wherfel, Quentin M.; Shehab, Saadeddine; Ramos, Evan A.; Metzger, Adam; Reese, George C.

    2016-07-01

    This paper describes the development, validation, and uses of the Collaborative Computing Observation Instrument (C-COI), a web-based analysis instrument that classifies individual and/or collaborative behaviors of students during computing problem-solving (e.g. coding, programming). The C-COI analyzes data gathered through video and audio screen recording software that captures students' computer screens as they program, and their conversations with their peers or adults. The instrument allows researchers to organize and quantify these data to track behavioral patterns that could be further analyzed for deeper understanding of persistence and/or collaborative interactions. The article provides a rationale for the C-COI including the development of a theoretical framework for measuring collaborative interactions in computer-mediated environments. This theoretical framework relied on the computer-supported collaborative learning literature related to adaptive help seeking, the joint problem-solving space in which collaborative computing occurs, and conversations related to outcomes and products of computational activities. Instrument development and validation also included ongoing advisory board feedback from experts in computer science, collaborative learning, and K-12 computing as well as classroom observations to test out the constructs in the C-COI. These processes resulted in an instrument with rigorous validation procedures and a high inter-rater reliability.

  13. Validating the BISON fuel performance code to integral LWR experiments

    DOE PAGES

    Williamson, R. L.; Gamble, K. A.; Perez, D. M.; ...

    2016-03-24

    BISON is a modern finite element-based nuclear fuel performance code that has been under development at the Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Our results demonstrate that 1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable, with deviations between predictions and experimental data within ±10% for early life through high-burnup fuel and only slightly out of these bounds for power ramp experiments, 2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties, and 3) comparison of rod diameter results indicates a tendency to overpredict clad diameter reduction early in life, when clad creepdown dominates, and to more significantly overpredict the diameter increase late in life, when fuel expansion controls the mechanical response. The initial rod diameter comparisons were unsatisfactory and have led to consideration of additional separate-effects experiments to better understand and predict clad and fuel mechanical behavior. Results from this study are being used to define priorities for ongoing code development and validation activities.

  14. Capturing Energy-Saving Opportunities: Improving Building Efficiency in Rajasthan through Energy Code Implementation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tan, Qing; Yu, Sha; Evans, Meredydd

    2016-05-01

    India adopted the Energy Conservation Building Code (ECBC) in 2007. Rajasthan is the first state to make ECBC mandatory at the state level. In collaboration with Malaviya National Institute of Technology (MNIT) Jaipur, Pacific Northwest National Laboratory (PNNL) has been working with Rajasthan to facilitate the implementation of ECBC. This report summarizes milestones achieved in Rajasthan and PNNL's contributions to institutional set-up, capacity building, compliance enforcement and pilot building construction.

  15. The Earth System Documentation (ES-DOC) Software Process

    NASA Astrophysics Data System (ADS)

    Greenslade, M. A.; Murphy, S.; Treshansky, A.; DeLuca, C.; Guilyardi, E.; Denvil, S.

    2013-12-01

    Earth System Documentation (ES-DOC) is an international project supplying high-quality tools & services in support of earth system documentation creation, analysis and dissemination. It is nurturing a sustainable standards-based documentation eco-system that aims to become an integral part of the next generation of exa-scale dataset archives. ES-DOC leverages open source software, and applies a software development methodology that places end-user narratives at the heart of all it does. ES-DOC has initially focused upon nurturing the Earth System Model (ESM) documentation eco-system and currently supports the following projects:

    * Coupled Model Inter-comparison Project Phase 5 (CMIP5);
    * Dynamical Core Model Inter-comparison Project (DCMIP);
    * National Climate Predictions and Projections Platforms Quantitative Evaluation of Downscaling Workshop.

    This talk will demonstrate that ES-DOC implements a relatively mature software development process. Taking a pragmatic Agile process as inspiration, ES-DOC:

    * Iteratively develops and releases working software;
    * Captures user requirements via a narrative based approach;
    * Uses online collaboration tools (e.g. Earth System CoG) to manage progress;
    * Prototypes applications to validate their feasibility;
    * Leverages meta-programming techniques where appropriate;
    * Automates testing whenever sensibly feasible;
    * Streamlines complex deployments to a single command;
    * Extensively leverages GitHub and Pivotal Tracker;
    * Enforces strict separation of the UI from underlying API's;
    * Conducts code reviews.

  16. A systematic review of collaboration and network research in the public affairs literature: implications for public health practice and research.

    PubMed

    Varda, Danielle; Shoup, Jo Ann; Miller, Sara

    2012-03-01

    We explored and analyzed how findings from public affairs research can inform public health research and practice, specifically in the area of interorganizational collaboration, one of the most promising practice-based approaches in the public health field. We conducted a systematic review of the public affairs literature by following a grounded theory approach. We coded 151 articles for demographics and empirical findings (n = 258). Three primary findings stand out in the public affairs literature: network structure affects governance, management strategies exist for administrators, and collaboration can be linked to outcomes. These findings are linked to priorities in public health practice. Overall, we found that public affairs has a long and rich history of research in collaborations that offers unique organizational theory and management tools to public health practitioners.

  17. Progress along developmental tracks for electronic health records implementation in the United States

    PubMed Central

    Hollar, David W

    2009-01-01

    The development and implementation of electronic health records (EHR) have occurred slowly in the United States. To date, these approaches have, for the most part, followed four developmental tracks: (a) Enhancement of immunization registries and linkage with other health records to produce Child Health Profiles (CHP), (b) Regional Health Information Organization (RHIO) demonstration projects to link together patient medical records, (c) Insurance company projects linked to ICD-9 codes and patient records for cost-benefit assessments, and (d) Consortia of EHR developers collaborating to model systems requirements and standards for data linkage. Until recently, these separate efforts have been conducted in the very silos that they had intended to eliminate, and there is still considerable debate concerning health professionals' access to, as well as commitment to using, EHR if these systems are provided. This paper will describe these four developmental tracks, patient rights and the legal environment for EHR, international comparisons, and future projections for EHR expansion across health networks in the United States. PMID:19291284

  18. Comparison of space radiation calculations for deterministic and Monte Carlo transport codes

    NASA Astrophysics Data System (ADS)

    Lin, Zi-Wei; Adams, James; Barghouty, Abdulnasser; Randeniya, Sharmalee; Tripathi, Ram; Watts, John; Yepes, Pablo

    For space radiation protection of astronauts or electronic equipment, it is necessary to develop and use accurate radiation transport codes. Radiation transport codes include deterministic codes, such as HZETRN from NASA and UPROP from the Naval Research Laboratory, and Monte Carlo codes such as FLUKA, the Geant4 toolkit, and HETC-HEDS. The deterministic and Monte Carlo codes complement each other in that deterministic codes are very fast while Monte Carlo codes are more elaborate. Therefore, it is important to investigate how well the results of deterministic codes compare with those of Monte Carlo transport codes and where they differ. In this study we evaluate these different codes in their space radiation applications by comparing their output results for the same given space radiation environments, shielding geometry, and material. Typical space radiation environments such as the 1977 solar minimum galactic cosmic ray environment are used as the well-defined input, and simple geometries made of aluminum, water, and/or polyethylene are used to represent the shielding material. We then compare various outputs of these codes, such as the dose-depth curves and the flux spectra of different fragments and other secondary particles. These comparisons enable us to learn more about the main differences between these space radiation transport codes. At the same time, they help us to learn the qualitative and quantitative features that these transport codes have in common.
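
    As a rough illustration of the kind of comparison described above, the sketch below (Python) computes the pointwise relative difference between a deterministic and a Monte Carlo dose-depth curve on a shared depth grid. The arrays and the attenuation scales are entirely hypothetical stand-ins for code output, not results from any of the codes named here.

        # Minimal sketch: quantify agreement between two codes' dose-depth curves.
        # Both curves below are invented placeholders for transport-code output.
        import numpy as np

        depth = np.linspace(0.0, 30.0, 31)          # shield depth, g/cm^2
        dose_deterministic = np.exp(-depth / 25.0)  # hypothetical deterministic output
        dose_monte_carlo = np.exp(-depth / 23.5)    # hypothetical Monte Carlo output

        # Pointwise relative difference highlights where the codes diverge.
        rel_diff = (dose_deterministic - dose_monte_carlo) / dose_monte_carlo
        print(f"max |relative difference|: {np.max(np.abs(rel_diff)):.2%}")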

  19. Numerical algorithm comparison for the accurate and efficient computation of high-incidence vortical flow

    NASA Technical Reports Server (NTRS)

    Chaderjian, Neal M.

    1991-01-01

    Computations from two Navier-Stokes codes, NSS and F3D, are presented for a tangent-ogive-cylinder body at high angle of attack. Features of this steady flow include a pair of primary vortices on the leeward side of the body as well as secondary vortices. The topological and physical plausibility of this vortical structure is discussed. The accuracy of these codes is assessed by comparison of the numerical solutions with experimental data. The effects of turbulence model, numerical dissipation, and grid refinement are presented. The overall efficiency of these codes is also assessed by examining their convergence rates, computational time per time step, and maximum allowable time step for time-accurate computations. Overall, the numerical results from both codes compared equally well with experimental data; however, the NSS code was found to be significantly more efficient than the F3D code.

  20. Toward a CFD nose-to-tail capability - Hypersonic unsteady Navier-Stokes code validation

    NASA Technical Reports Server (NTRS)

    Edwards, Thomas A.; Flores, Jolen

    1989-01-01

    Computational fluid dynamics (CFD) research for hypersonic flows presents new problems in code validation because of the added complexity of the physical models. This paper surveys code validation procedures applicable to hypersonic flow models that include real gas effects. The current status of hypersonic CFD flow analysis is assessed with the Compressible Navier-Stokes (CNS) code as a case study. The methods of code validation discussed go beyond comparison with experimental data to include comparisons with other codes and formulations, component analyses, and estimation of numerical errors. Current results indicate that predicting hypersonic flows of perfect gases and equilibrium air is well in hand. Pressure, shock location, and integrated quantities are relatively easy to predict accurately, while surface quantities such as heat transfer are more sensitive to the solution procedure. Modeling transition to turbulence needs refinement, though preliminary results are promising.

  1. WEC3: Wave Energy Converter Code Comparison Project: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien

    This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases: Phase I consists of code-to-code verification, and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.
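
    A minimal sketch of a Phase I-style code-to-code verification metric follows: a normalized RMS error between two codes' predictions of the same quantity. The nrmse helper, the heave time series, and every number are illustrative assumptions, not quantities from the WEC3 report.

        # Hedged sketch: compare two codes' heave time series via normalized RMSE.
        import numpy as np

        def nrmse(reference, candidate):
            """RMS error normalized by the reference signal's range."""
            err = np.sqrt(np.mean((candidate - reference) ** 2))
            return err / (reference.max() - reference.min())

        t = np.linspace(0.0, 60.0, 601)
        heave_code_a = 0.5 * np.sin(0.8 * t)                 # hypothetical code A output
        heave_code_b = 0.5 * np.sin(0.8 * t + 0.02) + 0.005  # hypothetical code B output
        print(f"NRMSE: {nrmse(heave_code_a, heave_code_b):.3%}")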

  2. Modeling of ion orbit loss and intrinsic toroidal rotation with the COGENT code

    NASA Astrophysics Data System (ADS)

    Dorf, M.; Dorr, M.; Cohen, R.; Rognlien, T.; Hittinger, J.

    2014-10-01

    We discuss recent advances in cross-separatrix neoclassical transport simulations with COGENT, a continuum gyro-kinetic code being developed by the Edge Simulation Laboratory (ESL) collaboration. The COGENT code models the axisymmetric transport properties of edge plasmas including the effects of nonlinear (Fokker-Planck) collisions and a self-consistent electrostatic potential. Our recent work has focused on studies of ion orbit loss and the associated toroidal rotation driven by this mechanism. The results of the COGENT simulations are discussed and analyzed for the parameters of the DIII-D experiment. Work performed for USDOE at LLNL under Contract DE-AC52-07NA27344.

  3. Convolutional coding combined with continuous phase modulation

    NASA Technical Reports Server (NTRS)

    Pizzi, S. V.; Wilson, S. G.

    1985-01-01

    Background theory and specific coding designs for combined coding/modulation schemes utilizing convolutional codes and continuous-phase modulation (CPM) are presented. In this paper the case of r = 1/2 coding onto a 4-ary CPM is emphasized, with short-constraint length codes presented for continuous-phase FSK, double-raised-cosine, and triple-raised-cosine modulation. Coding buys several decibels of coding gain over the Gaussian channel, with an attendant increase of bandwidth. Performance comparisons in the power-bandwidth tradeoff with other approaches are made.
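
    To make the rate-1/2 coding step concrete, here is a textbook convolutional encoder sketch in Python (generator polynomials 7 and 5 octal, constraint length 3). This is a generic illustrative encoder, not one of the specific short-constraint-length designs presented in the paper.

        # Illustrative rate-1/2 convolutional encoder, constraint length 3.
        def conv_encode(bits, g1=0b111, g2=0b101, k=3):
            state = 0
            out = []
            for b in bits:
                state = ((state << 1) | b) & ((1 << k) - 1)  # shift bit into register
                out.append(bin(state & g1).count("1") % 2)   # parity from generator 1
                out.append(bin(state & g2).count("1") % 2)   # parity from generator 2
            return out

        print(conv_encode([1, 0, 1, 1]))  # two coded bits per input bit (rate 1/2)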

  4. Multi-dimensional free-electron laser simulation codes : a comparison study.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biedron, S. G.; Chae, Y. C.; Dejus, R. J.

    A self-amplified spontaneous emission (SASE) free-electron laser (FEL) is under construction at the Advanced Photon Source (APS). Five FEL simulation codes were used in the design phase: GENESIS, GINGER, MEDUSA, RON, and TDA3D. Initial comparisons between each of these independent formulations show good agreement for the parameters of the APS SASE FEL.

  5. Multi-Dimensional Free-Electron Laser Simulation Codes: A Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nuhn, Heinz-Dieter

    A self-amplified spontaneous emission (SASE) free-electron laser (FEL) is under construction at the Advanced Photon Source (APS). Five FEL simulation codes were used in the design phase: GENESIS, GINGER, MEDUSA, RON, and TDA3D. Initial comparisons between each of these independent formulations show good agreement for the parameters of the APS SASE FEL.

  6. The APS SASE FEL : modeling and code comparison.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biedron, S. G.

    A self-amplified spontaneous emission (SASE) free-electron laser (FEL) is under construction at the Advanced Photon Source (APS). Five FEL simulation codes were used in the design phase: GENESIS, GINGER, MEDUSA, RON, and TDA3D. Initial comparisons between each of these independent formulations show good agreement for the parameters of the APS SASE FEL.

  7. Comparison of reversible methods for data compression

    NASA Astrophysics Data System (ADS)

    Heer, Volker K.; Reinfelder, Hans-Erich

    1990-07-01

    Widely differing methods for data compression described in the ACR-NEMA draft are used in medical imaging. In our contribution we briefly review the various methods and discuss their relevant advantages and disadvantages. In detail, we evaluate 1st-order DPCM, pyramid transformation, and S transformation. As coding algorithms, we compare both fixed and adaptive Huffman coding and Lempel-Ziv coding. Our comparison is performed on typical medical images from CT, MR, DSA, and DLR (Digital Luminescence Radiography). Apart from the achieved compression factors, we take into account the CPU time and main memory required both for compression and for decompression. For a realistic comparison we have implemented the mentioned algorithms in the C programming language on a MicroVAX II and a SPARCstation 1.
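
    The following sketch illustrates the 1st-order DPCM step evaluated above: on a smooth image row, differencing concentrates the symbol histogram near zero, so a subsequent entropy coder (e.g., Huffman) needs fewer bits per symbol. The eight-pixel row is a made-up example, not data from the study.

        # Sketch of 1st-order DPCM followed by an entropy estimate.
        import math
        from collections import Counter

        row = [100, 102, 104, 106, 108, 110, 112, 114]       # smooth, made-up pixels
        residuals = [row[0]] + [b - a for a, b in zip(row, row[1:])]  # keep first raw

        def entropy_bits(symbols):
            counts = Counter(symbols)
            n = len(symbols)
            return -sum(c / n * math.log2(c / n) for c in counts.values())

        print("residuals:", residuals)
        print(f"entropy: raw {entropy_bits(row):.2f} vs DPCM {entropy_bits(residuals):.2f} bits/symbol")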

  8. Comparison of Evolutionary (Genetic) Algorithm and Adjoint Methods for Multi-Objective Viscous Airfoil Optimizations

    NASA Technical Reports Server (NTRS)

    Pulliam, T. H.; Nemec, M.; Holst, T.; Zingg, D. W.; Kwak, Dochan (Technical Monitor)

    2002-01-01

    A comparison between an Evolutionary Algorithm (EA) and an Adjoint-Gradient (AG) method applied to a two-dimensional Navier-Stokes code for airfoil design is presented. Both approaches use a common function evaluation code, the steady-state explicit part of the code, ARC2D. The parameterization of the design space is a common B-spline approach for an airfoil surface, which, together with a common gridding approach, restricts the AG and EA to the same design space. Results are presented for a class of viscous transonic airfoils in which the optimization tradeoff between drag minimization as one objective and lift maximization as another produces the multi-objective design space. Comparisons are made for efficiency, accuracy, and design consistency.

  9. Radiant Energy Measurements from a Scaled Jet Engine Axisymmetric Exhaust Nozzle for a Baseline Code Validation Case

    NASA Technical Reports Server (NTRS)

    Baumeister, Joseph F.

    1994-01-01

    A non-flowing, electrically heated test rig was developed to verify computer codes that calculate radiant energy propagation from nozzle geometries that represent aircraft propulsion nozzle systems. Since there are a variety of analysis tools used to evaluate thermal radiation propagation from partially enclosed nozzle surfaces, an experimental benchmark test case was developed for code comparison. This paper briefly describes the nozzle test rig and the developed analytical nozzle geometry used to compare the experimental and predicted thermal radiation results. A major objective of this effort was to make available the experimental results and the analytical model in a format to facilitate conversion to existing computer code formats. For code validation purposes this nozzle geometry represents one validation case for one set of analysis conditions. Since each computer code has advantages and disadvantages based on scope, requirements, and desired accuracy, the usefulness of this single nozzle baseline validation case can be limited for some code comparisons.

  10. ICCE/ICCAI 2000 Full & Short Papers (Collaborative Learning).

    ERIC Educational Resources Information Center

    2000

    This document contains the full and short papers on collaborative learning from ICCE/ICCAI 2000 (International Conference on Computers in Education/International Conference on Computer-Assisted Instruction) covering the following topics: comparison of applying Internet to cooperative and traditional learning; a distributed backbone system for…

  11. Impact of dynamic rate coding aspects of mobile phone networks on forensic voice comparison.

    PubMed

    Alzqhoul, Esam A S; Nair, Balamurali B T; Guillemin, Bernard J

    2015-09-01

    Previous studies have shown that landline and mobile phone networks are different in their ways of handling the speech signal, and therefore in their impact on it. But the same is also true of the different networks within the mobile phone arena. There are two major mobile phone technologies currently in use today, namely the global system for mobile communications (GSM) and code division multiple access (CDMA), and these are fundamentally different in their design. For example, the quality of the coded speech in the GSM network is a function of channel quality, whereas in the CDMA network it is determined by channel capacity (i.e., the number of users sharing a cell site). This paper examines the impact on the speech signal of a key feature of these networks, namely dynamic rate coding, and its subsequent impact on the task of likelihood-ratio-based forensic voice comparison (FVC). Surprisingly, both FVC accuracy and precision are found to be better for both GSM- and CDMA-coded speech than for uncoded speech. Intuitively one expects FVC accuracy to increase with increasing coded speech quality. This trend is shown to occur for the CDMA network, but, surprisingly, not for the GSM network. Further, with respect to comparisons between these two networks, FVC accuracy for CDMA-coded speech is shown to be slightly better than for GSM-coded speech, particularly when the coded-speech quality is high, but in terms of FVC precision the two networks are shown to be very similar. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.
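
    As a hedged sketch of the likelihood-ratio framework underlying FVC, the snippet below scores a single hypothetical similarity value under same-speaker and different-speaker models. The Gaussian score models and every number are illustrative assumptions, not the authors' system.

        # Toy likelihood-ratio computation: LR = p(score | same) / p(score | different).
        from math import exp, pi, sqrt

        def gaussian_pdf(x, mean, std):
            return exp(-0.5 * ((x - mean) / std) ** 2) / (std * sqrt(2 * pi))

        score = 0.8  # hypothetical similarity score between two speech samples
        lr = gaussian_pdf(score, mean=1.0, std=0.5) / gaussian_pdf(score, mean=-1.0, std=0.8)
        print(f"likelihood ratio: {lr:.1f}")  # LR > 1 supports the same-speaker hypothesis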

  12. Supporting open collaboration in science through explicit and linked semantic description of processes

    USGS Publications Warehouse

    Gil, Yolanda; Michel, Felix; Ratnakar, Varun; Read, Jordan S.; Hauder, Matheus; Duffy, Christopher; Hanson, Paul C.; Dugan, Hilary

    2015-01-01

    The Web was originally developed to support collaboration in science. Although scientists benefit from many forms of collaboration on the Web (e.g., blogs, wikis, forums, code sharing, etc.), most collaborative projects are coordinated over email, phone calls, and in-person meetings. Our goal is to develop a collaborative infrastructure for scientists to work on complex science questions that require multi-disciplinary contributions to gather and analyze data, that cannot occur without significant coordination to synthesize findings, and that grow organically to accommodate new contributors as needed as the work evolves over time. Our approach is to develop an organic data science framework based on a task-centered organization of the collaboration, includes principles from social sciences for successful on-line communities, and exposes an open science process. Our approach is implemented as an extension of a semantic wiki platform, and captures formal representations of task decomposition structures, relations between tasks and users, and other properties of tasks, data, and other relevant science objects. All these entities are captured through the semantic wiki user interface, represented as semantic web objects, and exported as linked data.

  13. The creation of innovation through public-private collaboration.

    PubMed

    Esteve, Marc; Ysa, Tamyko; Longo, Francisco

    2012-09-01

    This article develops the notion of how different forms of public-private collaboration implemented by organizations affect the creation of innovation, through a case study: the Blood and Tissue Bank. Data were obtained through in-depth semi-structured interviews with the entire managerial team of the organization under analysis. We coded the interviews and implemented content analysis. These data were triangulated with an analysis of the organization's internal documents. This article contributes to the understanding of innovation management in public-private collaborations in the health professions by identifying different options an organization has for developing collaborative innovation between the public and private sectors: contracts, contractual public-private partnerships, and institutionalised public-private partnerships. We observed that the creation of innovation is directly related to the institutional arrangement chosen to develop each project. Thus, certain innovations are unfeasible without a high degree of maturity in the interorganizational collaboration. However, it is also noteworthy that as the intensity of the collaboration increases, so do costs, and control over the process decreases. Copyright © 2012 Sociedad Española de Cardiología. Published by Elsevier Espana. All rights reserved.

  14. Verification of Three Dimensional Triangular Prismatic Discrete Ordinates Transport Code ENSEMBLE-TRIZ by Comparison with Monte Carlo Code GMVP

    NASA Astrophysics Data System (ADS)

    Homma, Yuto; Moriwaki, Hiroyuki; Ohki, Shigeo; Ikeda, Kazumi

    2014-06-01

    This paper deals with the verification of the three-dimensional triangular prismatic discrete ordinates transport calculation code ENSEMBLE-TRIZ by comparison with the multi-group Monte Carlo calculation code GMVP in a large fast breeder reactor. The reactor is a 750 MWe sodium-cooled reactor. Nuclear characteristics are calculated at the beginning of cycle of the initial core and at the beginning and end of cycle of the equilibrium core. According to the calculations, the differences between the two methodologies are smaller than 0.0002 Δk in the multiplication factor, about 1% in the control rod reactivity, and 1% in the sodium void reactivity.

  15. Improving the coding and classification of ambulance data through the application of International Classification of Disease 10th revision.

    PubMed

    Cantwell, Kate; Morgans, Amee; Smith, Karen; Livingston, Michael; Dietze, Paul

    2014-02-01

    This paper aims to examine whether an adaptation of the International Classification of Disease (ICD) coding system can be applied retrospectively to final paramedic assessment data in an ambulance dataset with a view to developing more fine-grained, clinically relevant case definitions than are available through point-of-call data. Over 1.2 million case records were extracted from the Ambulance Victoria data warehouse. Data fields included dispatch code, cause (CN) and final primary assessment (FPA). Each FPA was converted to an ICD-10-AM code using word matching or best fit. ICD-10-AM codes were then converted into Major Diagnostic Categories (MDC). CN was aligned with the ICD-10-AM codes for external cause of morbidity and mortality. The most accurate results were obtained when ICD-10-AM codes were assigned using information from both FPA and CN. Comparison of cases coded as unconscious at point-of-call with the associated paramedic assessment highlighted the extra clinical detail obtained when paramedic assessment data are used. Ambulance paramedic assessment data can be aligned with ICD-10-AM and MDC with relative ease, allowing retrospective coding of large datasets. Coding of ambulance data using ICD-10-AM allows for comparison of not only ambulance service users but also with other population groups. WHAT IS KNOWN ABOUT THE TOPIC? There is no reliable and standard coding and categorising system for paramedic assessment data contained in ambulance service databases. WHAT DOES THIS PAPER ADD? This study demonstrates that ambulance paramedic assessment data can be aligned with ICD-10-AM and MDC with relative ease, allowing retrospective coding of large datasets. Representation of ambulance case types using ICD-10-AM-coded information obtained after paramedic assessment is more fine grained and clinically relevant than point-of-call data, which uses caller information before ambulance attendance. WHAT ARE THE IMPLICATIONS FOR PRACTITIONERS? This paper describes a model of coding using an internationally recognised standard coding and categorising system to support analysis of paramedic assessment. Ambulance data coded using ICD-10-AM allows for reliable reporting and comparison within the prehospital setting and across the healthcare industry.
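
    A highly simplified sketch of the word-matching step is given below: a keyword table maps free-text final primary assessment (FPA) strings to ICD-10-AM codes, with a fallback for manual/best-fit coding. The table entries are illustrative, not the study's actual mapping.

        # Illustrative keyword-based mapping of FPA text to ICD-10-AM codes.
        ICD10_KEYWORDS = {
            "chest pain": "R07.4",    # illustrative code assignments
            "unconscious": "R40.2",
            "fracture": "T14.2",
        }

        def map_fpa(fpa_text):
            text = fpa_text.lower()
            for keyword, code in ICD10_KEYWORDS.items():
                if keyword in text:
                    return code
            return None  # fall back to manual / best-fit coding

        print(map_fpa("Pt unconscious on arrival"))  # -> R40.2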

  16. PolyPole-1: An accurate numerical algorithm for intra-granular fission gas release

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pizzocri, D.; Rabiti, C.; Luzzi, L.

    2016-09-01

    This paper describes the development of a new numerical algorithm (called PolyPole-1) to efficiently solve the equation for intra-granular fission gas release in nuclear fuel. The work was carried out in collaboration with Politecnico di Milano and the Institute for Transuranium Elements. The PolyPole-1 algorithm is being implemented in INL's fuel performance code BISON as part of BISON's fission gas release model. The transport of fission gas from within the fuel grains to the grain boundaries (intra-granular fission gas release) is a fundamental controlling mechanism of fission gas release and gaseous swelling in nuclear fuel. Hence, accurate numerical solution of the corresponding mathematical problem needs to be included in fission gas behaviour models used in fuel performance codes. Under the assumption of equilibrium between trapping and resolution, the process can be described mathematically by a single diffusion equation for the gas atom concentration in a grain. In this work, we propose a new numerical algorithm (PolyPole-1) to efficiently solve the fission gas diffusion equation in time-varying conditions. The PolyPole-1 algorithm is based on the analytic modal solution of the diffusion equation for constant conditions, with the addition of polynomial corrective terms that embody the information on the deviation from constant conditions. The new algorithm is verified by comparing the results to a finite difference solution over a large number of randomly generated operation histories. Furthermore, comparison to state-of-the-art algorithms used in fuel performance codes demonstrates that the accuracy of the PolyPole-1 solution is superior to other algorithms, with similar computational effort. Finally, the concept of PolyPole-1 may be extended to the solution of the general problem of intra-granular fission gas diffusion during non-equilibrium trapping and resolution, which will be the subject of future work.
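
    For orientation, the constant-conditions modal solution that PolyPole-1 builds on can be sketched as the classical series for fractional diffusional release from a spherical grain. The diffusion coefficient and grain radius below are illustrative values, not parameters from the paper.

        # Modal (series) solution for fractional gas release from a spherical grain:
        # f(t) = 1 - (6/pi^2) * sum_n exp(-n^2 pi^2 D t / a^2) / n^2
        import math

        def fractional_release(D, a, t, n_modes=200):
            tau = D * t / a**2
            s = sum(math.exp(-(n * math.pi) ** 2 * tau) / n**2
                    for n in range(1, n_modes + 1))
            return 1.0 - (6.0 / math.pi**2) * s

        D = 1e-19   # gas diffusion coefficient, m^2/s (illustrative)
        a = 5e-6    # grain radius, m (illustrative)
        t = 3.15e7  # one year, s
        print(f"fractional release: {fractional_release(D, a, t):.3%}")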

  17. Coded Cooperation for Multiway Relaying in Wireless Sensor Networks †

    PubMed Central

    Si, Zhongwei; Ma, Junyang; Thobaben, Ragnar

    2015-01-01

    Wireless sensor networks have been considered as an enabling technology for constructing smart cities. One important feature of wireless sensor networks is that the sensor nodes collaborate in some manner for communications. In this manuscript, we focus on the model of multiway relaying with full data exchange where each user wants to transmit and receive data to and from all other users in the network. We derive the capacity region for this specific model and propose a coding strategy through coset encoding. To obtain good performance with practical codes, we choose spatially-coupled LDPC (SC-LDPC) codes for the coded cooperation. In particular, for the message broadcasting from the relay, we construct multi-edge-type (MET) SC-LDPC codes by repeatedly applying coset encoding. Due to the capacity-achieving property of the SC-LDPC codes, we prove that the capacity region can theoretically be achieved by the proposed MET SC-LDPC codes. Numerical results with finite node degrees are provided, which show that the achievable rates approach the boundary of the capacity region in both binary erasure channels and additive white Gaussian channels. PMID:26131675

  18. Coded Cooperation for Multiway Relaying in Wireless Sensor Networks.

    PubMed

    Si, Zhongwei; Ma, Junyang; Thobaben, Ragnar

    2015-06-29

    Wireless sensor networks have been considered as an enabling technology for constructing smart cities. One important feature of wireless sensor networks is that the sensor nodes collaborate in some manner for communications. In this manuscript, we focus on the model of multiway relaying with full data exchange where each user wants to transmit and receive data to and from all other users in the network. We derive the capacity region for this specific model and propose a coding strategy through coset encoding. To obtain good performance with practical codes, we choose spatially-coupled LDPC (SC-LDPC) codes for the coded cooperation. In particular, for the message broadcasting from the relay, we construct multi-edge-type (MET) SC-LDPC codes by repeatedly applying coset encoding. Due to the capacity-achieving property of the SC-LDPC codes, we prove that the capacity region can theoretically be achieved by the proposed MET SC-LDPC codes. Numerical results with finite node degrees are provided, which show that the achievable rates approach the boundary of the capacity region in both binary erasure channels and additive white Gaussian channels.

  19. A verification of the gyrokinetic microstability codes GEM, GYRO, and GS2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, R. V.; Chen, Y.; Wan, W.

    2013-10-15

    A previous publication [R. V. Bravenec et al., Phys. Plasmas 18, 122505 (2011)] presented favorable comparisons of linear frequencies and nonlinear fluxes from the Eulerian gyrokinetic codes GYRO [J. Candy and R. E. Waltz, J. Comput. Phys. 186, 545 (2003)] and GS2 [W. Dorland et al., Phys. Rev. Lett. 85, 5579 (2000)]. The motivation was to verify the codes, i.e., demonstrate that they correctly solve the gyrokinetic-Maxwell equations. The premise was that it is highly unlikely for both codes to yield the same incorrect results. In this work, we add the Lagrangian particle-in-cell code GEM [Y. Chen and S. Parker, J. Comput. Phys. 220, 839 (2007)] to the comparisons, not simply to add another code, but also to demonstrate that the codes' algorithms do not matter. We find good agreement of GEM with GYRO and GS2 for the plasma conditions considered earlier, thus establishing confidence that the codes are verified and that ongoing validation efforts for these plasma parameters are warranted.

  20. Multidisciplinary Analysis and Optimal Design: As Easy as it Sounds?

    NASA Technical Reports Server (NTRS)

    Moore, Greg; Chainyk, Mike; Schiermeier, John

    2004-01-01

    The viewgraph presentation examines optimal design for precision, large aperture structures. Discussion focuses on aspects of design optimization, code architecture and current capabilities, and planned activities and collaborative area suggestions. The discussion of design optimization examines design sensitivity analysis; practical considerations; and new analytical environments including finite element-based capability for high-fidelity multidisciplinary analysis, design sensitivity, and optimization. The discussion of code architecture and current capabilities includes basic thermal and structural elements, nonlinear heat transfer solutions and process, and optical modes generation.

  1. Results of SEI Independent Research and Development Projects and Report on Emerging Technologies and Technology Trends

    DTIC Science & Technology

    2004-10-01

    Top-Level Process for Identification and Analysis of Safety-Related Requirements 4.4 Collaborators The primary SEI team members were Don Firesmith...Graff, M. & van Wyk, K. Secure Coding Principles & Practices. O'Reilly, 2003. • Hoglund, G. & McGraw, G. Exploiting Software: How to Break Code. Addison...Eisenecker, U.; Glück, R.; Vandevoorde, D.; & Veldhuizen, T. "Generative Programming and Active Libraries (Extended Abstract)" <osl.iu.edu/~tveldhui/papers

  2. An implementation of a security infrastructure compliant with the Italian Personal Data Protection Code in a web-based cooperative work system.

    PubMed

    Eccher, Claudio; Eccher, Lorenzo; Izzo, Umberto

    2005-01-01

    In this poster we describe the security solutions implemented in a web-based cooperative work framework for managing heart failure patients among the different health care professionals involved in the care process. The solution, developed in close collaboration with the Law Department of the University of Trento, is compliant with the new Italian Personal Data Protection Code, issued in 2003, which also regulates the storing and processing of health data.

  3. Profugus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Thomas; Hamilton, Steven; Slattery, Stuart

    Profugus is an open-source mini-application (mini-app) for radiation transport and reactor applications. It contains the fundamental computational kernels used in the Exnihilo code suite from Oak Ridge National Laboratory. However, Exnihilo is a production code with a substantial user base. Furthermore, Exnihilo is export controlled. This makes collaboration with computer scientists and computer engineers difficult. Profugus is designed to bridge that gap. By encapsulating the core numerical algorithms in an abbreviated code base that is open-source, computer scientists can analyze the algorithms and easily make code-architectural changes to test performance without compromising the production-code values of Exnihilo. Profugus is not meant to be production software with respect to problem analysis. The computational kernels in Profugus are designed to analyze performance, not correctness. Nonetheless, users of Profugus can set up and run problems with enough real-world features to be useful as proof-of-concept for actual production work.

  4. Error control techniques for satellite and space communications

    NASA Technical Reports Server (NTRS)

    Costello, Daniel J., Jr.

    1989-01-01

    The performance of bandwidth-efficient trellis codes on channels with phase jitter, or those disturbed by jamming and impulse noise, is analyzed. A heuristic algorithm for the construction of bandwidth-efficient trellis codes with any constraint length up to about 30, any signal constellation, and any code rate was developed. Construction of good distance-profile trellis codes for sequential decoding and comparison of random coding bounds of trellis-coded modulation schemes are also discussed.

  5. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  6. Overview of Edge Simulation Laboratory (ESL)

    NASA Astrophysics Data System (ADS)

    Cohen, R. H.; Dorr, M.; Hittinger, J.; Rognlien, T.; Umansky, M.; Xiong, A.; Xu, X.; Belli, E.; Candy, J.; Snyder, P.; Colella, P.; Martin, D.; Sternberg, T.; van Straalen, B.; Bodi, K.; Krasheninnikov, S.

    2006-10-01

    The ESL is a new collaboration to build a full-f electromagnetic gyrokinetic code for tokamak edge plasmas using continuum methods. Target applications are edge turbulence and transport (neoclassical and anomalous), and edge-localized modes. Initially the project has three major threads: (i) verification and validation of TEMPEST, the project's initial (electrostatic) edge code, which can be run in 4D (neoclassical and transport-timescale applications) or 5D (turbulence); (ii) design of the next-generation code, which will include more complete physics (electromagnetics, fluid equation option, improved collisions) and advanced numerics (fully conservative, high-order discretization, mapped multiblock grids, adaptivity); and (iii) rapid-prototype codes to explore the issues attached to solving fully nonlinear gyrokinetics with steep radial gradients. We present a brief summary of the status of each of these activities.

  7. An exploration of nurse-physician perceptions of collaborative behaviour.

    PubMed

    Collette, Alice E; Wann, Kristen; Nevin, Meredith L; Rique, Karen; Tarrant, Grant; Hickey, Lorraine A; Stichler, Jaynelle F; Toole, Belinda M; Thomason, Tanna

    2017-07-01

    Interprofessional collaboration is a key element in providing safe, holistic patient care in the acute care setting. Trended data at a community hospital indicated opportunities for improvement in collaboration on micro, meso, and macro levels. The aim of this survey study was to assess the current state of collaboration between frontline nurses and physicians at a non-academic acute care hospital. A convenience sample of participants was recruited with a final respondent sample of 355 nurses and 82 physicians. The results indicated that physicians generally perceived greater collaboration than nurses. Physician ratings did not vary by primary practice area, whereas nurse ratings varied by clinical practice area. Nurse ratings were the lowest in the operating room and the highest in the emergency department. Text-based responses to an open-ended question were analysed by role and coded by two independent research teams. Emergent themes emphasised the importance of rounding, roles, respect, and communication. Despite recognition of the need for improved collaboration and relational behaviours, strategies to improve collaborative practice must be fostered at the meso level by organisational leaders and customised to address micro-level values. At the study site, findings have been used to address and improve collaboration towards the goal of becoming a high reliability organisation.

  8. International comparison of sudden unexpected death in infancy rates using a newly proposed set of cause-of-death codes.

    PubMed

    Taylor, Barry J; Garstang, Joanna; Engelberts, Adele; Obonai, Toshimasa; Cote, Aurore; Freemantle, Jane; Vennemann, Mechtild; Healey, Matt; Sidebotham, Peter; Mitchell, Edwin A; Moon, Rachel Y

    2015-11-01

    Comparing rates of sudden unexpected death in infancy (SUDI) in different countries and over time is difficult, as these deaths are certified differently in different countries, and, even within the same jurisdiction, changes in this death certification process have occurred over time. We aimed to identify whether International Classification of Diseases-10 (ICD-10) codes are being applied differently in different countries, and to develop a more robust tool for international comparison of these types of deaths. Usage of six ICD-10 codes, which code for the majority of SUDI, was compared for the years 2002-2010 in eight high-income countries. There was great variability in how each country codes SUDI. For example, the proportion of SUDI coded as sudden infant death syndrome (R95) ranged from 32.6% in Japan to 72.5% in Germany. The proportion of deaths coded as accidental suffocation and strangulation in bed (W75) ranged from 1.1% in Germany to 31.7% in New Zealand. Japan was the only country to consistently use the R96 code, with 44.8% of SUDI attributed to that code. The lowest overall SUDI rate was seen in the Netherlands (0.19/1000 live births (LB)), and the highest in New Zealand (1.00/1000 LB). SUDI accounted for one-third to half of postneonatal mortality in 2002-2010 for all of the countries except the Netherlands. The proposed set of ICD-10 codes encompasses the codes used in different countries for most SUDI cases. Use of these codes will allow for better international comparisons and tracking of trends over time. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
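
    The rates quoted above are simple ratios; the short sketch below shows the arithmetic (SUDI deaths per 1,000 live births) with invented counts.

        # SUDI rate per 1,000 live births; the counts are made up for illustration.
        def sudi_rate(deaths, live_births):
            return deaths / live_births * 1000

        print(f"{sudi_rate(58, 58000):.2f} per 1,000 live births")  # -> 1.00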

  9. National Centers for Environmental Prediction

    Science.gov Websites

    NCEP Environmental Modeling Center (EMC) page for the NAM model, listing operational products, forecast graphics, verification and diagnostics, model configuration, collaborators, documentation and code, change logs, and contacts.

  10. A Large Scale Code Resolution Service Network in the Internet of Things

    PubMed Central

    Yu, Haining; Zhang, Hongli; Fang, Binxing; Yu, Xiangzhan

    2012-01-01

    In the Internet of Things a code resolution service provides a discovery mechanism for a requester to obtain the information resources associated with a particular product code immediately. In large scale application scenarios a code resolution service faces some serious issues involving heterogeneity, big data and data ownership. A code resolution service network is required to address these issues. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services and a SkipNet-based code resolution service named SkipNet-OCRS, which not only inherits DHT's advantages, but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. It is analyzed that integrating SkipNet-OCRS into our resolution service network can meet our proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS. PMID:23202207
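
    A toy sketch of the code-resolution idea follows: hash a product code onto a ring of services to find the one responsible for its records. The service names, the SGTIN-style code, and the consistent-hashing routing are illustrative stand-ins, not SkipNet-OCRS's actual algorithm.

        # Toy code resolution: route a product code to the responsible service.
        import bisect
        import hashlib

        SERVICES = ["svc-a", "svc-b", "svc-c"]  # hypothetical resolution services
        ring = sorted((int(hashlib.sha1(s.encode()).hexdigest(), 16), s)
                      for s in SERVICES)

        def resolve(product_code):
            h = int(hashlib.sha1(product_code.encode()).hexdigest(), 16)
            keys = [k for k, _ in ring]
            idx = bisect.bisect(keys, h) % len(ring)  # next service clockwise
            return ring[idx][1]

        print(resolve("urn:epc:id:sgtin:0614141.107346.2017"))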

  11. A large scale code resolution service network in the Internet of Things.

    PubMed

    Yu, Haining; Zhang, Hongli; Fang, Binxing; Yu, Xiangzhan

    2012-11-07

    In the Internet of Things a code resolution service provides a discovery mechanism for a requester to obtain the information resources associated with a particular product code immediately. In large scale application scenarios a code resolution service faces some serious issues involving heterogeneity, big data and data ownership. A code resolution service network is required to address these issues. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services and a SkipNet-based code resolution service named SkipNet-OCRS, which not only inherits DHT's advantages, but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. It is analyzed that integrating SkipNet-OCRS into our resolution service network can meet our proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS.

  12. Improving the accuracy of operation coding in surgical discharge summaries

    PubMed Central

    Martinou, Eirini; Shouls, Genevieve; Betambeau, Nadine

    2014-01-01

    Procedural coding in surgical discharge summaries is extremely important; as well as communicating to healthcare staff which procedures have been performed, it also provides information that is used by the hospital's coding department. The OPCS code (Office of Population, Censuses and Surveys Classification of Surgical Operations and Procedures) is used to generate the tariff that allows the hospital to be reimbursed for the procedure. We felt that the OPCS coding on discharge summaries was often incorrect within our breast and endocrine surgery department. A baseline measurement over two months demonstrated that 32% of operations had been incorrectly coded, resulting in an incorrect tariff being applied and an estimated loss to the Trust of £17,000. We developed a simple but specific OPCS coding table in collaboration with the clinical coding team and breast surgeons that summarised all operations performed within our department. This table was disseminated across the team, specifically to the junior doctors who most frequently complete the discharge summaries. Re-audit showed 100% of operations were accurately coded, demonstrating the effectiveness of the coding table. We suggest that specifically designed coding tables be introduced across each surgical department to ensure accurate OPCS codes are used to produce better quality surgical discharge summaries and to ensure correct reimbursement to the Trust. PMID:26734286

  13. GPS receiver CODE bias estimation: A comparison of two methods

    NASA Astrophysics Data System (ADS)

    McCaffrey, Anthony M.; Jayachandran, P. T.; Themens, D. R.; Langley, R. B.

    2017-04-01

    The Global Positioning System (GPS) is a valuable tool in the measurement and monitoring of ionospheric total electron content (TEC). To obtain accurate GPS-derived TEC, satellite and receiver hardware biases, known as differential code biases (DCBs), must be estimated and removed. The Center for Orbit Determination in Europe (CODE) provides monthly averages of receiver DCBs for a significant number of stations in the International Global Navigation Satellite Systems Service (IGS) network. A comparison of the monthly receiver DCBs provided by CODE with DCBs estimated using the minimization of standard deviations (MSD) method on both daily and monthly time intervals is presented. Calibrated TEC obtained using CODE-derived DCBs is accurate to within 0.74 TEC units (TECU) in differenced slant TEC (sTEC), while calibrated sTEC using MSD-derived DCBs results in an accuracy of 1.48 TECU.
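
    A hedged sketch of the MSD idea follows: scan candidate receiver biases and keep the one that, once removed, minimizes the spread of implied vertical TEC across simultaneously observed satellites. The synthetic data, mapping factors, and bias value are invented for illustration, not the paper's implementation.

        # Minimization-of-standard-deviations sketch on synthetic slant-TEC data.
        import numpy as np

        rng = np.random.default_rng(0)
        true_dcb = 4.2                                   # TECU, hypothetical receiver bias
        epochs, sats = 24, 8
        vtec = 10.0 + rng.normal(0.0, 0.2, (epochs, 1))  # shared ionosphere per epoch
        mf = rng.uniform(1.0, 3.0, (epochs, sats))       # slant-to-vertical mapping factors
        stec = vtec * mf + true_dcb                      # biased slant-TEC observations

        candidates = np.linspace(0.0, 8.0, 801)
        # Removing the correct bias makes the implied vertical TEC agree across
        # satellites, so its standard deviation over satellites is minimized.
        spreads = [np.std((stec - c) / mf, axis=1).mean() for c in candidates]
        print(f"estimated DCB: {candidates[np.argmin(spreads)]:.2f} TECU (true {true_dcb})")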

  14. Design of Scalable and Effective Earth Science Collaboration Tool

    NASA Astrophysics Data System (ADS)

    Maskey, M.; Ramachandran, R.; Kuo, K. S.; Lynnes, C.; Niamsuwan, N.; Chidambaram, C.

    2014-12-01

    Collaborative research is growing rapidly. Many tools, including IDEs, are now beginning to incorporate new collaborative features. Software engineering research has shown the effectiveness of collaborative programming and analysis. In particular, drastic reductions in software development time, resulting in reduced cost, have been highlighted. Recently, we have witnessed the rise of applications that allow users to share their content. Most of these applications scale such collaboration using cloud technologies. Earth science research needs to adopt collaboration technologies to reduce redundancy, cut cost, expand its knowledge base, and scale research experiments. To address these needs, we developed the Earth science collaboration workbench (CWB). CWB provides researchers with various collaboration features by augmenting their existing analysis tools to minimize the learning curve. During the development of the CWB, we came to understand that Earth science collaboration tasks are varied, and we concluded that it is not possible to design a tool that serves all collaboration purposes. We adopted a mix of synchronous and asynchronous sharing methods that can be used to perform collaboration across time and location dimensions. We have used cloud technology to scale the collaboration. The cloud has been a highly utilized and valuable tool for Earth science researchers: among other uses, it is used for sharing research results, Earth science data, and virtual machine images, allowing CWB to create and maintain research environments and networks to enhance collaboration between researchers. Furthermore, the collaborative versioning tool Git is integrated into CWB for versioning of science artifacts. In this paper, we present our experience in designing and implementing the CWB. We also discuss the integration of collaborative code development use cases for data search and discovery using NASA DAAC and simulation of satellite observations using the NASA Earth Observing System Simulation Suite (NEOS3).

  15. Comparing the coding of complications in Queensland and Victorian admitted patient data.

    PubMed

    Michel, Jude L; Cheng, Diana; Jackson, Terri J

    2011-08-01

    To examine differences between Queensland and Victorian coding of hospital-acquired conditions and suggest ways to improve the usefulness of these data in the monitoring of patient safety events. Secondary analysis of admitted patient episode data collected in Queensland and Victoria. Comparison of depth of coding, and patterns in the coding of ten commonly coded complications of five elective procedures. Comparison of the mean complication codes assigned per episode revealed Victoria assigns more valid codes than Queensland for all procedures, with the difference between the states being significantly different in all cases. The proportion of the codes flagged as complications was consistently lower for Queensland when comparing 10 common complications for each of the five selected elective procedures. The estimated complication rates for the five procedures showed Victoria to have an apparently higher complication rate than Queensland for 35 of the 50 complications examined. Our findings demonstrate that the coding of complications is more comprehensive in Victoria than in Queensland. It is known that inconsistencies exist between states in routine hospital data quality. Comparative use of patient safety indicators should be viewed with caution until standards are improved across Australia. More exploration of data quality issues is needed to identify areas for improvement.

  16. A Systematic Review of Collaboration and Network Research in the Public Affairs Literature: Implications for Public Health Practice and Research

    PubMed Central

    Shoup, Jo Ann; Miller, Sara

    2012-01-01

    Objectives. We explored and analyzed how findings from public affairs research can inform public health research and practice, specifically in the area of interorganizational collaboration, one of the most promising practice-based approaches in the public health field. Methods. We conducted a systematic review of the public affairs literature by following a grounded theory approach. We coded 151 articles for demographics and empirical findings (n = 258). Results. Three primary findings stand out in the public affairs literature: network structure affects governance, management strategies exist for administrators, and collaboration can be linked to outcomes. These findings are linked to priorities in public health practice. Conclusions. Overall, we found that public affairs has a long and rich history of research in collaborations that offers unique organizational theory and management tools to public health practitioners. PMID:22021311

  17. Building a standards-based and collaborative e-prescribing tool: MyRxPad.

    PubMed

    Nelson, Stuart J; Zeng, Kelly; Kilbourne, John

    2011-01-01

    MyRxPad (rxp.nlm.nih.gov) is a prototype application intended to enable a practitioner-patient collaborative approach to e-prescribing: patients play an active role by maintaining up-to-date and accurate medication lists, and prescribers make well-informed and safe prescribing decisions based on the personal medication records contributed by patients. MyRxPad is thus the vehicle for collaboration with patients using MyMedicationList (MML), enabling integration with personal medication records in the context of e-prescribing. We present our experience in applying RxNorm in an e-prescribing setting: using standard names and codes to capture prescribed medications, as well as extracting information from RxNorm to support medication-related clinical decisions.

  18. A Comparison between Collaborative and Authoritative Leadership Styles of Special Education Administrators

    ERIC Educational Resources Information Center

    Veale, Natasha W.

    2010-01-01

    Supervisors, administrators, and directors of special education usually use the authoritative leadership style when supervising their special education staffs; however, collaborative leadership styles are slowly overtaking authoritative leadership styles. These leaders have the task of producing an environment where the culture is inclusive, the…

  19. Collaborative Strategic Reading for Students with Learning Disabilities in Upper Elementary Classrooms

    ERIC Educational Resources Information Center

    Boardman, Alison G.; Vaughn, Sharon; Buckley, Pamela; Reutebuch, Colleen; Roberts, Greg; Klingner, Janette

    2016-01-01

    Sixty fourth- and fifth-grade general education teachers were randomly assigned to teach Collaborative Strategic Reading (CSR; Klingner, Vaughn, Boardman, & Swanson, 2012), a set of reading comprehension strategies, or to a business-as-usual comparison group. Results demonstrate that students with learning disabilities (LD) who received CSR…

  20. Top 20 Collaborative Internet-Based Science Projects of 1998: Characteristics and Comparisons to Exemplary Science Instruction.

    ERIC Educational Resources Information Center

    Berg, Craig A.; Jefson, Cristy

    This paper utilizes the characteristics of model science instruction to identify exemplary Internet-based science collaborations. The filter for attaining "exemplary" status was based on state and national standards-generating initiatives and the corresponding implications for appropriate student activity in science classrooms. Twenty…

  1. Light transport feature for SCINFUL.

    PubMed

    Etaati, G R; Ghal-Eh, N

    2008-03-01

    An extended version of the scintillator response function prediction code SCINFUL has been developed by incorporating PHOTRACK, a Monte Carlo light transport code. Comparisons of calculated and experimental results for organic scintillators exposed to neutrons show that the extended code improves the predictive capability of SCINFUL.
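
    In the spirit of the light-transport coupling described above, here is a toy Monte Carlo sketch: optical photons emitted along a scintillator bar either reach the readout end or are lost to bulk attenuation. The geometry, the mirrored far end, and the attenuation length are invented for illustration, not PHOTRACK's physics.

        # Toy Monte Carlo light-collection estimate for a one-dimensional bar.
        import math
        import random

        def photon_detected(length=10.0, atten=50.0, rng=random):
            x = rng.uniform(0.0, length)      # emission point along the bar, cm
            going_right = rng.random() < 0.5  # readout assumed at the right end
            path = (length - x) if going_right else (length + x)  # mirror at left end
            return rng.random() < math.exp(-path / atten)         # survives attenuation?

        n = 100_000
        hits = sum(photon_detected() for _ in range(n))
        print(f"collection efficiency ~ {hits / n:.3f}")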

  2. Metadata Management on the SCEC PetaSHA Project: Helping Users Describe, Discover, Understand, and Use Simulation Data in a Large-Scale Scientific Collaboration

    NASA Astrophysics Data System (ADS)

    Okaya, D.; Deelman, E.; Maechling, P.; Wong-Barnum, M.; Jordan, T. H.; Meyers, D.

    2007-12-01

    Large scientific collaborations, such as the SCEC Petascale Cyberfacility for Physics-based Seismic Hazard Analysis (PetaSHA) Project, involve interactions between many scientists who exchange ideas and research results. These groups must organize, manage, and make accessible their community materials of observational data, derivative (research) results, computational products, and community software. The integration of scientific workflows as a paradigm to solve complex computations provides advantages of efficiency, reliability, repeatability, choices, and ease of use. The underlying resource needed for a scientific workflow to function and create discoverable and exchangeable products is the construction, tracking, and preservation of metadata. In the scientific workflow environment there is a two-tier structure of metadata. Workflow-level metadata and provenance describe operational steps, identity of resources, execution status, and product locations and names. Domain-level metadata essentially define the scientific meaning of data, codes and products. To a large degree the metadata at these two levels are separate. However, between these two levels is a subset of metadata produced at one level but is needed by the other. This crossover metadata suggests that some commonality in metadata handling is needed. SCEC researchers are collaborating with computer scientists at SDSC, the USC Information Sciences Institute, and Carnegie Mellon Univ. in order to perform earthquake science using high-performance computational resources. A primary objective of the "PetaSHA" collaboration is to perform physics-based estimations of strong ground motion associated with real and hypothetical earthquakes located within Southern California. Construction of 3D earth models, earthquake representations, and numerical simulation of seismic waves are key components of these estimations. Scientific workflows are used to orchestrate the sequences of scientific tasks and to access distributed computational facilities such as the NSF TeraGrid. Different types of metadata are produced and captured within the scientific workflows. One workflow within PetaSHA ("Earthworks") performs a linear sequence of tasks with workflow and seismological metadata preserved. Downstream scientific codes ingest these metadata produced by upstream codes. The seismological metadata uses attribute-value pairing in plain text; an identified need is to use more advanced handling methods. Another workflow system within PetaSHA ("Cybershake") involves several complex workflows in order to perform statistical analysis of ground shaking due to thousands of hypothetical but plausible earthquakes. Metadata management has been challenging due to its construction around a number of legacy scientific codes. We describe difficulties arising in the scientific workflow due to the lack of this metadata and suggest corrective steps, which in some cases include the cultural shift of domain science programmers coding for metadata.
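
    As a minimal illustration of the attribute-value pairing mentioned above, the snippet below parses "key = value" lines emitted by an upstream code into a dictionary a downstream code can ingest; the field names and values are hypothetical.

        # Parse plain-text attribute-value metadata into a dictionary.
        metadata_text = """\
        model = anza_cvm4
        source = hypothetical_m7.1
        dt = 0.01
        """

        metadata = {}
        for line in metadata_text.splitlines():
            key, sep, value = line.partition("=")
            if sep:  # skip lines without an attribute-value separator
                metadata[key.strip()] = value.strip()

        print(metadata["dt"])  # -> '0.01'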

  3. Importance of the nature of comparison conditions for testing theory-based interventions: comment on Michie and Prestwich (2010).

    PubMed

    Williams, David M

    2010-09-01

    Comments on the original article 'Are interventions theory-based? Development of a theory coding scheme' by Susan Michie and Andrew Prestwich (see record 2010-00152-001). In their admirable effort to develop a coding scheme for the theoretical contribution of intervention research, Michie and Prestwich rightly point out the importance of the presence of a comparison condition when examining the effect of an intervention on targeted theoretical variables and behavioral outcomes (Table 2, item 15). However, they fail to discuss the critical importance of the nature of the comparison condition. Weaker comparison conditions will yield stronger intervention effects; stronger comparison conditions will yield a stronger science of behavior change. (c) 2010 APA, all rights reserved.

  4. Application of thin-layer Navier-Stokes equations near maximum lift

    NASA Technical Reports Server (NTRS)

    Anderson, W. K.; Thomas, J. L.; Rumsey, C. L.

    1984-01-01

    The flowfield about a NACA 0012 airfoil at a Mach number of 0.3 and Reynolds number of 1 million is computed through an angle of attack range, up to 18 deg, corresponding to conditions up to and beyond the maximum lift coefficient. Results obtained using the compressible thin-layer Navier-Stokes equations are presented as well as results from the compressible Euler equations with and without a viscous coupling procedure. The applicability of each code is assessed and many thin-layer Navier-Stokes benchmark solutions are obtained which can be used for comparison with other codes intended for use at high angles of attack. Reasonable agreement of the Navier-Stokes code with experiment and the viscous-inviscid interaction code is obtained at moderate angles of attack. An unsteady solution is obtained with the thin-layer Navier-Stokes code at the highest angle of attack considered. The maximum lift coefficient is overpredicted, however, in comparison to experimental data, which is attributed to the presence of a laminar separation bubble near the leading edge not modeled in the computations. Two comparisons with experimental data are also presented at a higher Mach number.

  5. A Comparison of Three Navier-Stokes Solvers for Exhaust Nozzle Flowfields

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.; Yoder, Dennis A.; Debonis, James R.

    1999-01-01

    A comparison of the NPARC, PAB, and WIND (previously known as NASTD) Navier-Stokes solvers is made for two flow cases with turbulent mixing as the dominant flow characteristic, a two-dimensional ejector nozzle and a Mach 1.5 elliptic jet. The objective of the work is to determine if comparable predictions of nozzle flows can be obtained from different Navier-Stokes codes employed in a multiple-site research program. A single computational grid was constructed for each of the two flows and used for all of the Navier-Stokes solvers. In addition, similar k-ε-based turbulence models were employed in each code, and boundary conditions were specified as similarly as possible across the codes. Comparisons of mass flow rates, velocity profiles, and turbulence model quantities are made between the computations and experimental data. The computational cost of obtaining converged solutions with each of the codes is also documented. Results indicate that all of the codes provided similar predictions for the two nozzle flows. Agreement of the Navier-Stokes calculations with experimental data was good for the ejector nozzle. However, for the Mach 1.5 elliptic jet, the calculations were unable to accurately capture the development of the three-dimensional elliptic mixing layer.

  6. EG and G and NASA face seal codes comparison

    NASA Technical Reports Server (NTRS)

    Basu, Prit

    1994-01-01

    This viewgraph presentation presents the following results for the example comparison: the EG&G code with face deformations suppressed and SPIRALG agree well with each other as well as with the experimental data; 0 rpm stiffness data calculated by the EG&G code are about 70-100 percent lower than those calculated by SPIRALG; there is no appreciable difference between 0 rpm and 16,000 rpm stiffness and damping coefficients calculated by SPIRALG; and the film damping above 500 psig calculated by SPIRALG is much higher than the O-ring secondary seal damping (e.g., 50 lbf·s/in).

  7. Experimental research and comparison of LDPC and RS channel coding in ultraviolet communication systems.

    PubMed

    Wu, Menglong; Han, Dahai; Zhang, Xiang; Zhang, Feng; Zhang, Min; Yue, Guangxin

    2014-03-10

    We have implemented a modified Low-Density Parity-Check (LDPC) codec algorithm in an ultraviolet (UV) communication system. Simulations are conducted with measured parameters to evaluate the LDPC-based UV system performance. Moreover, LDPC (960, 480) and RS (18, 10) codes are implemented and tested on a non-line-of-sight (NLOS) UV test bed. The experimental results are in agreement with the simulations and suggest that, at a given power and a 10^-3 bit error rate (BER), the average communication distance increases by 32% with the RS code and by 78% with the LDPC code in comparison with an uncoded system.
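
    The reported coding gains can be illustrated with a Monte Carlo bit-error-rate estimate. The Python sketch below uses a rate-1/3 repetition code over a binary symmetric channel as a stand-in, since full LDPC(960,480) and RS(18,10) codecs are beyond a short example; the channel error probability is a placeholder:

      import numpy as np

      rng = np.random.default_rng(1)

      def ber_uncoded_vs_coded(p, n_bits=200_000, rep=3):
          """Estimate BER with and without a simple repetition code on a BSC(p)."""
          bits = rng.integers(0, 2, n_bits)
          uncoded = bits ^ (rng.random(n_bits) < p)            # each bit flips w.p. p
          tx = np.repeat(bits, rep)                            # rate-1/3 encoding
          rx = tx ^ (rng.random(tx.size) < p)
          decoded = rx.reshape(-1, rep).sum(axis=1) > rep // 2 # majority vote
          return np.mean(uncoded != bits), np.mean(decoded != bits)

      print(ber_uncoded_vs_coded(p=0.05))   # the coded BER is markedly lower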

  8. An update on the BQCD Hybrid Monte Carlo program

    NASA Astrophysics Data System (ADS)

    Haar, Taylor Ryan; Nakamura, Yoshifumi; Stüben, Hinnerk

    2018-03-01

    We present an update of BQCD, our Hybrid Monte Carlo program for simulating lattice QCD. BQCD is one of the main production codes of the QCDSF collaboration and is used by CSSM and in some Japanese finite temperature and finite density projects. Since the first publication of the code at Lattice 2010 the program has been extended in various ways. New features of the code include: dynamical QED, action modification in order to compute matrix elements by using the Feynman-Hellmann theorem, more trace measurements (like Tr(D^-n) for κ, c_SW, and chemical potential reweighting), a more flexible integration scheme, polynomial filtering, term-splitting for RHMC, and a portable implementation of performance-critical parts employing SIMD.
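
    For readers unfamiliar with the method, the Python sketch below shows the leapfrog integration and Metropolis accept/reject step at the heart of any Hybrid Monte Carlo code, applied to a toy one-dimensional action S(q) = q^2/2; it is a pedagogical illustration, not BQCD's implementation:

      import numpy as np

      rng = np.random.default_rng(2)

      def leapfrog(q, p, grad, eps, n_steps):
          """Reversible, area-preserving integration of the fictitious dynamics."""
          p = p - 0.5 * eps * grad(q)
          for _ in range(n_steps - 1):
              q = q + eps * p
              p = p - eps * grad(q)
          q = q + eps * p
          p = p - 0.5 * eps * grad(q)
          return q, p

      def hmc_step(q, action, grad, eps=0.1, n_steps=10):
          """One trajectory: refresh momentum, integrate, accept or reject."""
          p = rng.normal()
          q_new, p_new = leapfrog(q, p, grad, eps, n_steps)
          dH = (action(q_new) + 0.5 * p_new**2) - (action(q) + 0.5 * p**2)
          return q_new if rng.random() < np.exp(-dH) else q

      samples, q = [], 0.0
      for _ in range(5000):
          q = hmc_step(q, lambda x: 0.5 * x**2, lambda x: x)
          samples.append(q)        # should be distributed as exp(-q^2/2)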

  9. Software Development Processes Applied to Computational Icing Simulation

    NASA Technical Reports Server (NTRS)

    Levinson, Laurie H.; Potapczuk, Mark G.; Mellor, Pamela A.

    1999-01-01

    The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.

  10. The Efficacy of Three Learning Methods Collaborative, Context-Based Learning and Traditional, on Learning, Attitude and Behaviour of Undergraduate Nursing Students: Integrating Theory and Practice.

    PubMed

    Hasanpour-Dehkordi, Ali; Solati, Kamal

    2016-04-01

    Communication skills training, responsibility, respect, and self-awareness are important indexes of changing learning behaviours in modern approaches. The aim of this study was to investigate the efficacy of three learning approaches, collaborative, context-based learning (CBL), and traditional, on learning, attitude, and behaviour of undergraduate nursing students. This study was a clinical trial with a pretest and post-test control-group design. The participants were senior nursing students. The samples were randomly assigned to three groups: CBL, collaborative, and traditional. To gather data, a standard questionnaire of students' behaviour and attitude was administered prior to and after the intervention. Also, the rate of learning was investigated by a researcher-developed questionnaire prior to and after the intervention in the three groups. In the CBL and collaborative training groups, the mean score of behaviour and attitude increased after the intervention. But no significant association was obtained between the mean scores of behaviour and attitude prior to and after the intervention in the traditional group. However, the mean learning score increased significantly in the CBL, collaborative, and traditional groups after the study in comparison to before the study. Both the CBL and collaborative approaches were useful in terms of increased respect, self-awareness, self-evaluation, communication skills, and responsibility, as well as increased motivation and learning score, in comparison to the traditional method.

  11. A literature review of learning collaboratives in mental health care: used but untested.

    PubMed

    Nadeem, Erum; Olin, S Serene; Hill, Laura Campbell; Hoagwood, Kimberly Eaton; Horwitz, Sarah McCue

    2014-09-01

    Policy makers have increasingly turned to learning collaboratives (LCs) as a strategy for improving usual care through the dissemination of evidence-based practices. The purpose of this review was to characterize the state of the evidence for use of LCs in mental health care. A systematic search of major academic databases for peer-reviewed articles on LCs in mental health care generated 421 unique articles across a range of disciplines; 28 mental health articles were selected for full-text review, and 20 articles representing 16 distinct studies met criteria for final inclusion. Articles were coded to identify the LC components reported, the focus of the research, and key findings. Most of the articles included assessments of provider- or patient-level variables at baseline and post-LC. Only one study included a comparison condition. LC targets ranged widely, from use of a depression screening tool to implementation of evidence-based treatments. Fourteen crosscutting LC components (for example, in-person learning sessions, phone meetings, data reporting, leadership involvement, and training in quality improvement methods) were identified. The LCs reviewed reported including, on average, seven components, most commonly in-person learning sessions, plan-do-study-act cycles, multidisciplinary quality improvement teams, and data collection for quality improvement. LCs are being used widely in mental health care, although there is minimal evidence of their effectiveness and unclear reporting in regard to specific components. Rigorous observational and controlled research studies on the impact of LCs on targeted provider- and patient-level outcomes are greatly needed.

  12. A virtual community and cyberinfrastructure for collaboration in volcano research and risk mitigation

    NASA Astrophysics Data System (ADS)

    Valentine, G. A.

    2012-12-01

    VHub (short for VolcanoHub, and accessible at vhub.org) is an online platform for collaboration in research and training related to volcanoes, the hazards they pose, and risk mitigation. The underlying concept is to provide a mechanism that enables workers to share information with colleagues around the globe; VHub and similar hub technologies could prove very powerful for collaborating and communicating about circum-Pacific volcanic hazards. Collaboration occurs around several different points: (1) modeling and simulation; (2) data sharing; (3) education and training; (4) volcano observatories; and (5) project-specific groups. VHub promotes modeling and simulation in two ways: (1) some models can be implemented on VHub for online execution. This eliminates the need to download and compile a code on a local computer. VHub can provide a central "warehouse" for such models that should result in broader dissemination. (2) VHub also provides a platform that supports the more complex CFD models by enabling the sharing of code development and problem-solving knowledge, benchmarking datasets, and the development of validation exercises. VHub also provides a platform for sharing of data and datasets. The VHub development team is implementing the iRODS data sharing middleware (see irods.org). iRODS allows a researcher to access data that are located at participating data sources around the world (a "cloud" of data) as if the data were housed in a single virtual database. Education and training is another important use of the VHub platform. Audio-video recordings of seminars, PowerPoint slide sets, and educational simulations are all items that can be placed onto VHub for use by the community or by selected collaborators. An important point is that the "manager" of a given educational resource (or any other resource, such as a dataset or a model) can control the privacy of that resource, ranging from private (only accessible by, and known to, specific collaborators) to completely public. Materials for use in the classroom can be shared via VHub. VHub is a very useful platform for project-specific collaborations. With a group site on VHub, collaborators share documents, datasets, and maps, and hold ongoing discussions using the discussion board function. VHub is funded by the U.S. National Science Foundation, and is participating in development of larger earth-science cyberinfrastructure initiatives (EarthCube), as well as supporting efforts such as the Global Volcano Model.

  13. Comparison of Space Radiation Calculations from Deterministic and Monte Carlo Transport Codes

    NASA Technical Reports Server (NTRS)

    Adams, J. H.; Lin, Z. W.; Nasser, A. F.; Randeniya, S.; Tripathi, R. K.; Watts, J. W.; Yepes, P.

    2010-01-01

    The presentation outline includes motivation, the radiation transport codes being considered (HZETRN, UPROP, FLUKA, and GEANT4), the space radiation cases being considered (solar particle events and galactic cosmic rays), results for slab geometry, results for spherical geometry, and a summary.

  14. Design implications for task-specific search utilities for retrieval and re-engineering of code

    NASA Astrophysics Data System (ADS)

    Iqbal, Rahat; Grzywaczewski, Adam; Halloran, John; Doctor, Faiyaz; Iqbal, Kashif

    2017-05-01

    The importance of information retrieval systems is unquestionable in modern society, and both individuals and enterprises recognise the benefits of being able to find information effectively. Current code-focused information retrieval systems such as Google Code Search, Codeplex or Koders produce results based on specific keywords. However, these systems do not take into account developers' context such as development language, technology framework, goal of the project, project complexity and developer's domain expertise. They also impose additional cognitive burden on users in switching between different interfaces and clicking through to find the relevant code. Hence, they are not used by software developers. In this paper, we discuss how software engineers interact with information and general-purpose information retrieval systems (e.g. Google, Yahoo!) and investigate to what extent domain-specific search and recommendation utilities can be developed in order to support their work-related activities. To investigate this, we conducted a user study and found that software engineers followed many identifiable and repeatable work tasks and behaviours. These behaviours can be used to develop implicit relevance feedback-based systems based on the observed retention actions. Moreover, we discuss the implications for the development of task-specific search and collaborative recommendation utilities embedded with the Google standard search engine and Microsoft IntelliSense for retrieval and re-engineering of code. Based on implicit relevance feedback, we have implemented a prototype of the proposed collaborative recommendation system, which was evaluated in a controlled environment simulating the real-world situation of professional software engineers. The evaluation has achieved promising initial results on the precision and recall performance of the system.
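
    The retention-action idea can be made concrete with a small sketch. Below, hypothetical action names and weights (invented for illustration, not taken from the paper) feed an implicit relevance signal that re-ranks code search results:

      from collections import defaultdict

      # Hypothetical retention actions and their assumed evidence weights
      RETENTION_WEIGHT = {"copied": 1.0, "edited": 1.5, "bookmarked": 0.5}

      class ImplicitReranker:
          """Re-rank results using implicit feedback from observed retention actions."""
          def __init__(self):
              self.boost = defaultdict(float)

          def record(self, snippet_id, action):
              self.boost[snippet_id] += RETENTION_WEIGHT.get(action, 0.0)

          def rerank(self, results):
              # results: list of (snippet_id, base_score) pairs
              return sorted(results, key=lambda r: r[1] + self.boost[r[0]],
                            reverse=True)

      reranker = ImplicitReranker()
      reranker.record("snippet-42", "copied")    # developer kept this code
      print(reranker.rerank([("snippet-7", 0.9), ("snippet-42", 0.4)]))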

  15. Sustaining Open Source Communities through Hackathons - An Example from the ASPECT Community

    NASA Astrophysics Data System (ADS)

    Heister, T.; Hwang, L.; Bangerth, W.; Kellogg, L. H.

    2016-12-01

    The ecosystem surrounding a successful scientific open source software package combines both social and technical aspects. Much thought has been given to the technology side of writing sustainable software for large infrastructure projects and software libraries, but less to building the human capacity to perpetuate scientific software used in computational modeling. One effective format for building capacity is regular multi-day hackathons. Scientific hackathons bring together a group of science domain users and scientific software contributors to make progress on a specific software package. Innovation comes through the chance to work with established and new collaborations. Especially in the domain sciences with small communities, hackathons give geographically distributed scientists an opportunity to connect face-to-face. They foster lively discussions amongst scientists with different expertise, promote new collaborations, and increase transparency in both the technical and scientific aspects of code development. ASPECT is an open source, parallel, extensible finite element code to simulate thermal convection that began development in 2011 under the Computational Infrastructure for Geodynamics. ASPECT hackathons for the past 3 years have grown the number of authors to >50, training new code maintainers in the process. Hackathons begin with leaders establishing project-specific conventions for development, demonstrating the workflow for code contributions, and reviewing relevant technical skills. Each hackathon expands the developer community. Over 20 scientists add >6,000 lines of code during the >1 week event. Participants grow comfortable contributing to the repository and over half continue to contribute afterwards. A high return rate of participants ensures continuity and stability of the group as well as mentoring for novice members. We hope to build other software communities on this model, but anticipate each to bring its own unique challenges.

  16. STRING: A new drifter for HF radar validation.

    NASA Astrophysics Data System (ADS)

    Rammou, Anna-Maria; Zervakis, Vassilis; Bellomo, Lucio; Kokkini, Zoi; Quentin, Celine; Mantovani, Carlo; Kalampokis, Alkiviadis

    2015-04-01

    High-Frequency radars (HFR) are an effective means of remotely monitoring sea-surface currents, based on recording the Doppler shift of radio waves backscattered from the sea surface. Validation of HFR measurements takes place via comparisons either with in-situ Eulerian velocity data (usually obtained by surface current meters attached to moorings) or with Lagrangian velocity fields (recorded by surface drifters). The most common surface drifter used for this purpose is the CODE-type drifter (Davis, 1985), an industry-standard design to record the vertically averaged velocity of the upper 1 m layer of the water column. In this work we claim that the observed differences between the HFR-derived velocities and Lagrangian measurements can be attributed not just to the different spatial scales recorded by the above instruments, but also to the fact that the HFR-derived velocity corresponds to an exponentially weighted vertical average of the velocity field from the surface to 1 m depth (Stewart and Joy, 1974), whereas the velocity estimated by the CODE drifters corresponds to a boxcar-weighted vertical average, owing to the rectangular shape of the CODE drifters' sails. After analyzing the theoretical behavior of a drifter under the influence of wind and current, we proceed to propose a new design of exponentially shaped sails for the drogues of CODE-based drifters, so that the HFR-derived velocities and the drifter-based velocities will be directly comparable with respect to the vertical averaging of the velocity field. The new drifter, codenamed STRING, exhibits identical behavior to the classical CODE design under relatively homogeneous conditions in the upper 1 m layer; however, it is expected to follow a significantly different track in conditions of high vertical shear and stratification. Thus, we suggest that the new design is the instrument of choice for validation of HFR installations, as it can be used in all conditions and behaves identically to CODEs when vertical shear is insignificant. Finally, we present results from three experiments using both drifter types in HFR-covered regions of the Eastern Mediterranean. More experiments are planned, incorporating design improvements dictated by the results of the preliminary field tests. This work was held in the framework of the project "Specifically Targeted for Radars INnovative Gauge (STRING)", funded by the Greek-French collaboration programme "Plato".
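
    The difference between the two averaging kernels is easy to see numerically. The Python sketch below compares a boxcar average over the top 1 m with an exponentially weighted average for a hypothetical linearly sheared current; the profile and the e-folding depth are illustrative assumptions, not values from the paper:

      import numpy as np

      z = np.linspace(-1.0, 0.0, 1001)      # depth (m), surface at z = 0
      u = 0.20 + 0.10 * z                   # assumed sheared current (m/s)

      # Boxcar average over the upper 1 m (CODE drifter with rectangular sails)
      u_boxcar = np.trapz(u, z) / 1.0

      # Exponentially weighted average (HFR sensing, after Stewart and Joy, 1974)
      d = 0.25                              # assumed e-folding depth (m)
      w = np.exp(z / d)
      u_hfr = np.trapz(w * u, z) / np.trapz(w, z)

      print(u_boxcar, u_hfr)                # the HFR value is surface-weighted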

  17. Comparison of codes assessing galactic cosmic radiation exposure of aircraft crew.

    PubMed

    Bottollier-Depois, J F; Beck, P; Bennett, B; Bennett, L; Bütikofer, R; Clairand, I; Desorgher, L; Dyer, C; Felsberger, E; Flückiger, E; Hands, A; Kindl, P; Latocha, M; Lewis, B; Leuthold, G; Maczka, T; Mares, V; McCall, M J; O'Brien, K; Rollet, S; Rühm, W; Wissmann, F

    2009-10-01

    The assessment of exposure to cosmic radiation onboard aircraft is one of the preoccupations of bodies responsible for radiation protection. The cosmic particle flux is significantly higher onboard aircraft than at ground level, and its intensity depends on the solar activity. The dose is usually estimated using codes validated against experimental data. In this paper, a comparison of various codes, some of which are used routinely, is presented to assess the dose received by aircraft crew from galactic cosmic radiation. Results are provided for periods close to solar maximum and minimum and for selected flights covering major commercial routes in the world. The overall agreement between the codes, particularly for those routinely used for aircraft crew dosimetry, was better than ±20% from the median in all but two cases. The agreement among the codes is considered to be fully satisfactory for radiation protection purposes.
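
    The quoted level of agreement corresponds to a simple calculation: each code's route dose is compared with the median over all codes. A minimal Python sketch with hypothetical dose values (invented, not from the paper):

      import numpy as np

      doses = np.array([34.1, 36.8, 35.2, 33.5, 38.0, 35.9])   # uSv, hypothetical
      median = np.median(doses)
      deviation_pct = 100.0 * (doses - median) / median
      print(np.all(np.abs(deviation_pct) <= 20.0))             # within ±20%?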

  18. Secure ADS-B authentication system and method

    NASA Technical Reports Server (NTRS)

    Viggiano, Marc J (Inventor); Valovage, Edward M (Inventor); Samuelson, Kenneth B (Inventor); Hall, Dana L (Inventor)

    2010-01-01

    A secure system for authenticating the identity of ADS-B systems, including: an authenticator, including a unique id generator and a transmitter transmitting the unique id to one or more ADS-B transmitters; one or more ADS-B transmitters, including a receiver receiving the unique id, one or more secure processing stages merging the unique id with the ADS-B transmitter's identification, data and secret key and generating a secure code identification, and a transmitter transmitting a response containing the secure code and the ADS-B transmitter's data to the authenticator; the authenticator including means for independently determining each ADS-B transmitter's secret key, a receiver receiving each ADS-B transmitter's response, one or more secure processing stages merging the unique id, ADS-B transmitter's identification and data and generating a secure code, and comparison processing comparing the authenticator-generated secure code and the ADS-B transmitter-generated secure code and providing an authentication signal based on the comparison result.
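
    The patent does not name a cryptographic primitive, so the sketch below realizes the challenge-response flow with HMAC-SHA256 from the Python standard library; all identifiers, keys, and payloads are invented for illustration:

      import hashlib
      import hmac
      import os

      def make_challenge():
          """Authenticator: generate and transmit a unique id (nonce)."""
          return os.urandom(16)

      def transmitter_response(secret_key, unique_id, transmitter_id, payload):
          """ADS-B transmitter: merge nonce, identity, and data into a secure code."""
          msg = unique_id + transmitter_id + payload
          return hmac.new(secret_key, msg, hashlib.sha256).digest()

      def authenticate(secret_key, unique_id, transmitter_id, payload, secure_code):
          """Authenticator: recompute the secure code and compare."""
          expected = hmac.new(secret_key, unique_id + transmitter_id + payload,
                              hashlib.sha256).digest()
          return hmac.compare_digest(expected, secure_code)

      key = os.urandom(32)                     # shared secret (hypothetical)
      uid = make_challenge()
      code = transmitter_response(key, uid, b"ABC123", b"lat=34.1,lon=-118.2")
      assert authenticate(key, uid, b"ABC123", b"lat=34.1,lon=-118.2", code)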

  19. Collaboration Expertise in Medicine - No Evidence for Cross-Domain Application from a Memory Retrieval Study.

    PubMed

    Kiesewetter, Jan; Fischer, Frank; Fischer, Martin R

    2016-01-01

    Is there evidence for expertise on collaboration and, if so, is there evidence for cross-domain application? Recall of stimuli was used to measure so-called internal collaboration scripts of novices and experts in two studies. Internal collaboration scripts refer to an individual's knowledge about how to interact with others in a social situation. Method (Study 1): Ten collaboration experts and ten novices of the content domain social science were presented with four pictures of people involved in collaborative activities. The recall texts were coded, distinguishing between superficial and collaboration script information. Results (Study 1): Experts recalled significantly more collaboration script information (M = 25.20; SD = 5.88) than did novices (M = 13.80; SD = 4.47). Differences in superficial information were not found. Study 2 tested whether the differences found in Study 1 could be replicated. Furthermore, the cross-domain application of internal collaboration scripts was explored. Method (Study 2): Twenty collaboration experts and 20 novices of the content domain medicine were presented with four pictures and four videos of their content domain and a video and picture of another content domain. All stimuli showed collaborative activities typical for the respective content domains. Results (Study 2): As in Study 1, experts recalled significantly more collaboration script information of their content domain (M = 71.65; SD = 33.23) than did novices (M = 54.25; SD = 15.01). For the novices, no differences were found for the superficial information nor for the retrieval of collaboration script information recalled after the other content domain stimuli. There is evidence for expertise on collaboration in memory tasks. The results show that experts hold substantially more collaboration script information than do novices. Furthermore, the differences between collaboration novices and collaboration experts occurred only in their own content domain, indicating that internal collaboration scripts are not easily stored and retrieved in memory tasks outside one's own content domain.

  1. User's manual for Axisymmetric Diffuser Duct (ADD) code. Volume 1: General ADD code description

    NASA Technical Reports Server (NTRS)

    Anderson, O. L.; Hankins, G. B., Jr.; Edwards, D. E.

    1982-01-01

    This User's Manual contains a complete description of the computer code known as the AXISYMMETRIC DIFFUSER DUCT code or ADD code. It includes a list of references which describe the formulation of the ADD code and comparisons of calculations with experimental flows. The input/output and general use of the code are described in the first volume. The second volume contains a detailed description of the code including the global structure of the code, a list of FORTRAN variables, and descriptions of the subroutines. The third volume contains a detailed description of the CODUCT code which generates coordinate systems for arbitrary axisymmetric ducts.

  2. Peer review in design: Understanding the impact of collaboration on the review process and student perception

    NASA Astrophysics Data System (ADS)

    Mandala, Mahender Arjun

    A cornerstone of design and design education is frequent situated feedback. With increasing class sizes, and shrinking financial and human resources, providing rich feedback to students becomes increasingly difficult. In the field of writing, web-based peer review--the process of utilizing equal-status learners within a class to provide feedback to each other on their work using networked computing systems--has been shown to be a reliable and valid source of feedback in addition to improving student learning. Designers communicate in myriad ways, using the many languages of design and combining visual and descriptive information. This complex discourse of design intent makes peer reviews by design students ambiguous and often not helpful to the receivers of the feedback. Furthermore, engaging students in the review process itself is often difficult. Teams can complement individual diversity and may help novice designers collectively resolve complex tasks. However, teams often incur production losses and may be affected by individual biases. In the current work, we look at utilizing a collaborative team of reviewers, working collectively and synchronously, to generate web-based peer reviews in a sophomore engineering design class. Students participated in a cross-over design, conducting peer reviews as individuals and as collaborative teams in parallel sequences. Raters coded the feedback generated on the basis of its appropriateness and accuracy. Self-report surveys and passive observation of teams conducting reviews captured student opinion on the process, its value, and the contrasting experience they had conducting team and individual reviews. We found team reviews generated better quality feedback in comparison to individual reviews. Furthermore, students preferred conducting reviews in teams, finding the process 'fun' and engaging. We observed several learning benefits of using collaboration in reviewing, including improved understanding of the assessment criteria, roles, and expectations, and increased team reflection. These results provide insight into how to improve the review process for instructors and researchers, and form a basis for future research work in this area. With respect to facilitating the peer review process in design-based classrooms, we also present recommendations for effective review system design and implementation in classrooms, supported by research and practical experience.

  3. Social construction of physical knowledge of shadows: A study of five preoperational children's perceptions, collaborative experiences, and activities across knowledge domains

    NASA Astrophysics Data System (ADS)

    Smith, Amy M.

    The first purpose of this qualitative case study was to understand the process of social construction of physical knowledge of shadows among preoperational thinkers by examining collaborative behaviors that may lead to new knowledge. The second purpose was to understand children's perspectives concerning the connection between social interaction and learning. The study focused on group collaboration and physical knowledge building as they relate to preoperational thought, a phase of cognitive development in early childhood. The case study consisted of five kindergarten children enrolled in a private, laboratory school at a southern, urban university. Across the eight-week data collection period, the children explored shadows through planned activities on 10 occasions and were interviewed three times in a focus group context. Primary methods for collecting data included videotaping the interviews and participant observations. Data were transcribed and coded inductively to discover emerging patterns while relating these patterns to existing constructivist theories. In addition, field notes, artifacts, and interviews with the children's teacher served to verify the findings. The findings revealed four major themes. Firstly, in terms of collaborative learning, children, while exhibiting a focus on the self, were attracted to learning with each other. Secondly, interactions seldom involved dialogic complexity, revealing minimal rationale, even during conflict. Thirdly, negative behaviors, such as tattling and exclusion, and prosocial behaviors, such as helping, were perceived as integral to the success of social construction of knowledge. The children considered each of these moral behaviors from the personal standpoint of how it affected them emotionally and in accomplishing a learning-related task. Fourthly, in terms of knowledge building, the findings indicated children's knowledge of shadows evolved over time as they participated in a developing scientific community. They experimented in partnerships to answer questions, test theories, solve problems, and communicate their growing understandings to others about such topics as objects "blocking" light, the origin of shadows, and the comparison between the concepts of shadow and reflection. The influence of socio-moral-emotional classroom climates on physical knowledge construction is suggested for future research as well as motivation as it relates to learning in early childhood group contexts.

  4. The Bauschinger Effect in Autofrettaged Tubes- A Comparison of Models Including the ASME Code

    DTIC Science & Technology

    1998-06-01

    possible error in Division 3 of Section VIII of the ASME Boiler and Pressure Vessel Code. They show that the empirical method used in the code to... Discussion presented by DP Kendall: We appreciate the acknowledgement in the Kendall discussion that Division 3 of Section VIII of the ASME Boiler and Pressure Vessel Code may...

  5. Space Radiation Transport Codes: A Comparative Study for Galactic Cosmic Rays Environment

    NASA Astrophysics Data System (ADS)

    Tripathi, Ram; Wilson, John W.; Townsend, Lawrence W.; Gabriel, Tony; Pinsky, Lawrence S.; Slaba, Tony

    For long duration and/or deep space human missions, protection from severe space radiation exposure is a challenging design constraint and may be a potential limiting factor. The space radiation environment consists of galactic cosmic rays (GCR), solar particle events (SPE), and trapped radiation, and includes ions of all the known elements over a very broad energy range. These ions penetrate spacecraft materials, producing nuclear fragments and secondary particles that damage biological tissues, microelectronic devices, and materials. In deep space missions, where the Earth's magnetic field does not provide protection from space radiation, the GCR environment is significantly enhanced due to the absence of the geomagnetic cut-off and is a major component of radiation exposure. Accurate risk assessments critically depend on the accuracy of the input information as well as on the radiation transport codes used, so systematic verification of the codes is necessary. In this study, comparisons are made between the deterministic code HZETRN2006 and the Monte Carlo codes HETC-HEDS and FLUKA for an aluminum shield followed by a water target exposed to the 1977 solar minimum GCR spectrum. The interaction and transport of the high-charge ions present in the GCR environment provide a more stringent test for the comparison of the codes. Dose, dose equivalent, and flux spectra are compared; details of the comparisons will be discussed, and conclusions will be drawn for future directions.

  6. Cracking the Behavior Code

    ERIC Educational Resources Information Center

    Rappaport, Nancy; Minahan, Jessica

    2012-01-01

    When, despite their best efforts, teachers feel defeated by a disruptive student, it seems they're fighting a losing battle. These students often have trouble regulating their emotions, become inflexible and have outbursts, and leave teachers feeling exhausted and incompetent. Through their collaboration, the authors have developed an approach…

  7. Numerical Simulations of 3D Seismic Data Final Report CRADA No. TC02095.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedmann, S. J.; Kostov, C.

    This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and Schlumberger Cambridge Research (SCR), to develop synthetic seismic data sets and supporting codes.

  8. Ethical Issues in Continuing Professional Education.

    ERIC Educational Resources Information Center

    Lawler, Patricia Ann

    2000-01-01

    Continuing professional education practitioners often face ethical dilemmas regarding their obligations to multiple stakeholders and issues arising in new arenas such as the workplace, distance education, and collaboration with business. Codes of ethics can guide practice, but practitioners should also identify their personal core values system…

  9. Compression performance comparison in low delay real-time video for mobile applications

    NASA Astrophysics Data System (ADS)

    Bivolarski, Lazar

    2012-10-01

    This article compares the performance of several current video coding standards under low-delay real-time conditions in a resource-constrained environment. The comparison is performed using the same content and a mix of objective and perceptual quality metrics. The metric results for the different coding schemes are analyzed from the point of view of user perception and quality of service. Multiple standards are compared: MPEG-2, MPEG-4, and MPEG-4 AVC, as well as H.263. The metrics used in the comparison include SSIM, VQM and DVQ. Subjective evaluation and quality of service are discussed from the point of view of perceptual metrics and their incorporation in the coding scheme development process. The performance and the correlation of results are presented as a predictor of the performance of video compression schemes.
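
    As one example of the objective metrics used, SSIM between a reference and a degraded frame can be computed with scikit-image; the frames below are synthetic placeholders:

      import numpy as np
      from skimage.metrics import structural_similarity

      rng = np.random.default_rng(3)
      ref = rng.random((144, 176))                     # hypothetical QCIF frame
      deg = np.clip(ref + 0.05 * rng.normal(size=ref.shape), 0.0, 1.0)

      score = structural_similarity(ref, deg, data_range=1.0)
      print(f"SSIM = {score:.3f}")                     # 1.0 means identical frames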

  10. HESS Opinions: Repeatable research: what hydrologists can learn from the Duke cancer research scandal

    USGS Publications Warehouse

    Fienen, Michael; Bakker, Mark

    2016-01-01

    In the past decade, difficulties encountered in reproducing the results of a cancer study at Duke University resulted in a scandal and an investigation which concluded that tools used for data management, analysis, and modeling were inappropriate for the documentation of the study, let alone the reproduction of the results. New protocols were developed which require that data analysis and modeling be carried out with scripts that can be used to reproduce the results and are a record of all decisions and interpretations made during an analysis or a modeling effort. In the hydrological sciences, we face similar challenges and need to develop similar standards for transparency and repeatability of results. A promising route is to start making use of open-source languages (such as R and Python) to write scripts and to use collaborative coding environments (such as Git) to share our codes for inspection and use by the hydrological community. An important side-benefit to adopting such protocols is consistency and efficiency among collaborators.
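
    One lightweight way to make a scripted analysis self-documenting, in the spirit of the protocols described, is to write a provenance sidecar recording the input data's hash and the parameters used. A minimal Python sketch (file names and parameters are hypothetical):

      import hashlib
      import json
      import sys
      from pathlib import Path

      def write_provenance(data_file, params):
          """Record inputs, parameters, and interpreter version next to the data."""
          digest = hashlib.sha256(Path(data_file).read_bytes()).hexdigest()
          record = {"data_file": data_file, "sha256": digest,
                    "params": params, "python": sys.version}
          Path(data_file + ".provenance.json").write_text(json.dumps(record, indent=2))

      write_provenance("streamflow.csv", {"model": "linear_reservoir", "k_days": 12})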

  11. Real-time Automatic Search for Multi-wavelength Counterparts of DWF Transients

    NASA Astrophysics Data System (ADS)

    Murphy, Christopher; Cucchiara, Antonino; Andreoni, Igor; Cooke, Jeff; Hegarty, Sarah

    2018-01-01

    The Deeper Wider Faster (DWF) survey aims to find and classify the fastest transients in the Universe. DWF utilizes the Dark Energy Camera (DECam), collecting a continuous sequence of 20 s images over a 3 square degree field of view. Once an interesting transient is detected during DWF observations, the DWF collaboration has access to several facilities for rapid follow-up in multiple wavelengths (from gamma rays to radio). An online web tool has been designed to help with real-time visual classification of possible astrophysical transients in data collected by the DWF observing program. The goal of this project is to create a Python-based code to improve the classification process by querying several existing archive databases. Given the DWF transient location and a search radius, the developed code will extract a list of possible counterparts and all available information (e.g. magnitude, radio fluxes, distance separation). Thanks to this tool, the human classifier can make a quicker decision in order to trigger the collaboration's rapid-response resources.
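
    The kind of archive query described can be sketched with astroquery; SIMBAD stands in here for the several archives the tool actually consults, and the coordinates are hypothetical:

      import astropy.units as u
      from astropy.coordinates import SkyCoord
      from astroquery.simbad import Simbad

      def find_counterparts(ra_deg, dec_deg, radius_arcsec=5.0):
          """Cone search around a transient position; returns a table of matches."""
          pos = SkyCoord(ra=ra_deg * u.deg, dec=dec_deg * u.deg)
          return Simbad.query_region(pos, radius=radius_arcsec * u.arcsec)

      matches = find_counterparts(150.1, 2.2)   # hypothetical DWF transient
      print(matches)                            # None if no catalogued counterpart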

  12. Comparisons of survival predictions using survival risk ratios based on International Classification of Diseases, Ninth Revision and Abbreviated Injury Scale trauma diagnosis codes.

    PubMed

    Clarke, John R; Ragone, Andrew V; Greenwald, Lloyd

    2005-09-01

    We conducted a comparison of methods for predicting survival using survival risk ratios (SRRs), including new comparisons based on International Classification of Diseases, Ninth Revision (ICD-9) versus Abbreviated Injury Scale (AIS) six-digit codes. From the Pennsylvania trauma center's registry, all direct trauma admissions were collected through June 22, 1999. Patients with no comorbid medical diagnoses and both ICD-9 and AIS injury codes were used for comparisons based on a single set of data. SRRs for ICD-9 and then for AIS diagnostic codes were each calculated two ways: from the survival rate of patients with each diagnosis and when each diagnosis was an isolated diagnosis. Probabilities of survival for the cohort were calculated using each set of SRRs by the multiplicative ICISS method and, where appropriate, the minimum SRR method. These prediction sets were then internally validated against actual survival by the Hosmer-Lemeshow goodness-of-fit statistic. The 41,364 patients had 1,224 different ICD-9 injury diagnoses in 32,261 combinations and 1,263 corresponding AIS injury diagnoses in 31,755 combinations, ranging from 1 to 27 injuries per patient. All conventional ICD-9-based combinations of SRRs and methods showed better Hosmer-Lemeshow goodness-of-fit than their AIS-based counterparts. The minimum SRR method produced better calibration than the multiplicative methods, presumably because it did not magnify inaccuracies in the SRRs, as can occur with multiplication. Predictions of survival based on anatomic injury alone can be performed using ICD-9 codes, with no advantage from extra coding of AIS diagnoses. Predictions based on the single worst SRR were closer to actual outcomes than those based on multiplying SRRs.
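
    The two prediction rules compared above are simple to state: given a patient's per-diagnosis SRRs, the multiplicative ICISS estimate is their product, while the minimum-SRR method keeps only the single worst value. A short Python sketch with hypothetical SRRs:

      import numpy as np

      def p_survival(srrs, method="multiplicative"):
          """ICISS-style survival probability from survival risk ratios."""
          srrs = np.asarray(srrs, dtype=float)
          if method == "multiplicative":
              return float(np.prod(srrs))    # product over all injuries
          return float(np.min(srrs))         # single worst injury only

      srrs = [0.98, 0.90, 0.75]              # hypothetical patient, three injuries
      print(p_survival(srrs), p_survival(srrs, method="minimum"))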

  13. An Evaluation of Collaborative Interventions to Improve Chronic Illness Care: Framework and Study Design

    ERIC Educational Resources Information Center

    Cretin, Shan; Shortell, Stephen M.; Keeler, Emmett B.

    2004-01-01

    The authors' dual-purpose evaluation assesses the effectiveness of formal collaboratives in stimulating organizational changes to improve chronic illness care (the chronic care model or CCM). Intervention and comparison sites are compared before and after introduction of the CCM. Multiple data sources are used to measure the degree of…

  14. OVERVIEW OF AN INTERLABORATORY COLLABORATION ON EVALUATING THE EFFECTS OF MODEL HEPATOTOXICANTS ON HEPATIC GENE EXPRESSION

    EPA Science Inventory

    Evaluating the Effects of Methapyrilene and Clofibrate on Hepatic Gene Expression: A Collaboration Between Laboratories and a Comparison of Platform and Analytical Approaches

    Roger G. Ulrich1, John C. Rockett2, G. Gordon Gibson3 and Syril Pettit4

    1 Rosetta Inpharmat...

  15. Current Collection in a Magnetic Field

    NASA Technical Reports Server (NTRS)

    Krivorutsky, E. N.

    1997-01-01

    It is found that the upper-bound limit for current collection in the case of a strong magnetic field from the current is close to that given by the Parker-Murphy formula. This conclusion is consistent with the results obtained in laboratory experiments. This limit depends weakly on the shape of the wire. The adiabatic limit in this case will be easily surpassed due to strong magnetic field gradients near the separatrix. The calculations can be done using the kinetic equation in the drift approximation. Analytical results are obtained for the region where the Earth's magnetic field is dominant. The current collection can be calculated (neglecting scattering) using a particle simulation code. Dr. Singh has agreed to collaborate, allowing the use of his particle code. The code can be adapted for the case when the current magnetic field is strong. The time needed for these modifications is 3-4 months. The analytical description and an essential part of the program are prepared for the calculation of the current in the region where the adiabatic description can be used. This was completed with the collaboration of Drs. Khazanov and Liemohn. A scheme for measuring the end-body position is also proposed. The scheme was discussed in the laboratory (with Dr. Stone) and it was concluded that it can be proposed for engineering analysis.

  16. Concatenated coding for low data rate space communications.

    NASA Technical Reports Server (NTRS)

    Chen, C. H.

    1972-01-01

    In deep space communications with distant planets, the data rate as well as the operating SNR may be very low. To maintain the error rate also at a very low level, it is necessary to use a sophisticated coding system (longer code) without excessive decoding complexity. The concatenated coding has been shown to meet such requirements in that the error rate decreases exponentially with the overall length of the code while the decoder complexity increases only algebraically. Three methods of concatenating an inner code with an outer code are considered. Performance comparison of the three concatenated codes is made.
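
    To make the inner/outer structure concrete, here is a toy Python concatenation: a Hamming(7,4) outer code around a rate-1/3 repetition inner code over a binary symmetric channel. It illustrates the principle only and is not one of the three schemes the paper analyzes:

      import numpy as np

      rng = np.random.default_rng(0)

      def hamming_encode(d):
          """Outer Hamming(7,4): data at positions 3,5,6,7; parity at 1,2,4."""
          c = np.zeros(7, dtype=int)
          c[2], c[4], c[5], c[6] = d
          c[0] = c[2] ^ c[4] ^ c[6]
          c[1] = c[2] ^ c[5] ^ c[6]
          c[3] = c[4] ^ c[5] ^ c[6]
          return c

      def hamming_decode(r):
          """Syndrome decoding: corrects any single bit error in a block."""
          s = ((r[0] ^ r[2] ^ r[4] ^ r[6])
               + 2 * (r[1] ^ r[2] ^ r[5] ^ r[6])
               + 4 * (r[3] ^ r[4] ^ r[5] ^ r[6]))
          if s:
              r = r.copy()
              r[s - 1] ^= 1
          return r[[2, 4, 5, 6]]

      def transmit(data, p):
          """Encode outer-then-inner, pass through BSC(p), decode inner-then-outer."""
          coded = np.concatenate([hamming_encode(d) for d in data.reshape(-1, 4)])
          tx = np.repeat(coded, 3)                        # inner repetition code
          rx = tx ^ (rng.random(tx.size) < p)
          inner = (rx.reshape(-1, 3).sum(axis=1) >= 2).astype(int)
          return np.concatenate([hamming_decode(b) for b in inner.reshape(-1, 7)])

      data = rng.integers(0, 2, 4000)
      print("post-decoding BER:", np.mean(transmit(data, p=0.05) != data))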

  17. Facilitators and Barriers to Preparedness Partnerships: A Veterans Affairs Medical Center Perspective.

    PubMed

    Schmitz, Susan; Wyte-Lake, Tamar; Dobalian, Aram

    2017-09-13

    This study sought to understand facilitators and barriers faced by local US Department of Veterans Affairs Medical Center (VAMC) emergency managers (EMs) when collaborating with non-VA entities. Twelve EMs participated in semi-structured interviews lasting 60 to 90 minutes discussing their collaboration with non-VAMC organizations. Sections of the interview transcripts concerning facilitators and barriers to collaboration were coded and analyzed. Common themes were organized into 2 categories: (1) internal (ie, factors affecting collaboration from within VAMCs or by VA policy) and (2) external (ie, interagency or interpersonal factors). Respondents reported a range of facilitators and barriers to collaboration with community-based agencies. Internal factors facilitating collaboration included items such as leadership support. An internal barrier example included lack of clarity surrounding the VAMC's role in community disaster response. External factors noted as facilitators included a shared goal across organizations while a noted barrier was a perception that potential partners viewed a VAMC partnership with skepticism. Federal institutions are important partners for the success of community disaster preparedness and response. Understanding the barriers that VAMCs confront, as well as potential facilitators to collaboration, should enhance the development of VAMC-community partnerships and improve community health resilience. (Disaster Med Public Health Preparedness. 2017; page 1 of 6).

  18. AWOB: A Collaborative Workbench for Astronomers

    NASA Astrophysics Data System (ADS)

    Kim, J. W.; Lemson, G.; Bulatovic, N.; Makarenko, V.; Vogler, A.; Voges, W.; Yao, Y.; Kiefl, R.; Koychev, S.

    2015-09-01

    We present the Astronomers Workbench (AWOB), a web-based collaboration and publication platform for a scientific project of any size, developed in collaboration between the Max Planck Institutes of Astrophysics (MPA) and Extraterrestrial Physics (MPE) and the Max Planck Digital Library (MPDL). AWOB facilitates the collaboration between geographically distributed astronomers working on a common project throughout its whole scientific life cycle. AWOB does so by making it very easy for scientists to set up and manage a collaborative workspace for individual projects, where data can be uploaded and shared. It supports inviting project collaborators, provides wikis, automated mailing lists, calendars and event notification, and has a built-in chat facility. It allows the definition and tracking of tasks within projects and supports easy creation of e-publications for the dissemination of data, images and other resources that cannot be added to submitted papers. AWOB extends the project concept to larger-scale consortia, within which it is possible to manage working groups and sub-projects. The existing AWOB instance has so far been limited to Max Planck members and their collaborators, but will be opened to the whole astronomical community. AWOB is an open-source project and its source code is available upon request. We intend to extend AWOB's functionality to other disciplines as well, and would greatly appreciate contributions from the community.

  19. Three decades of the WHO code and marketing of infant formulas.

    PubMed

    Forsyth, Stewart

    2012-05-01

    The International Code of Marketing of Breast Milk Substitutes states that governments, non-governmental organizations, experts, consumers and industry need to cooperate in activities aimed at improving infant nutrition. However, the evidence from the last three decades is that of a series of disputes, legal proceedings and boycotts. The purpose of this review is to assess the overall progress in the implementation of the Code and to examine the problematic areas of monitoring, compliance and governance. There are continuing issues of implementation, monitoring and compliance which predominantly reflect weak governance. Many Member States have yet to fully implement the Code recommendations and most States do not have adequate monitoring and reporting mechanisms. Application of the Code in developed countries may be undermined by a lack of consensus on the WHO recommendation of 6 months exclusive breastfeeding. There is evidence of continuing conflict and acrimony, especially between non-government organizations and industry. Measures need to be taken to encourage the Member States to implement the Code and to establish the governance systems that will not only ensure effective implementation and monitoring of the Code, but also deliver the Code within a spirit of participation, collaboration and trust.

  20. Online collaboration and model sharing in volcanology via VHub.org

    NASA Astrophysics Data System (ADS)

    Valentine, G.; Patra, A. K.; Bajo, J. V.; Bursik, M. I.; Calder, E.; Carn, S. A.; Charbonnier, S. J.; Connor, C.; Connor, L.; Courtland, L. M.; Gallo, S.; Jones, M.; Palma Lizana, J. L.; Moore-Russo, D.; Renschler, C. S.; Rose, W. I.

    2013-12-01

    VHub (short for VolcanoHub, and accessible at vhub.org) is an online platform for barrier-free access to high-end modeling and simulation and for collaboration in research and training related to volcanoes, the hazards they pose, and risk mitigation. The underlying concept is to provide a platform, building upon the successful HUBzero software infrastructure (hubzero.org), that enables workers to collaborate online and to easily share information, modeling and analysis tools, and educational materials with colleagues around the globe. Collaboration occurs around several different points: (1) modeling and simulation; (2) data sharing; (3) education and training; (4) volcano observatories; and (5) project-specific groups. VHub promotes modeling and simulation in two ways: (1) some models can be implemented on VHub for online execution. VHub can provide a central warehouse for such models that should result in broader dissemination. (2) VHub also provides a platform that supports the more complex CFD models by enabling the sharing of code development and problem-solving knowledge, benchmarking datasets, and the development of validation exercises. VHub also provides a platform for sharing of data and datasets. The VHub development team is implementing the iRODS data sharing middleware (see irods.org). iRODS allows a researcher to access data that are located at participating data sources around the world (a cloud of data) as if the data were housed in a single virtual database. Projects associated with VHub are also going to introduce the use of data-driven workflow tools to support multistage analysis processes where computing and data are integrated for model validation, hazard analysis, etc. Audio-video recordings of seminars, PowerPoint slide sets, and educational simulations are all items that can be placed onto VHub for use by the community or by selected collaborators. An important point is that the manager of a given educational resource (or any other resource, such as a dataset or a model) can control the privacy of that resource, ranging from private (only accessible by, and known to, specific collaborators) to completely public. VHub is a very useful platform for project-specific collaborations. With a group site on VHub, collaborators share documents, datasets, and maps, and hold ongoing discussions using the discussion board function. VHub is funded by the U.S. National Science Foundation, and is participating in development of larger earth-science cyberinfrastructure initiatives (EarthCube), as well as supporting efforts such as the Global Volcano Model. Emerging VHub-facilitated efforts include model benchmarking, collaborative code development, and growth in online modeling tools.

  1. Comparing Two Inquiry Professional Development Interventions in Science on Primary Students' Questioning and Other Inquiry Behaviours

    NASA Astrophysics Data System (ADS)

    Nichols, Kim; Burgh, Gilbert; Kennedy, Callie

    2017-02-01

    Developing students' skills to pose and respond to questions and actively engage in inquiry behaviours enables students to problem solve and critically engage with learning and society. The aim of this study was to analyse the impact of providing teachers with an intervention in inquiry pedagogy alongside inquiry science curriculum in comparison to an intervention in non-inquiry pedagogy alongside inquiry science curriculum on student questioning and other inquiry behaviours. Teacher participants in the comparison condition received training in four inquiry-based science units and in collaborative strategic reading. The experimental group, the community of inquiry (COI) condition, received training in facilitating a COI in addition to training in the same four inquiry-based science units. This study involved 227 students and 18 teachers in 9 primary schools across Brisbane, Australia. The teachers were randomly allocated by school to one of the two conditions. The study followed the students across years 6 and 7 and students' discourse during small group activities was recorded, transcribed and coded for verbal inquiry behaviours. In the second year of the study, students in the COI condition demonstrated a significantly higher frequency of procedural and substantive higher-order thinking questions and other inquiry behaviours than those in the comparison condition. Implementing a COI within an inquiry science curriculum develops students' questioning and science inquiry behaviours and allows teachers to foster inquiry skills predicated by the Australian Science Curriculum. Provision of inquiry science curriculum resources alone is not sufficient to promote the questioning and other verbal inquiry behaviours predicated by the Australian Science Curriculum.

  2. Application of MCT Failure Criterion using EFM

    DTIC Science & Technology

    2010-03-26

    because HELIUS:MCT™ does not facilitate this. Attempts have been made to use the ABAQUS native thermal expansion model in addition to HELIUS:MCT™... The Element-Failure Method (EFM) with the MCT failure criterion has been implemented in ABAQUS using a user-defined element subroutine (EFM). Comparisons have been made between the analysis results using the EFM-MCT code and the HELIUS:MCT™ code...

  3. Comment on ‘egs_brachy: a versatile and fast Monte Carlo code for brachytherapy’

    NASA Astrophysics Data System (ADS)

    Yegin, Gultekin

    2018-02-01

    In a recent paper, Chamberland et al (2016 Phys. Med. Biol. 61 8214) develop a new Monte Carlo code called egs_brachy for brachytherapy treatments. It is based on EGSnrc and written in the C++ programming language. In order to benchmark the egs_brachy code, the authors use it in various test case scenarios in which complex geometry conditions exist. Another EGSnrc-based brachytherapy dose calculation engine, BrachyDose, is used for dose comparisons. The authors fail to prove that egs_brachy can produce reasonable dose values for brachytherapy sources in a given medium. The dose comparisons in the paper are erroneous and misleading. egs_brachy should not be used in any further research studies unless and until all the potential bugs are fixed in the code.

  4. Special and General Education Biology Teachers Working Together Collaboratively

    NASA Astrophysics Data System (ADS)

    Gagne-Grosso, Melissa

    Collaborative teaching, between special education and general education teachers working together, came about as a result of the No Child Left Behind and Individuals with Disabilities Education Acts. Despite the positive intentions of those acts, teachers are not always ready to teach collaboratively. Guided by the theories of fundamental change and inclusion, this study was based on a lack of understanding about collaborative teaching at 3 high schools. The research questions focused on the benefits, process, and concerns related to collaborative teaching. The perspectives of 4 special education and 8 regular education teachers in 3 urban, public high schools were collected through interviews and observations. Data were analyzed descriptively and inductively using coding, reconstruction, and interpretation of the underlying meanings. The findings revealed that teachers benefitted from being in these classrooms by having a reduced work load and shared responsibility; however, they needed more time for collaboration and modifying instruction, professional development, and stronger support. Students in these classrooms benefitted from social interactions with other students and by getting direct answers to questions. Based on these findings, a professional development training was created based on how collaborative teachers can work together to promote successful learning. This project study can have a positive impact by assisting collaborative teachers with support, communication, strategies for modifications and accommodations, and an enhanced experience, and additionally by improving the academic outcomes for their students.

  5. Qualitative study to conceptualise a model of interprofessional collaboration between pharmacists and general practitioners to support patients' adherence to medication

    PubMed Central

    Rathbone, Adam P; Mansoor, Sarab M; Krass, Ines; Hamrosi, Kim; Aslani, Parisa

    2016-01-01

Objectives Pharmacists and general practitioners (GPs) face an increasing expectation to collaborate interprofessionally on a number of healthcare issues, including medication non-adherence. This study aimed to propose a model of interprofessional collaboration within the context of identifying and improving medication non-adherence in primary care. Setting Primary care; Sydney, Australia. Participants 3 focus groups were conducted with pharmacists (n=23) and 3 with GPs (n=22) working in primary care. Primary and secondary outcome measures Qualitative investigation of GP and pharmacist interactions with each other, specifically around supporting their patients' medication adherence. Audio-recordings were transcribed verbatim and transcripts thematically analysed using a combination of manual and computer coding. Results 3 themes pertaining to interprofessional collaboration were identified: (1) frequency, (2) co-collaborators and (3) nature of communication, which included 2 subthemes (method of communication and type of communication). While the frequency of interactions was low, the majority were conducted by telephone. Interactions, especially those conducted face-to-face, were positive. Only a few related to patient non-adherence. The findings are positioned within contemporary collaborative theory and provide an accessible introduction to models of interprofessional collaboration. Conclusions This work highlighted that successful collaboration to improve medication adherence was underpinned by shared paradigmatic perspectives and trust, constructed through regular, face-to-face interactions between pharmacists and GPs. PMID:26983948

  6. Collaborative Studies of Polar Cap Ionospheric Dynamics.

    DTIC Science & Technology

    1987-10-12

…housing and the 3-stage thermoelectric cooler for the image plane detector. The operational principles that govern the application of the instrument are… [Remainder of the record is OCR residue of the report documentation page; the recoverable fragments identify the sponsor as the Air Force Geophysics Laboratory, Hanscom AFB, Massachusetts 01731, and the performer in Ann Arbor, Michigan.]

  7. NPS-NRL-Rice-UIUC Collaboration on Navy Atmosphere-Ocean Coupled Models on Many-Core Computer Architectures Annual Report

    DTIC Science & Technology

    2014-09-30

…portability is difficult to achieve on future supercomputers that use various types of accelerators (GPUs, Xeon Phi, SIMD, etc.). All of these… bottlenecks of NUMA. For example, in the CG code the state vector was originally stored as q(1:Nvar, 1:Npoin), where Nvar is the number of… a Global Grid Point (GGP) storage. On the other hand, in the DG code the state vector is typically stored as q(1:Nvar, 1:Npts, 1:Nelem), where
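
    The two storage layouts quoted in this snippet imply very different memory-access patterns. As a rough illustration only (NumPy standing in for the Fortran-style arrays, with invented sizes and a random connectivity map), the contrast between continuous-Galerkin (CG) global-grid-point storage and discontinuous-Galerkin (DG) element-local storage might be sketched as:

        import numpy as np

        # Hypothetical sizes, for illustration only.
        Nvar, Npoin = 5, 10_000   # CG: variables x global grid points
        Npts, Nelem = 8, 1_250    # DG: points per element x elements

        # CG-style "Global Grid Point" (GGP) storage: q_cg[v, p]
        q_cg = np.zeros((Nvar, Npoin), order='F')

        # DG-style element-local storage: q_dg[v, i, e]; in Fortran
        # (column-major) order, q_dg[:, :, e] is one contiguous block.
        q_dg = np.zeros((Nvar, Npts, Nelem), order='F')

        e = 42
        element_state = q_dg[:, :, e]          # compact slice, shape (Nvar, Npts)

        # In the CG layout, an element reaches its points through an
        # element-to-node connectivity map, so accesses scatter in memory:
        connectivity = np.random.randint(0, Npoin, size=(Nelem, Npts))
        gathered = q_cg[:, connectivity[e]]    # indirect gather, shape (Nvar, Npts)

    The point of the sketch is only that the DG layout keeps each element's working set contiguous, while the CG layout reaches shared nodes through an indirect gather, which is one way NUMA bottlenecks of the kind the report mentions can arise.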

  8. An improved algorithm for evaluating trellis phase codes

    NASA Technical Reports Server (NTRS)

    Mulligan, M. G.; Wilson, S. G.

    1982-01-01

A method is described for evaluating the minimum distance parameters of trellis phase codes, including CPFSK, partial response FM, and, more importantly, coded CPM (continuous phase modulation) schemes. The algorithm provides dramatically faster execution times and lower memory requirements than previous algorithms. Results of sample calculations and timing comparisons are included.

  9. An improved algorithm for evaluating trellis phase codes

    NASA Technical Reports Server (NTRS)

    Mulligan, M. G.; Wilson, S. G.

    1984-01-01

A method is described for evaluating the minimum distance parameters of trellis phase codes, including CPFSK, partial response FM, and, more importantly, coded CPM (continuous phase modulation) schemes. The algorithm provides dramatically faster execution times and lower memory requirements than previous algorithms. Results of sample calculations and timing comparisons are included.
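
    For orientation, the quantity such algorithms compute can be approximated by brute force. The sketch below is a deliberately naive baseline, not the authors' improved algorithm: it estimates the minimum normalized squared Euclidean distance of binary CPFSK by enumerating short pairs of phase paths, with the modulation index, observation length, and sampling density invented for illustration.

        import itertools
        import numpy as np

        h = 0.5      # modulation index (MSK-like); illustrative choice
        N = 5        # symbol intervals searched by brute force
        ns = 64      # samples per symbol interval
        t = np.arange(ns) / ns

        def phase_path(bits):
            # Continuous phase of binary CPFSK (+/-1 data, unit symbol time).
            segments, acc = [], 0.0
            for a in bits:
                segments.append(acc + np.pi * h * a * t)  # linear phase ramp
                acc += np.pi * h * a                      # phase carried forward
            return np.concatenate(segments)

        best = np.inf
        for a in itertools.product((-1, 1), repeat=N):
            for b in itertools.product((-1, 1), repeat=N):
                if a[0] == b[0]:
                    continue  # distance events can be shifted to start at symbol 0
                dphi = phase_path(a) - phase_path(b)
                d2 = np.mean(1.0 - np.cos(dphi)) * N  # (1/T) * integral of (1 - cos)
                best = min(best, d2)

        # With this normalization antipodal BPSK gives d^2 = 2; MSK approaches 2.
        print(f"estimated d_min^2 over {N} intervals: {best:.3f}")

    The paper's contribution is precisely to avoid this exponential enumeration while computing the same minimum-distance parameters.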

  10. Lessons Learned from a Cross-Model Validation between a Discrete Event Simulation Model and a Cohort State-Transition Model for Personalized Breast Cancer Treatment.

    PubMed

    Jahn, Beate; Rochau, Ursula; Kurzthaler, Christina; Paulden, Mike; Kluibenschädl, Martina; Arvandi, Marjan; Kühne, Felicitas; Goehler, Alexander; Krahn, Murray D; Siebert, Uwe

    2016-04-01

    Breast cancer is the most common malignancy among women in developed countries. We developed a model (the Oncotyrol breast cancer outcomes model) to evaluate the cost-effectiveness of a 21-gene assay when used in combination with Adjuvant! Online to support personalized decisions about the use of adjuvant chemotherapy. The goal of this study was to perform a cross-model validation. The Oncotyrol model evaluates the 21-gene assay by simulating a hypothetical cohort of 50-year-old women over a lifetime horizon using discrete event simulation. Primary model outcomes were life-years, quality-adjusted life-years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs). We followed the International Society for Pharmacoeconomics and Outcomes Research-Society for Medical Decision Making (ISPOR-SMDM) best practice recommendations for validation and compared modeling results of the Oncotyrol model with the state-transition model developed by the Toronto Health Economics and Technology Assessment (THETA) Collaborative. Both models were populated with Canadian THETA model parameters, and outputs were compared. The differences between the models varied among the different validation end points. The smallest relative differences were in costs, and the greatest were in QALYs. All relative differences were less than 1.2%. The cost-effectiveness plane showed that small differences in the model structure can lead to different sets of nondominated test-treatment strategies with different efficiency frontiers. We faced several challenges: distinguishing between differences in outcomes due to different modeling techniques and initial coding errors, defining meaningful differences, and selecting measures and statistics for comparison (means, distributions, multivariate outcomes). Cross-model validation was crucial to identify and correct coding errors and to explain differences in model outcomes. In our comparison, small differences in either QALYs or costs led to changes in ICERs because of changes in the set of dominated and nondominated strategies. © The Author(s) 2015.
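
    The reshuffling of nondominated strategies that the authors describe is mechanical: once each strategy has a cost and a QALY estimate, dominated and extendedly dominated options are pruned and ICERs are computed along the efficiency frontier. A minimal sketch with invented strategy names and numbers (not THETA or Oncotyrol outputs):

        def efficiency_frontier(strategies):
            """strategies: list of (name, cost, qaly). Returns frontier with ICERs."""
            s = sorted(strategies, key=lambda x: (x[1], -x[2]))  # by cost ascending
            frontier = []
            for name, cost, qaly in s:
                # strict dominance: costs more (or the same) yet yields no more QALYs
                if frontier and qaly <= frontier[-1][2]:
                    continue
                frontier.append([name, cost, qaly])
                # extended dominance: drop middle points with non-increasing ICERs
                while len(frontier) >= 3:
                    (n0, c0, q0), (n1, c1, q1), (n2, c2, q2) = frontier[-3:]
                    if (c2 - c1) / (q2 - q1) <= (c1 - c0) / (q1 - q0):
                        frontier.pop(-2)   # middle strategy extendedly dominated
                    else:
                        break
            # attach the ICER of each strategy versus its frontier predecessor
            out, prev = [], None
            for name, cost, qaly in frontier:
                icer = None if prev is None else (cost - prev[0]) / (qaly - prev[1])
                out.append((name, cost, qaly, icer))
                prev = (cost, qaly)
            return out

        # Invented numbers: a small QALY perturbation flips which strategy is dominated.
        print(efficiency_frontier([("A", 10_000, 8.0), ("B", 12_000, 8.1), ("C", 15_000, 8.6)]))
        print(efficiency_frontier([("A", 10_000, 8.0), ("B", 12_000, 8.4), ("C", 15_000, 8.6)]))

    Running the two invented scenarios shows how a 0.3-QALY shift in one strategy moves it from extendedly dominated onto the frontier, changing every downstream ICER; this is the sensitivity the authors observed.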

  11. Overlooking the obvious: a meta-analytic comparison of digit symbol coding tasks and other cognitive measures in schizophrenia.

    PubMed

    Dickinson, Dwight; Ramsey, Mary E; Gold, James M

    2007-05-01

    In focusing on potentially localizable cognitive impairments, the schizophrenia meta-analytic literature has overlooked the largest single impairment: on digit symbol coding tasks. To compare the magnitude of the schizophrenia impairment on coding tasks with impairments on other traditional neuropsychological instruments. MEDLINE and PsycINFO electronic databases and reference lists from identified articles. English-language studies from 1990 to present, comparing performance of patients with schizophrenia and healthy controls on coding tasks and cognitive measures representing at least 2 other cognitive domains. Of 182 studies identified, 40 met all criteria for inclusion in the meta-analysis. Means, standard deviations, and sample sizes were extracted for digit symbol coding and 36 other cognitive variables. In addition, we recorded potential clinical moderator variables, including chronicity/severity, medication status, age, and education, and potential study design moderators, including coding task variant, matching, and study publication date. Main analyses synthesized data from 37 studies comprising 1961 patients with schizophrenia and 1444 comparison subjects. Combination of mean effect sizes across studies by means of a random effects model yielded a weighted mean effect for digit symbol coding of g = -1.57 (95% confidence interval, -1.66 to -1.48). This effect compared with a grand mean effect of g = -0.98 and was significantly larger than effects for widely used measures of episodic memory, executive functioning, and working memory. Moderator variable analyses indicated that clinical and study design differences between studies had little effect on the coding task effect. Comparison with previous meta-analyses suggested that current results were representative of the broader literature. Subsidiary analysis of data from relatives of patients with schizophrenia also suggested prominent coding task impairments in this group. The 5-minute digit symbol coding task, reliable and easy to administer, taps an information processing inefficiency that is a central feature of the cognitive deficit in schizophrenia and deserves systematic investigation.
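
    The pooling step reported here, a random-effects combination of Hedges g values, is commonly done with the DerSimonian-Laird estimator. A sketch under that assumption (not necessarily the authors' exact estimator), with invented per-study effects and variances rather than the paper's data:

        import numpy as np

        def dersimonian_laird(g, v):
            """Random-effects pooled effect from per-study Hedges g and variances v."""
            g, v = np.asarray(g, float), np.asarray(v, float)
            w = 1.0 / v                                   # fixed-effect weights
            g_fe = np.sum(w * g) / np.sum(w)
            q = np.sum(w * (g - g_fe) ** 2)               # Cochran's Q
            df = len(g) - 1
            c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
            tau2 = max(0.0, (q - df) / c)                 # between-study variance
            w_re = 1.0 / (v + tau2)                       # random-effects weights
            g_re = np.sum(w_re * g) / np.sum(w_re)
            se = np.sqrt(1.0 / np.sum(w_re))
            return g_re, (g_re - 1.96 * se, g_re + 1.96 * se)

        # Invented effect sizes and variances for three hypothetical studies:
        print(dersimonian_laird([-1.4, -1.7, -1.5], [0.02, 0.03, 0.025]))

    The returned pooled g and 95% confidence interval are the analogues of the g = -1.57 (-1.66 to -1.48) reported above.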

  12. Collaborative Research: Simulation of Beam-Electron Cloud Interactions in Circular Accelerators Using Plasma Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katsouleas, Thomas; Decyk, Viktor

Final Report for grant DE-FG02-06ER54888, "Simulation of Beam-Electron Cloud Interactions in Circular Accelerators Using Plasma Models," Viktor K. Decyk, University of California, Los Angeles, CA 90095-1547. The primary goal of this collaborative proposal was to modify the code QuickPIC and apply it to study the long-time stability of beam propagation in low density electron clouds present in circular accelerators. The UCLA contribution to this collaborative proposal was in supporting the development of the pipelining scheme for the QuickPIC code, which extended the parallel scaling of this code by two orders of magnitude. The USC work described here was the PhD research of Ms. Bing Feng, lead author of reference [2] below, who performed the research at USC under the guidance of the PI Tom Katsouleas and in collaboration with Dr. Decyk. The QuickPIC code [1] is a multi-scale Particle-in-Cell (PIC) code. The outer 3D code contains a beam which propagates through a long region of plasma and evolves slowly. The plasma response to this beam is modeled by slices of a 2D plasma code. This plasma response is then fed back to the beam code, and the process repeats. The pipelining is based on the observation that once the beam has passed a 2D slice, the slice's response can be fed back to the beam immediately, without waiting for the beam to pass all the other slices. Thus independent blocks of 2D slices from different time steps can be running simultaneously. The major difficulty arose when particles at the edges of a block needed to communicate with other blocks. Two versions of the pipelining scheme were developed, one for the full quasi-static code and the other for the basic quasi-static code used by this e-cloud proposal. Details of the pipelining scheme were published in [2]. The new version of QuickPIC was able to run with more than 1,000 processors, and was successfully applied in modeling e-clouds by our collaborators in this proposal [3-8]. Jean-Luc Vay at Lawrence Berkeley National Lab later implemented a similar basic quasi-static scheme including pipelining in the code WARP [9] and found good to very good quantitative agreement between the two codes in modeling e-clouds. References [1] C. Huang, V. K. Decyk, C. Ren, M. Zhou, W. Lu, W. B. Mori, J. H. Cooley, T. M. Antonsen, Jr., and T. Katsouleas, "QUICKPIC: A highly efficient particle-in-cell code for modeling wakefield acceleration in plasmas," J. Computational Phys. 217, 658 (2006). [2] B. Feng, C. Huang, V. K. Decyk, W. B. Mori, P. Muggli, and T. Katsouleas, "Enhancing parallel quasi-static particle-in-cell simulations with a pipelining algorithm," J. Computational Phys. 228, 5430 (2009). [3] C. Huang, V. K. Decyk, M. Zhou, W. Lu, W. B. Mori, J. H. Cooley, T. M. Antonsen, Jr., B. Feng, T. Katsouleas, J. Vieira, and L. O. Silva, "QUICKPIC: A highly efficient fully parallelized PIC code for plasma-based acceleration," Proc. of the SciDAC 2006 Conf., Denver, Colorado, June, 2006 [Journal of Physics: Conference Series, W. M. Tang, Editor, vol. 46, Institute of Physics, Bristol and Philadelphia, 2006], p. 190. [4] B. Feng, C. Huang, V. Decyk, W. B. Mori, T. Katsouleas, P. Muggli, "Enhancing Plasma Wakefield and E-cloud Simulation Performance Using a Pipelining Algorithm," Proc. 12th Workshop on Advanced Accelerator Concepts, Lake Geneva, WI, July, 2006, p. 201 [AIP Conf. Proceedings, vol. 877, Melville, NY, 2006]. [5] B. Feng, P. Muggli, T. Katsouleas, V. Decyk, C. Huang, and W.
Mori, "Long Time Electron Cloud Instability Simulation Using QuickPIC with Pipelining Algorithm," Proc. of the 2007 Particle Accelerator Conference, Albuquerque, NM, June, 2007, p. 3615. [6] B. Feng, C. Huang, V. Decyk, W. B. Mori, G. H. Hoffstaetter, P. Muggli, T. Katsouleas, "Simulation of Electron Cloud Effects on Electron Beam at ERL with Pipelined QuickPIC," Proc. 13th Workshop on Advanced Accelerator Concepts, Santa Cruz, CA, July-August, 2008, p. 340 [AIP Conf. Proceedings, vol. 1086, Melville, NY, 2008]. [7] B. Feng, C. Huang, V. K. Decyk, W. B. Mori, P. Muggli, and T. Katsouleas, "Enhancing parallel quasi-static particle-in-cell simulations with a pipelining algorithm," J. Computational Phys. 228, 5430 (2009). [8] C. Huang, W. An, V. K. Decyk, W. Lu, W. B. Mori, F. S. Tsung, M. Tzoufras, S. Morshed, T. Antonsen, B. Feng, T. Katsouleas, R. A. Fonseca, S. F. Martins, J. Vieira, L. O. Silva, E. Esarey, C. G. R. Geddes, W. P. Leemans, E. Cormier-Michel, J.-L. Vay, D. L. Bruhwiler, B. Cowan, J. R. Cary, and K. Paul, "Recent results and future challenges for large scale particle-in-cell simulations of plasma-based accelerator concepts," Proc. of the SciDAC 2009 Conf., San Diego, CA, June, 2009 [Journal of Physics: Conference Series, vol. 180, Institute of Physics, Bristol and Philadelphia, 2009], p. 012005. [9] J.-L. Vay, C. M. Celata, M. A. Furman, G. Penn, M. Venturini, D. P. Grote, and K. G. Sonnad, "Update on Electron-Cloud Simulations Using the Package WARP-POSINST," Proc. of the 2009 Particle Accelerator Conference PAC09, Vancouver, Canada, June, 2009, paper FR5RFP078.
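
    A toy scheduling model makes the pipelining argument concrete. Assuming unit cost per (beam step, slice block) task and the dependency structure described above (a block needs the previous block of the same step and its own result from the previous step), the makespan drops from steps × blocks to roughly steps + blocks; the granularity and costs here are invented, not QuickPIC's:

        # Earliest-start schedule for a pipelined quasi-static sweep (sketch).
        def makespan(steps, blocks, pipelined=True):
            finish = {}
            for t in range(steps):
                for s in range(blocks):
                    deps = [finish.get((t, s - 1), 0), finish.get((t - 1, s), 0)]
                    if not pipelined:  # serial: wait for *all* blocks of step t-1
                        deps.append(finish.get((t - 1, blocks - 1), 0))
                    finish[(t, s)] = max(deps) + 1   # unit cost per task
            return finish[(steps - 1, blocks - 1)]

        steps, blocks = 100, 16
        print("serial sweeps:", makespan(steps, blocks, pipelined=False))  # 1600
        print("pipelined    :", makespan(steps, blocks, pipelined=True))   # 115

    With enough processors to keep the independent blocks busy, the pipelined makespan approaches steps + blocks - 1, which is the mechanism behind the extended parallel scaling described in the report.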

  13. The Joint Physics Analysis Center: Recent results

    NASA Astrophysics Data System (ADS)

    Fernández-Ramírez, César

    2016-10-01

We review some of the recent achievements of the Joint Physics Analysis Center, a theoretical collaboration with ties to experimental collaborations, that aims to provide amplitudes suitable for the analysis of current and forthcoming experimental data on hadron physics. Since its foundation in 2013, the group has focused on hadron spectroscopy in preparation for the forthcoming high-statistics and high-precision experimental data from the Belle II, BESIII, CLAS12, COMPASS, GlueX, LHCb and (hopefully) PANDA collaborations. So far, we have developed amplitudes for πN scattering, KN scattering, pion and J/ψ photoproduction, two-kaon photoproduction and three-body decays of light mesons (η, ω, ϕ). The codes for the amplitudes are available to download from the group web page and can be straightforwardly incorporated into the analysis of experimental data.

  14. sbtools: A package connecting R to cloud-based data for collaborative online research

    USGS Publications Warehouse

    Winslow, Luke; Chamberlain, Scott; Appling, Alison P.; Read, Jordan S.

    2016-01-01

    The adoption of high-quality tools for collaboration and reproducible research such as R and Github is becoming more common in many research fields. While Github and other version management systems are excellent resources, they were originally designed to handle code and scale poorly to large text-based or binary datasets. A number of scientific data repositories are coming online and are often focused on dataset archival and publication. To handle collaborative workflows using large scientific datasets, there is increasing need to connect cloud-based online data storage to R. In this article, we describe how the new R package sbtools enables direct access to the advanced online data functionality provided by ScienceBase, the U.S. Geological Survey’s online scientific data storage platform.

  15. Systematic Comparison of Photoionized Plasma Codes with Application to Spectroscopic Studies of AGN in X-Rays

    NASA Technical Reports Server (NTRS)

    Mehdipour, M.; Kaastra, J. S.; Kallman, T.

    2016-01-01

Atomic data and plasma models play a crucial role in the diagnosis and interpretation of astrophysical spectra, thus influencing our understanding of the Universe. In this investigation we present a systematic comparison of the leading photoionization codes to determine how much their intrinsic differences impact X-ray spectroscopic studies of hot plasmas in photoionization equilibrium. We carry out our computations using the Cloudy, SPEX, and XSTAR photoionization codes, and compare their derived thermal and ionization states for various ionizing spectral energy distributions. We examine the resulting absorption-line spectra from these codes for the case of ionized outflows in active galactic nuclei. By comparing the ionic abundances as a function of ionization parameter ξ, we find that on average there is about 30% deviation between the codes in the ξ at which ionic abundances peak. For H-like to B-like sequence ions alone, this deviation in ξ is smaller, at about 10% on average. The comparison of the absorption-line spectra in the X-ray band shows that there is on average about 30% deviation between the codes in the optical depth of the lines produced at log ξ ≈ 1 to 2, reducing to about 20% deviation at log ξ ≈ 3. We also simulate spectra of the ionized outflows with the current and upcoming high-resolution X-ray spectrometers, on board XMM-Newton, Chandra, Hitomi, and Athena. From these simulations we obtain the deviation on the best-fit model parameters, arising from the use of different photoionization codes, which is about 10% to 40%. We compare the modeling uncertainties with the observational uncertainties from the simulations. The results highlight the importance of continuous development and enhancement of photoionization codes for the upcoming era of X-ray astronomy with Athena.

  16. Linear and nonlinear verification of gyrokinetic microstability codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, R. V.; Candy, J.; Barnes, M.

    2011-12-15

Verification of nonlinear microstability codes is a necessary step before comparisons or predictions of turbulent transport in toroidal devices can be justified. By verification we mean demonstrating that a code correctly solves the mathematical model upon which it is based. Some degree of verification can be accomplished indirectly from analytical instability threshold conditions, nonlinear saturation estimates, etc., for relatively simple plasmas. However, verification for experimentally relevant plasma conditions and physics is beyond the realm of analytical treatment and must rely on code-to-code comparisons, i.e., benchmarking. The premise is that the codes are verified for a given problem or set of parameters if they all agree within a specified tolerance. True verification requires comparisons for a number of plasma conditions, e.g., different devices, discharges, times, and radii. Running the codes and keeping track of linear and nonlinear inputs and results for all conditions could be prohibitive unless there was some degree of automation. We have written software to do just this and have formulated a metric for assessing agreement of nonlinear simulations. We present comparisons, both linear and nonlinear, between the gyrokinetic codes GYRO [J. Candy and R. E. Waltz, J. Comput. Phys. 186, 545 (2003)] and GS2 [W. Dorland, F. Jenko, M. Kotschenreuther, and B. N. Rogers, Phys. Rev. Lett. 85, 5579 (2000)]. We do so at the mid-radius for the same discharge as in earlier work [C. Holland, A. E. White, G. R. McKee, M. W. Shafer, J. Candy, R. E. Waltz, L. Schmitz, and G. R. Tynan, Phys. Plasmas 16, 052301 (2009)]. The comparisons include electromagnetic fluctuations, passing and trapped electrons, plasma shaping, one kinetic impurity, and finite Debye-length effects. Results neglecting and including electron collisions (Lorentz model) are presented. We find that the linear frequencies with or without collisions agree well between codes, as do the time averages of the nonlinear fluxes without collisions. With collisions, the differences between the time-averaged fluxes are larger than the uncertainties defined as the oscillations of the fluxes, with the GS2 fluxes consistently larger (or more positive) than those from GYRO. However, the electrostatic fluxes are much smaller than those without collisions (the electromagnetic energy flux is negligible in both cases). In fact, except for the electron energy fluxes, the absolute magnitudes of the differences in fluxes with collisions are the same or smaller than those without. None of the fluxes exhibit large absolute differences between codes. Beyond these results, the specific linear and nonlinear benchmarks proposed here, as well as the underlying methodology, provide the basis for a wide variety of future verification efforts.
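
    The agreement criterion sketched in the abstract (time-averaged fluxes differing by less than the fluxes' own oscillation levels) can be written down directly. One plausible reading, with the uncertainty taken as the standard deviation of each saturated-phase trace (the paper's precise metric may differ) and invented traces standing in for the two codes:

        import numpy as np

        def flux_agreement(flux_a, flux_b):
            # Compare two nonlinear flux time traces over their saturated phase.
            # "Agreement" here: the difference of means lies within the combined
            # oscillation level of the two traces.
            ma, mb = np.mean(flux_a), np.mean(flux_b)
            sa, sb = np.std(flux_a), np.std(flux_b)
            diff = abs(ma - mb)
            tol = np.hypot(sa, sb)   # combine the two oscillation amplitudes
            return diff, tol, diff <= tol

        # Invented saturated-phase traces for two hypothetical codes:
        rng = np.random.default_rng(0)
        t = np.linspace(0, 100, 2000)
        code_a = 1.00 + 0.15 * np.sin(0.8 * t) + 0.05 * rng.standard_normal(t.size)
        code_b = 1.08 + 0.15 * np.sin(0.8 * t + 1.0) + 0.05 * rng.standard_normal(t.size)
        print(flux_agreement(code_a, code_b))

    Automating a check of this kind over many discharges, times, and radii is exactly the bookkeeping burden the authors' software is meant to remove.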

  17. Review of the 9th NLTE code comparison workshop

    DOE PAGES

    Piron, Robin; Gilleron, Franck; Aglitskiy, Yefim; ...

    2017-02-24

    Here, we review the 9th NLTE code comparison workshop, which was held in the Jussieu campus, Paris, from November 30th to December 4th, 2015. This time, the workshop was mainly focused on a systematic investigation of iron NLTE steady-state kinetics and emissivity, over a broad range of temperature and density. Through these comparisons, topics such as modeling of the dielectronic processes, density effects or the effect of an external radiation field were addressed. The K-shell spectroscopy of iron plasmas was also addressed, notably through the interpretation of tokamak and laser experimental spectra.

  18. Transport methods and interactions for space radiations

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Townsend, Lawrence W.; Schimmerling, Walter S.; Khandelwal, Govind S.; Khan, Ferdous S.; Nealy, John E.; Cucinotta, Francis A.; Simonsen, Lisa C.; Shinn, Judy L.; Norbury, John W.

    1991-01-01

    A review of the program in space radiation protection at the Langley Research Center is given. The relevant Boltzmann equations are given with a discussion of approximation procedures for space applications. The interaction coefficients are related to solution of the many-body Schroedinger equation with nuclear and electromagnetic forces. Various solution techniques are discussed to obtain relevant interaction cross sections with extensive comparison with experiments. Solution techniques for the Boltzmann equations are discussed in detail. Transport computer code validation is discussed through analytical benchmarking, comparison with other codes, comparison with laboratory experiments and measurements in space. Applications to lunar and Mars missions are discussed.

  19. Review of the 9th NLTE code comparison workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piron, Robin; Gilleron, Franck; Aglitskiy, Yefim

    Here, we review the 9th NLTE code comparison workshop, which was held in the Jussieu campus, Paris, from November 30th to December 4th, 2015. This time, the workshop was mainly focused on a systematic investigation of iron NLTE steady-state kinetics and emissivity, over a broad range of temperature and density. Through these comparisons, topics such as modeling of the dielectronic processes, density effects or the effect of an external radiation field were addressed. The K-shell spectroscopy of iron plasmas was also addressed, notably through the interpretation of tokamak and laser experimental spectra.

  20. Review of the 9th NLTE code comparison workshop

    NASA Astrophysics Data System (ADS)

    Piron, R.; Gilleron, F.; Aglitskiy, Y.; Chung, H.-K.; Fontes, C. J.; Hansen, S. B.; Marchuk, O.; Scott, H. A.; Stambulchik, E.; Ralchenko, Yu.

    2017-06-01

    We review the 9th NLTE code comparison workshop, which was held in the Jussieu campus, Paris, from November 30th to December 4th, 2015. This time, the workshop was mainly focused on a systematic investigation of iron NLTE steady-state kinetics and emissivity, over a broad range of temperature and density. Through these comparisons, topics such as modeling of the dielectronic processes, density effects or the effect of an external radiation field were addressed. The K-shell spectroscopy of iron plasmas was also addressed, notably through the interpretation of tokamak and laser experimental spectra.

  1. A study on scientific collaboration and co-authorship patterns in library and information science studies in Iran between 2005 and 2009

    PubMed Central

    Siamaki, Saba; Geraei, Ehsan; Zare- Farashbandi, Firoozeh

    2014-01-01

Background: Scientific collaboration is among the most important subjects in scientometrics, and many studies have investigated this concept to this day. The goal of the current study is to investigate the scientific collaboration and co-authorship patterns of researchers in the field of library and information science in Iran between the years 2005 and 2009. Materials and Methods: The current study uses the scientometrics method. The statistical population consists of 942 documents published in Iranian library and information science journals between the years 2005 and 2009. Collaboration coefficient, collaboration index (CI), and degree of collaboration (DC) were used for data analysis. Findings: The findings showed that among the 942 investigated documents, 506 documents (53.70%) were created by one individual researcher and 436 documents (46.30%) were the result of collaboration between two or more researchers. Also, the highest rank across the different authorship patterns belonged to the National Journal of Librarianship and Information Organization (code H). Conclusion: The average collaboration coefficient for the library and information science researchers in the investigated time frame was 0.23. The closer this coefficient is to 1, the higher the level of collaboration between authors; a coefficient near zero shows a tendency to prefer individual articles. The highest collaboration index, with an average of 1.92 authors per paper, was seen in year 1388. The five-year collaboration index in library and information science in Iran was 1.58, and the average degree of collaboration between researchers in the investigated papers was 0.46, which shows that library and information science researchers have a tendency for co-authorship. Co-authorship has increased in recent years, reaching its highest level in year 1388. The researchers' collaboration coefficient also shows a relative increase between years 1384 and 1388. The National Journal of Librarianship and Information Organization has the highest rank among all the investigated journals based on collaboration coefficient, collaboration index (CI), and degree of collaboration (DC). PMID:25250365

  2. A study on scientific collaboration and co-authorship patterns in library and information science studies in Iran between 2005 and 2009.

    PubMed

    Siamaki, Saba; Geraei, Ehsan; Zare-Farashbandi, Firoozeh

    2014-01-01

Scientific collaboration is among the most important subjects in scientometrics, and many studies have investigated this concept to this day. The goal of the current study is to investigate the scientific collaboration and co-authorship patterns of researchers in the field of library and information science in Iran between the years 2005 and 2009. The current study uses the scientometrics method. The statistical population consists of 942 documents published in Iranian library and information science journals between the years 2005 and 2009. Collaboration coefficient, collaboration index (CI), and degree of collaboration (DC) were used for data analysis. The findings showed that among the 942 investigated documents, 506 documents (53.70%) were created by one individual researcher and 436 documents (46.30%) were the result of collaboration between two or more researchers. Also, the highest rank across the different authorship patterns belonged to the National Journal of Librarianship and Information Organization (code H). The average collaboration coefficient for the library and information science researchers in the investigated time frame was 0.23. The closer this coefficient is to 1, the higher the level of collaboration between authors; a coefficient near zero shows a tendency to prefer individual articles. The highest collaboration index, with an average of 1.92 authors per paper, was seen in year 1388. The five-year collaboration index in library and information science in Iran was 1.58, and the average degree of collaboration between researchers in the investigated papers was 0.46, which shows that library and information science researchers have a tendency for co-authorship. Co-authorship has increased in recent years, reaching its highest level in year 1388. The researchers' collaboration coefficient also shows a relative increase between years 1384 and 1388. The National Journal of Librarianship and Information Organization has the highest rank among all the investigated journals based on collaboration coefficient, collaboration index (CI), and degree of collaboration (DC).
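
    The three indicators used in this record have standard definitions: the collaboration index (CI) is the mean number of authors per paper, the degree of collaboration (DC) is the share of multi-authored papers, and the collaborative coefficient (CC, in the Ajiferuke et al. sense) is one minus the mean reciprocal of the author counts. A worked toy example with an invented sample of ten papers:

        def collaboration_indicators(author_counts):
            """author_counts: number of authors on each paper in the sample."""
            n = len(author_counts)
            ci = sum(author_counts) / n                    # mean authors per paper
            dc = sum(1 for a in author_counts if a > 1) / n  # share multi-authored
            cc = 1 - sum(1 / a for a in author_counts) / n   # 0 = all solo; -> 1 for big teams
            return ci, dc, cc

        # Invented sample: 6 single-authored papers, 3 with two authors, 1 with three.
        papers = [1] * 6 + [2] * 3 + [3]
        ci, dc, cc = collaboration_indicators(papers)
        print(f"CI={ci:.2f}  DC={dc:.2f}  CC={cc:.2f}")

    The invented sample is tuned so CC comes out near the 0.23 reported above, illustrating how a majority of single-authored papers keeps the coefficient low even when some co-authorship exists.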

  3. Update to the NASA Lewis Ice Accretion Code LEWICE

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    1994-01-01

    This report is intended as an update to NASA CR-185129 'User's Manual for the NASA Lewis Ice Accretion Prediction Code (LEWICE).' It describes modifications and improvements made to this code as well as changes to the input and output files, interactive input, and graphics output. The comparison of this code to experimental data is shown to have improved as a result of these modifications.

  4. Evaluation of COBRA III-C and SABRE-I (wire wrap version) computational results by comparison with steady-state data from a 19-pin internally guard heated sodium cooled bundle with a six-channel central blockage (THORS bundle 3C). [LMFBR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dearing, J F; Rose, S D; Nelson, W R

The predicted computational results of two well-known sub-channel analysis codes, COBRA-III-C and SABRE-I (wire wrap version), have been evaluated by comparison with steady-state temperature data from the THORS Facility at ORNL. Both codes give good predictions of transverse and axial temperatures when compared with wire wrap thermocouple data. The crossflow velocity profiles predicted by these codes are similar, which is encouraging since the wire wrap models are based on different assumptions.

  5. Analysis of thermo-chemical nonequilibrium models for carbon dioxide flows

    NASA Technical Reports Server (NTRS)

    Rock, Stacey G.; Candler, Graham V.; Hornung, Hans G.

    1992-01-01

    The aerothermodynamics of thermochemical nonequilibrium carbon dioxide flows is studied. The chemical kinetics models of McKenzie and Park are implemented in separate three-dimensional computational fluid dynamics codes. The codes incorporate a five-species gas model characterized by a translational-rotational and a vibrational temperature. Solutions are obtained for flow over finite length elliptical and circular cylinders. The computed flowfields are then employed to calculate Mach-Zehnder interferograms for comparison with experimental data. The accuracy of the chemical kinetics models is determined through this comparison. Also, the methodology of the three-dimensional thermochemical nonequilibrium code is verified by the reproduction of the experiments.

  6. Professionals learning together with patients: An exploratory study of a collaborative learning Fellowship programme for healthcare improvement.

    PubMed

    Myron, Rowan; French, Catherine; Sullivan, Paul; Sathyamoorthy, Ganesh; Barlow, James; Pomeroy, Linda

    2018-05-01

    Improving the quality of healthcare involves collaboration between many different stakeholders. Collaborative learning theory suggests that teaching different professional groups alongside each other may enable them to develop skills in how to collaborate effectively, but there is little literature on how this works in practice. Further, though it is recognised that patients play a fundamental role in quality improvement, there are few examples of where they learn together with professionals. To contribute to addressing this gap, we review a collaborative fellowship in Northwest London, designed to build capacity to improve healthcare, which enabled patients and professionals to learn together. Using the lens of collaborative learning, we conducted an exploratory study of six cohorts of the year long programme (71 participants). Data were collected using open text responses from an online survey (n = 31) and semi-structured interviews (n = 34) and analysed using an inductive open coding approach. The collaborative design of the Fellowship, which included bringing multiple perspectives to discussions of real world problems, was valued by participants who reflected on the safe, egalitarian space created by the programme. Participants (healthcare professionals and patients) found this way of learning initially challenging yet ultimately productive. Despite the pedagogical and practical challenges of developing a collaborative programme, this study indicates that opening up previously restricted learning opportunities as widely as possible, to include patients and carers, is an effective mechanism to develop collaborative skills for quality improvement.

  7. National Centers for Environmental Prediction

    Science.gov Websites


  8. Comparison of the LLNL ALE3D and AKTS Thermal Safety Computer Codes for Calculating Times to Explosion in ODTX and STEX Thermal Cookoff Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wemhoff, A P; Burnham, A K

    2006-04-05

Cross-comparison of the results of two computer codes for the same problem provides a mutual validation of their computational methods. This cross-validation exercise was performed for LLNL's ALE3D code and AKTS's Thermal Safety code, using the thermal ignition of HMX in two standard LLNL cookoff experiments: the One-Dimensional Time to Explosion (ODTX) test and the Scaled Thermal Explosion (STEX) test. The chemical kinetics model used in both codes was the extended Prout-Tompkins model, a relatively new addition to ALE3D. This model was applied using ALE3D's new pseudospecies feature. In addition, an advanced isoconversional kinetic approach was used in the AKTS code. The mathematical constants in the Prout-Tompkins code were calibrated using DSC data from hermetically sealed vessels and the LLNL optimization code Kinetics05. The isoconversional kinetic parameters were optimized using the AKTS Thermokinetics code. We found that the Prout-Tompkins model calculations agree fairly well between the two codes, and the isoconversional kinetic model gives very similar results as the Prout-Tompkins model. We also found that an autocatalytic approach in the beta-delta phase transition model does affect the times to explosion for some conditions, especially STEX-like simulations at ramp rates above 100 °C/hr, and further exploration of that effect is warranted.
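
    For orientation on the kinetics being cross-validated: Prout-Tompkins-type models describe autocatalytic decomposition, where the rate grows with the extent of reaction before dying off as reactant is consumed. A generic sketch of one common generalized autocatalytic form with Arrhenius temperature dependence (not the exact extended model or the HMX parameters calibrated in the report; all values are invented):

        import numpy as np

        R = 8.314  # J/(mol K)

        def alpha_dot(alpha, T, A=1e12, E=150e3, m=0.5, n=1.0):
            """Generalized Prout-Tompkins-type rate: k(T) * alpha^m * (1-alpha)^n."""
            k = A * np.exp(-E / (R * T))          # Arrhenius rate constant
            return k * alpha**m * (1 - alpha) ** n

        def react(T, dt=0.1, t_end=3600.0, alpha0=1e-4):
            """Crude explicit integration of reaction extent at constant T."""
            alpha, t = alpha0, 0.0
            while t < t_end and alpha < 0.99:
                alpha += dt * alpha_dot(alpha, T)
                t += dt
            return t, alpha

        for T in (500.0, 520.0, 540.0):           # kelvin, invented hold temperatures
            t, a = react(T)
            print(f"T={T:.0f} K: alpha={a:.2f} after t={t:.0f} s")

    Even this crude integration reproduces the qualitative cookoff behaviour: a modest temperature increase sharply shortens the time for the reaction to run toward completion.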

  9. Analytical ice shape predictions for flight in natural icing conditions

    NASA Technical Reports Server (NTRS)

    Berkowitz, Brian M.; Riley, James T.

    1988-01-01

LEWICE is an analytical ice prediction code that has been evaluated against icing tunnel data, but on a more limited basis against flight data. Ice shapes predicted by LEWICE are compared with experimental ice shapes accreted on the NASA Lewis Icing Research Aircraft. The flight data selected for comparison include liquid water content recorded using a hot-wire device and droplet distribution data from a laser spectrometer; the ice shape is recorded using stereo photography. The main findings are as follows: (1) An equivalent sand-grain roughness correlation different from that used for LEWICE tunnel comparisons must be employed to obtain satisfactory results for flight; (2) Using this correlation and making no other changes in the code, the comparisons to ice shapes accreted in flight are in general as good as the comparisons to ice shapes accreted in the tunnel (as in the case of tunnel ice shapes, agreement is least reliable for large glaze ice shapes at high angles of attack); (3) In some cases comparisons can be somewhat improved by utilizing the code so as to take account of the variation of parameters such as liquid water content, which may vary significantly in flight.

  10. Toward a first-principles integrated simulation of tokamak edge plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, C S; Klasky, Scott A; Cummings, Julian

    2008-01-01

Performance of ITER is anticipated to be highly sensitive to the edge plasma condition. The edge pedestal in ITER needs to be predicted from an integrated simulation of the necessary first-principles, multi-scale physics codes. The mission of the SciDAC Fusion Simulation Project (FSP) Prototype Center for Plasma Edge Simulation (CPES) is to deliver such a code integration framework by (1) building new kinetic codes XGC0 and XGC1, which can simulate the edge pedestal buildup; (2) using and improving the existing MHD codes ELITE, M3D-OMP, M3D-MPP and NIMROD, for study of large-scale edge instabilities called Edge Localized Modes (ELMs); and (3) integrating the codes into a framework using cutting-edge computer science technology. Collaborative effort among physics, computer science, and applied mathematics within CPES has created the first working version of the End-to-end Framework for Fusion Integrated Simulation (EFFIS), which can be used to study the pedestal-ELM cycles.

  11. The importance of documenting code, and how you might make yourself do it

    NASA Astrophysics Data System (ADS)

    Tollerud, Erik Jon; Astropy Project

    2016-01-01

    Your science code is awesome. It reduces data, performs some statistical analysis, or models a physical process better than anyone has done before. You wisely decide that it is worth sharing with your student/advisor, research collaboration, or the whole world. But when you send it out, no one seems willing to use it. Why? Most of the time, it's your documentation. You wrote the code for yourself, so you know what every function, procedure, or class is supposed to do. Unfortunately, your users (sometimes including you 6 months later) do not. In this talk, I will describe some of the tools, both technical and psychological, to make that documentation happen (particularly for the Python ecosystem).

  12. The Efficacy of Three Learning Methods Collaborative, Context-Based Learning and Traditional, on Learning, Attitude and Behaviour of Undergraduate Nursing Students: Integrating Theory and Practice

    PubMed Central

    Hasanpour-Dehkordi, Ali

    2016-01-01

Introduction Communication skills training, responsibility, respect, and self-awareness are important indexes of changing learning behaviours in modern approaches. Aim The aim of this study was to investigate the efficacy of three learning approaches, collaborative, context-based learning (CBL), and traditional, on the learning, attitude, and behaviour of undergraduate nursing students. Materials and Methods This study was a clinical trial with a pretest/post-test control-group design. The participants were senior nursing students, randomly assigned to three groups: CBL, collaborative, and traditional. To gather data, a standard questionnaire of students' behaviour and attitude was administered prior to and after the intervention. The rate of learning was also investigated with a researcher-developed questionnaire prior to and after the intervention in the three groups. Results In the CBL and collaborative training groups, the mean scores of behaviour and attitude increased after the intervention, but no significant association was obtained between the mean scores of behaviour and attitude prior to and after the intervention in the traditional group. However, the mean learning score increased significantly in the CBL, collaborative, and traditional groups after the study in comparison to before the study. Conclusion Both CBL and collaborative approaches were useful in terms of increased respect, self-awareness, self-evaluation, communication skills and responsibility, as well as increased motivation and learning scores, in comparison to the traditional method. PMID:27190926

  13. A Comparison of Preferred Treatment Outcomes between Children with ADHD and Their Parents

    ERIC Educational Resources Information Center

    Traywick, Tracey B.; Lamson, Angela L.; Diamond, John M.; Carawan, Sandra

    2006-01-01

    Objective: The newest guidelines for the treatment of ADHD call for the formation of an individualized treatment plan based on collaboration. Because the process of collaboration requires the communication of desired outcomes, the authors' goal is to examine the preferred outcomes of treatment for ADHD for children and parents. Method: A preferred…

  14. Human-Centred Design Workshops in Collaborative Strategic Design Projects: An Educational and Professional Comparison

    ERIC Educational Resources Information Center

    Liem, Andre; Sanders, Elizabeth B.-N.

    2013-01-01

    It has been found that the implementation of Human-centred Design (HCD) methods in the Fuzzy Front-End is not likely to lead to diversification in educational product planning exercises, where time lines are short and executors lack experience. Companies, interested to collaborate with Master-level Industrial Design students on strategic design…

  15. Positive and Null Effects of Interprofessional Education on Attitudes toward Interprofessional Learning and Collaboration

    ERIC Educational Resources Information Center

    Kenaszchuk, Chris; Rykhoff, Margot; Collins, Laura; McPhail, Stacey; van Soeren, Mary

    2012-01-01

    Interprofessional education (IPE) for health and social care students may improve attitudes toward IPE and interprofessional collaboration (IPC). The quality of research on the association between IPE and attitudes is mediocre and IPE effect sizes are unknown. Students at a college in Toronto, Canada, attended an IPE workshop. A comparison group…

  16. Barriers, Support, and Collaboration: A Comparison of Science and Agriculture Teachers' Perceptions regarding Integration of Science into the Agricultural Education Curriculum

    ERIC Educational Resources Information Center

    Warnick, Brian K.; Thompson, Gregory W.

    2007-01-01

    This study is part of a larger investigation which focused on determining and comparing the perceptions of agriculture teachers and science teachers on integrating science into agricultural education programs. Science and agriculture teachers' perceptions of barriers to integrating science, the support of stakeholders, and collaboration between…

  17. Validation of the SINDA/FLUINT code using several analytical solutions

    NASA Technical Reports Server (NTRS)

    Keller, John R.

    1995-01-01

    The Systems Improved Numerical Differencing Analyzer and Fluid Integrator (SINDA/FLUINT) code has often been used to determine the transient and steady-state response of various thermal and fluid flow networks. While this code is an often used design and analysis tool, the validation of this program has been limited to a few simple studies. For the current study, the SINDA/FLUINT code was compared to four different analytical solutions. The thermal analyzer portion of the code (conduction and radiative heat transfer, SINDA portion) was first compared to two separate solutions. The first comparison examined a semi-infinite slab with a periodic surface temperature boundary condition. Next, a small, uniform temperature object (lumped capacitance) was allowed to radiate to a fixed temperature sink. The fluid portion of the code (FLUINT) was also compared to two different analytical solutions. The first study examined a tank filling process by an ideal gas in which there is both control volume work and heat transfer. The final comparison considered the flow in a pipe joining two infinite reservoirs of pressure. The results of all these studies showed that for the situations examined here, the SINDA/FLUINT code was able to match the results of the analytical solutions.
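
    The first SINDA comparison case has a classical closed form: for a semi-infinite slab whose surface temperature oscillates sinusoidally, the disturbance decays exponentially and lags linearly with depth. A sketch of that analytical reference (the thermal diffusivity, amplitude, and period are invented, not the values used in the validation study):

        import numpy as np

        def slab_temperature(x, t, T_mean=300.0, A=25.0, omega=7.27e-5, alpha=1e-6):
            """Semi-infinite slab with periodic surface temperature (classical solution).

            x [m], t [s]; omega ~ 2*pi/day here; alpha = thermal diffusivity [m^2/s].
            T(x,t) = T_mean + A * exp(-beta*x) * sin(omega*t - beta*x),
            with beta = sqrt(omega / (2*alpha)).
            """
            beta = np.sqrt(omega / (2.0 * alpha))
            return T_mean + A * np.exp(-beta * x) * np.sin(omega * t - beta * x)

        # Amplitude decay at a few depths, sampled over one period:
        t = np.linspace(0, 86400, 97)
        for x in (0.0, 0.05, 0.10):
            Tx = slab_temperature(x, t)
            print(f"x={x:.2f} m: amplitude ~ {0.5 * (Tx.max() - Tx.min()):.2f} K")

    A validation run of the kind described would compare the SINDA nodal temperatures at matching depths and times against this expression.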

  18. 10Gbps 2D MGC OCDMA Code over FSO Communication System

    NASA Astrophysics Data System (ADS)

Bhanja, Urmila; Khuntia, Arpita; Alamasety, Swati

    2017-08-01

Currently, wide-bandwidth signal dissemination along with low latency is a leading requisite in various applications. Free-space optical wireless communication has been introduced as a realistic technology for bridging the gap in present high-data-rate fiber connectivity and as a provisional backbone for rapidly deployable wireless communication infrastructure. The manuscript focuses on the implementation of 10 Gbps SAC-OCDMA FSO communication using a modified two-dimensional Golomb code (2D MGC) that possesses better autocorrelation, minimum cross-correlation and high cardinality. A comparison based on the pseudo-orthogonal (PSO) matrix code and the modified two-dimensional Golomb code (2D MGC) is developed in the proposed SAC-OCDMA FSO communication module, taking different parameters into account. The simulation outcomes signify that the communication radius is bounded by multiple-access interference (MAI). In this work, a comparison is made in terms of bit error rate (BER) and quality factor (Q) for the modified two-dimensional Golomb code (2D MGC) and the PSO matrix code. It is observed that the 2D MGC yields better results compared to the PSO matrix code. The simulation results are validated using OptiSystem version 14.
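
    The two figures of merit compared here are linked, under the usual Gaussian-noise approximation, by BER ≈ ½·erfc(Q/√2), so reporting both is largely a consistency check. A sketch with invented Q values (not those from the paper's simulations):

        import math

        def ber_from_q(q):
            """Gaussian-noise approximation linking Q factor to bit error rate."""
            return 0.5 * math.erfc(q / math.sqrt(2.0))

        for q in (4.0, 6.0, 7.0):   # illustrative Q factors
            print(f"Q={q}: BER ~ {ber_from_q(q):.2e}")

    Q = 6 corresponds to the often-quoted BER of about 1e-9, which is why Q is a convenient proxy when the BER itself is too small to estimate by counting errors.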

  19. Support for Debugging Automatically Parallelized Programs

    NASA Technical Reports Server (NTRS)

    Hood, Robert; Jost, Gabriele

    2001-01-01

    This viewgraph presentation provides information on support sources available for the automatic parallelization of computer program. CAPTools, a support tool developed at the University of Greenwich, transforms, with user guidance, existing sequential Fortran code into parallel message passing code. Comparison routines are then run for debugging purposes, in essence, ensuring that the code transformation was accurate.

  20. Comparison of LEWICE 1.6 and LEWICE/NS with IRT experimental data from modern air foil tests

    DOT National Transportation Integrated Search

    1998-01-01

    A research project is underway at NASA Lewis to produce a computer code which can accurately predict ice growth under any meteorological conditions for any aircraft surface. The most recent release of this code is LEWICE 1.6. This code is modular in ...

  1. A Comparison of Fatigue Design Methods

    DTIC Science & Technology

    2001-04-05

…Boiler and Pressure Vessel Code does not… American Society of Mechanical Engineers, "ASME Boiler and Pressure Vessel Code," ASME, 3 Park Ave., New York, NY 10016-5990. [4] Langer, B. F., "Design of Pressure Vessels Involving…" …The ASME Boiler and Pressure Vessel Code [3] presents these methods and has expanded the procedures to other pressure vessels besides nuclear pressure vessels.

  2. Edge-relevant plasma simulations with the continuum code COGENT

    NASA Astrophysics Data System (ADS)

    Dorf, M.; Dorr, M.; Ghosh, D.; Hittinger, J.; Rognlien, T.; Cohen, R.; Lee, W.; Schwartz, P.

    2016-10-01

    We describe recent advances in cross-separatrix and other edge-relevant plasma simulations with COGENT, a continuum gyro-kinetic code being developed by the Edge Simulation Laboratory (ESL) collaboration. The distinguishing feature of the COGENT code is its high-order finite-volume discretization methods, which employ arbitrary mapped multiblock grid technology (nearly field-aligned on blocks) to handle the complexity of tokamak divertor geometry with high accuracy. This paper discusses the 4D (axisymmetric) electrostatic version of the code, and the presented topics include: (a) initial simulations with kinetic electrons and development of reduced fluid models; (b) development and application of implicit-explicit (IMEX) time integration schemes; and (c) conservative modeling of drift-waves and the universal instability. Work performed for USDOE, at LLNL under contract DE-AC52-07NA27344 and at LBNL under contract DE-AC02-05CH11231.
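
    On item (b): implicit-explicit (IMEX) integrators advance the stiff part of the right-hand side implicitly and the rest explicitly within the same step. A first-order toy sketch on an invented split linear/nonlinear ODE (far simpler than the schemes developed for COGENT, and with made-up coefficients):

        # First-order IMEX (forward-backward Euler) for dy/dt = -lam*y + f(y):
        # the stiff linear term -lam*y is handled implicitly, the mild
        # nonlinearity f(y) explicitly. Toy problem; lam and f are invented.
        lam = 1.0e4                           # stiff rate
        f = lambda y: 0.5 * y * (1.0 - y)     # mild nonlinear source

        def imex_euler(y0, dt, nsteps):
            y = y0
            for _ in range(nsteps):
                # (y_new - y)/dt = -lam*y_new + f(y)  =>  solve for y_new
                y = (y + dt * f(y)) / (1.0 + dt * lam)
            return y

        # Stable even with dt >> 1/lam, where fully explicit Euler would blow up.
        print(imex_euler(y0=1.0, dt=0.1, nsteps=100))

    With dt = 0.1 and lam = 1e4, an explicit Euler step would be wildly unstable; the IMEX step stays stable because only the stiff term is treated implicitly, avoiding a fully nonlinear implicit solve.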

  3. ASME Code Efforts Supporting HTGRs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.K. Morton

    2010-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  4. ASME Code Efforts Supporting HTGRs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.K. Morton

    2011-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  5. ASME Code Efforts Supporting HTGRs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D.K. Morton

    2012-09-01

In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  6. State Code Guides | Efficient Windows Collaborative

    Science.gov Websites

Index of state-by-state window selection code guides from the Efficient Windows Collaborative, keyed to energy code year (2012, 2015, 2018) for Colorado, Connecticut, Delaware, District of Columbia, Louisiana, Maine, Maryland, Massachusetts, and other states.

  7. Judging the Quality of Peer-Led Student Dialogues.

    ERIC Educational Resources Information Center

    Keefer, Matthew W.; Zeitz, Colleen M.; Resnick, Lauren B.

    2000-01-01

    Compared the rational quality of fourth-graders' discussion of literary texts with an ideal model and over the course of the academic year. Analyzed the collaborative reasoning capabilities of 6 three-student groups using a graphical coding system with an analysis of the literary content of the students' argumentation. Identified important…

  8. National Centers for Environmental Prediction

    Science.gov Websites


  9. National Centers for Environmental Prediction

    Science.gov Websites


  10. 77 FR 54917 - Findings of Research Misconduct

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-06

    ... values for inter-observer reliabilities when coding was done by only one observer, in both cases leading... Research Integrity (ORI) has taken final action in the following case: Marc Hauser, Ph.D., Harvard... collaborators that he miscoded some of the trials and that the study failed to provide support for the initial...

  11. Registrar Staging Assistant (SEER*RSA) - SEER

    Cancer.gov

    Use this site for cases diagnosed 2018 and forward to code Extent of Disease 2018, Summary Stage 2018, Site-Specific Data Items, and Grade. Use it for 2016 and 2017 cases to determine UICC TNM 7th edition stage, Collaborative Stage v.02.05.50, and Site-Specific predictive and prognostic factors.

  12. Discovering and Mitigating Software Vulnerabilities through Large-Scale Collaboration

    ERIC Educational Resources Information Center

    Zhao, Mingyi

    2016-01-01

    In today's rapidly digitizing society, people place their trust in a wide range of digital services and systems that deliver latest news, process financial transactions, store sensitive information, etc. However, this trust does not have a solid foundation, because software code that supports this digital world has security vulnerabilities. These…

  13. The Legacy of Apollo: Assessed and Appreciated.

    ERIC Educational Resources Information Center

    Griffin, Richard A.; Griffin, Ann D.

    1997-01-01

    The real-life drama 25 years ago when Apollo 13 was rescued through a collaborative team of colleagues provides a model for changes in many public schools. In Texas, the state code specifies that site-based decision making address planning, budgeting, curriculum staffing patterns, staff development, and school organization. (MLF)

  14. Managing perceived conflicts of interest while ensuring the continued innovation of medical technology.

    PubMed

    Van Haute, Andrew

    2011-09-01

    If it were not for the ongoing collaboration between vascular surgeons and the medical technology industry, many of the advanced treatments used every day in vascular interventional surgery would not exist. The flip side of this coin is that these vital relationships create multiple roles for surgeons and must be appropriately managed. The dynamic process of innovation, along with factors such as product delivery technique refinement, education, testing and clinical trials, and product support, all make ongoing and close collaboration between surgeons and the device industry necessary. This unique relationship sometimes leads to the perception of conflicts of interest for physicians, in part because the competing pressures from the multiple, overlapping roles of clinician/caregiver/investigator/innovator/customer are significant. To address this issue, the Advanced Medical Technology Association (AdvaMed), the nation's largest medical technology association representing medical device and diagnostics companies, developed a Code of Ethics to guide medical technology companies in their interactions with health care professionals. First introduced in 1993, the AdvaMed Code strongly encourages both industry and physicians to commit to openness and high ethical standards in the conduct of their business interactions. The AdvaMed Code addresses many of the types of interactions that can occur between companies and health care professionals, including training, consulting agreements, the provision of demonstration and evaluation units, and charitable donations. By following the Code, companies send a strong message that treatment decisions must always be based on the best interest of the patient.

  15. Facilitators, challenges, and collaborative activities in faith and health partnerships to address health disparities.

    PubMed

    Kegler, Michelle C; Hall, Sarah M; Kiser, Mimi

    2010-10-01

    Interest in partnering with faith-based organizations (FBOs) to address health disparities has grown in recent years. Yet relatively little is known about these types of partnerships. As part of an evaluation of the Institute for Faith and Public Health Collaborations, representatives of 34 faith-health teams (n = 61) completed semi-structured interviews. Interviews were tape recorded, transcribed, and coded by two members of the evaluation team to identify themes. Major facilitators of faith-health collaborative work were passion and commitment, importance of FBOs in communities, favorable political climate, support from community and faith leaders, diversity of teams, and mutual trust and respect. Barriers unique to faith and health collaboration included discomfort with FBOs, distrust of either health agencies or FBOs, diversity within faith communities, different agendas, separation of church and state, and the lack of a common language. Findings suggest that faith-health partnerships face unique challenges but are capable of aligning resources to address health disparities.

  16. Collaborative Supercomputing for Global Change Science

    NASA Astrophysics Data System (ADS)

    Nemani, R.; Votava, P.; Michaelis, A.; Melton, F.; Milesi, C.

    2011-03-01

    There is increasing pressure on the science community not only to understand how recent and projected changes in climate will affect Earth's global environment and the natural resources on which society depends but also to design solutions to mitigate or cope with the likely impacts. Responding to this multidimensional challenge requires new tools and research frameworks that assist scientists in collaborating to rapidly investigate complex interdisciplinary science questions of critical societal importance. One such collaborative research framework, within the NASA Earth sciences program, is the NASA Earth Exchange (NEX). NEX combines state-of-the-art supercomputing, Earth system modeling, remote sensing data from NASA and other agencies, and a scientific social networking platform to deliver a complete work environment. In this platform, users can explore and analyze large Earth science data sets, run modeling codes, collaborate on new or existing projects, and share results within or among communities (see Figure S1 in the online supplement to this Eos issue (http://www.agu.org/eos_elec)).

  17. Monitoring Cosmic Radiation Risk: Comparisons between Observations and Predictive Codes for Naval Aviation

    DTIC Science & Technology

    2009-01-01

    proton; PARMA: PHITS-based Analytical Radiation Model in the Atmosphere; PCAIRE: Predictive Code for Aircrew Radiation Exposure; PHITS: Particle and...radiation transport code utilized is called PARMA (PHITS-based Analytical Radiation Model in the Atmosphere) [36]. The particle fluxes calculated from the...same dose equivalent coefficient regulations from the ICRP-60 regulations. As a result, the transport codes utilized by EXPACS (PHITS) and CARI-6

  18. Monitoring Cosmic Radiation Risk: Comparisons Between Observations and Predictive Codes for Naval Aviation

    DTIC Science & Technology

    2009-07-05

    proton; PARMA: PHITS-based Analytical Radiation Model in the Atmosphere; PCAIRE: Predictive Code for Aircrew Radiation Exposure; PHITS: Particle and Heavy...transport code utilized is called PARMA (PHITS-based Analytical Radiation Model in the Atmosphere) [36]. The particle fluxes calculated from the input...dose equivalent coefficient regulations from the ICRP-60 regulations. As a result, the transport codes utilized by EXPACS (PHITS) and CARI-6 (PARMA

  19. Comparison of deterministic and stochastic approaches for isotopic concentration and decay heat uncertainty quantification on elementary fission pulse

    NASA Astrophysics Data System (ADS)

    Lahaye, S.; Huynh, T. D.; Tsilanizara, A.

    2016-03-01

    Uncertainty quantification of outputs of interest in the nuclear fuel cycle is an important issue for nuclear safety, from nuclear facilities to long-term waste disposal. Most of those outputs are functions of the isotopic vector, which is estimated by fuel cycle codes such as DARWIN/PEPIN2, MENDEL, ORIGEN or FISPACT. The CEA code systems DARWIN/PEPIN2 and MENDEL propagate the uncertainty from nuclear data inputs to isotopic concentrations and decay heat by two different methods. This paper shows comparisons between those two codes on a Uranium-235 thermal fission pulse. The effect of the choice of nuclear data evaluation (ENDF/B-VII.1, JEFF-3.1.1 and JENDL-2011) is also inspected in this paper. All results show good agreement between the two codes and methods, ensuring the reliability of both approaches for a given evaluation.
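
    The two propagation strategies compared above can be illustrated on a toy problem. Below is a minimal sketch, assuming a hypothetical two-nuclide decay chain with invented decay constants, Q-values, and a 5% uncorrelated nuclear-data uncertainty; it contrasts the stochastic route (sampling the data and observing the spread of the output) with the deterministic route (first-order "sandwich" propagation through a sensitivity vector). It is far below the fidelity of DARWIN/PEPIN2 or MENDEL and only shows the shape of the two methods.

        # Minimal sketch: stochastic vs. deterministic propagation of nuclear-data
        # uncertainty to the decay heat of a fission pulse. All nuclide data below
        # are invented placeholders for a toy chain A -> B -> stable.
        import numpy as np

        rng = np.random.default_rng(0)

        lam = np.array([1e-2, 1e-3])       # decay constants, 1/s (toy values)
        q = np.array([1.5, 0.8])           # decay energies, MeV (toy values)
        n0 = np.array([1.0, 0.0])          # initial densities from the pulse
        sigma_lam = 0.05 * lam             # assumed 5% (1-sigma) uncertainty

        def decay_heat(lam, t):
            """Analytic Bateman solution for the 2-chain; returns sum Q*lambda*N(t)."""
            na = n0[0] * np.exp(-lam[0] * t)
            nb = n0[0] * lam[0] / (lam[1] - lam[0]) * (np.exp(-lam[0] * t) - np.exp(-lam[1] * t))
            return q[0] * lam[0] * na + q[1] * lam[1] * nb

        t = 100.0                          # cooling time after the pulse, s

        # Stochastic method: sample the decay constants, observe the output spread.
        samples = rng.normal(lam, sigma_lam, size=(10000, 2))
        heats = np.array([decay_heat(s, t) for s in samples])
        print(f"stochastic:    {heats.mean():.4e} +/- {heats.std():.4e} MeV/s")

        # Deterministic method: first-order sandwich rule, var = g^T C g.
        eps = 1e-6
        grad = np.array([(decay_heat(lam + eps * np.eye(2)[i], t) - decay_heat(lam, t)) / eps
                         for i in range(2)])
        var = grad @ np.diag(sigma_lam ** 2) @ grad
        print(f"deterministic: {decay_heat(lam, t):.4e} +/- {np.sqrt(var):.4e} MeV/s")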

  20. A comparison of cosmological hydrodynamic codes

    NASA Technical Reports Server (NTRS)

    Kang, Hyesung; Ostriker, Jeremiah P.; Cen, Renyue; Ryu, Dongsu; Hernquist, Lars; Evrard, August E.; Bryan, Greg L.; Norman, Michael L.

    1994-01-01

    We present a detailed comparison of the simulation results of various hydrodynamic codes. Starting with identical initial conditions based on the cold dark matter scenario for the growth of structure, with parameters h = 0.5, Omega = Omega_b = 1, and sigma_8 = 1, we integrate from redshift z = 20 to z = 0 to determine the physical state within a representative volume of size L^3 where L = 64 h^-1 Mpc. Five independent codes are compared: three of them Eulerian mesh-based and two variants of the smoothed particle hydrodynamics (SPH) Lagrangian approach. The Eulerian codes were run at N^3 = 32^3, 64^3, 128^3, and 256^3 cells, the SPH codes at N^3 = 32^3 and 64^3 particles. Results were then rebinned to a 16^3 grid with the expectation that the rebinned data should converge, by all techniques, to a common and correct result as N approaches infinity. We find that global averages of various physical quantities do, as expected, tend to converge in the rebinned model, but that uncertainties in even primitive quantities such as <T> and <rho^2>^(1/2) persist at the 3%-17% level. The codes achieve comparable and satisfactory accuracy for comparable computer time in their treatment of the high-density, high-temperature regions as measured in the rebinned data; the variance among the five codes (at highest resolution) for the mean temperature (as weighted by rho^2) is only 4.5%. Examined at high resolution, we suspect that the density resolution is better in the SPH codes and the thermal accuracy in low-density regions better in the Eulerian codes. In the low-density, low-temperature regions the SPH codes have poor accuracy due to statistical effects, and the Jameson code gives temperatures which are too high, due to overuse of artificial viscosity in these high Mach number regions. Overall the comparison allows us to better estimate errors; it points to ways of improving this current generation of hydrodynamic codes and of suiting their use to problems which exploit their best individual features.

  1. Taiwan industrial cooperation program technology transfer for low-level radioactive waste final disposal - phase I.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knowlton, Robert G.; Cochran, John Russell; Arnold, Bill Walter

    2007-01-01

    Sandia National Laboratories and the Institute of Nuclear Energy Research, Taiwan have collaborated in a technology transfer program related to low-level radioactive waste (LLW) disposal in Taiwan. Phase I of this program included regulatory analysis of LLW final disposal, development of LLW disposal performance assessment capabilities, and preliminary performance assessments of two potential disposal sites. Performance objectives were based on regulations in Taiwan and comparisons to those in the United States. Probabilistic performance assessment models were constructed based on limited site data using software including GoldSim, BLT-MS, FEHM, and HELP. These software codes provided the probabilistic framework, container degradation, waste-form leaching, groundwater flow, radionuclide transport, and cover infiltration simulation capabilities in the performance assessment. Preliminary performance assessment analyses were conducted for a near-surface disposal system and a mined cavern disposal system at two representative sites in Taiwan. Results of example calculations indicate peak simulated concentrations to a receptor within a few hundred years of LLW disposal, primarily from highly soluble, non-sorbing radionuclides.

  2. Neighboring block based disparity vector derivation for multiview compatible 3D-AVC

    NASA Astrophysics Data System (ADS)

    Kang, Jewon; Chen, Ying; Zhang, Li; Zhao, Xin; Karczewicz, Marta

    2013-09-01

    3D-AVC, being developed under the Joint Collaborative Team on 3D Video Coding (JCT-3V), significantly outperforms Multiview Video Coding plus Depth (MVC+D), which simultaneously encodes texture views and depth views with the multiview extension of H.264/AVC (MVC). However, when 3D-AVC is configured to support multiview compatibility, in which texture views are decoded without depth information, the coding performance degrades significantly. The reason is that the advanced coding tools incorporated into 3D-AVC do not perform well without a disparity vector converted from the depth information. In this paper, we propose a disparity vector derivation method utilizing only the information of texture views. Motion information of neighboring blocks is used to determine a disparity vector for a macroblock, so that the derived disparity vector can be used efficiently by the coding tools in 3D-AVC. The proposed method significantly improves the coding gain of 3D-AVC in the multiview compatible mode, with about 20% BD-rate savings in the coded views and 26% BD-rate savings in the synthesized views on average.
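
    The derivation step itself is simple enough to sketch. The following is a minimal illustration of the neighboring-block idea, not the actual 3D-AVC decoding process: the data structures, checking order, and default value are invented, but it shows how spatial and temporal neighbors are scanned for an inter-view (disparity) motion vector and how the first one found is reused as the macroblock's disparity vector.

        # Minimal sketch of neighboring-block disparity vector derivation.
        # Block layout and fields are invented for illustration only.
        from dataclasses import dataclass
        from typing import Optional, Sequence, Tuple

        @dataclass
        class Block:
            mv: Tuple[int, int]        # motion vector (x, y), quarter-pel units
            ref_is_interview: bool     # True if the reference is another view

        def derive_disparity_vector(spatial: Sequence[Optional[Block]],
                                    temporal: Sequence[Optional[Block]],
                                    default: Tuple[int, int] = (0, 0)) -> Tuple[int, int]:
            """Return the first inter-view MV among the neighbors, else a default."""
            for blk in list(spatial) + list(temporal):   # fixed checking order
                if blk is not None and blk.ref_is_interview:
                    return blk.mv                        # reuse as disparity vector
            return default

        # Left neighbor is a temporal prediction; the block above points to
        # another view, so its motion vector doubles as a disparity vector.
        left = Block(mv=(4, 0), ref_is_interview=False)
        above = Block(mv=(-28, 2), ref_is_interview=True)
        print(derive_disparity_vector([left, above], []))   # -> (-28, 2)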

  3. Revisiting Fenton Hill Phase I reservoir creation and stimulation mechanisms through the GTO code comparison effort

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Pengcheng; Mcclure, Mark; Shiozawa, Sogo

    A series of experiments performed at the Fenton Hill hot dry rock site after stage 2 drilling of the Phase I reservoir provided intriguing field observations on the reservoir's responses to injection and venting under various conditions. Two teams participating in the US DOE Geothermal Technologies Office (GTO)'s Code Comparison Study (CCS) used different numerical codes to model these five experiments with the objective of inferring the hydraulic stimulation mechanism involved. The codes used by the two teams are based on different numerical principles, and the assumptions made were also different, due to intrinsic limitations in the codes and the modelers' personal interpretations of the field observations. Both sets of models were able to reproduce the most important field observations, and both found that it was the combination of the vertical gradient of the fracture opening pressure, the injection volume, and the use/absence of proppant that yielded the different outcomes of the five experiments.

  4. The SCEC Community Modeling Environment (SCEC/CME) - An Overview of its Architecture and Current Capabilities

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Jordan, T. H.; Minster, B.; Moore, R.; Kesselman, C.; SCEC ITR Collaboration

    2004-12-01

    The Southern California Earthquake Center (SCEC), in collaboration with the San Diego Supercomputer Center, the USC Information Sciences Institute, the Incorporated Research Institutions for Seismology, and the U.S. Geological Survey, is developing the Southern California Earthquake Center Community Modeling Environment (CME) under a five-year grant from the National Science Foundation's Information Technology Research (ITR) Program jointly funded by the Geosciences and Computer and Information Science & Engineering Directorates. The CME system is an integrated geophysical simulation modeling framework that automates the process of selecting, configuring, and executing models of earthquake systems. During the Project's first three years, we have performed fundamental geophysical and information technology research and have also developed substantial system capabilities, software tools, and data collections that can help scientists perform systems-level earthquake science. The CME system provides collaborative tools to facilitate distributed research and development. These collaborative tools are primarily communication tools, providing researchers with access to information in ways that are convenient and useful. The CME system provides collaborators with access to significant computing and storage resources. The computing resources of the Project include in-house servers, Project allocations on the USC High Performance Computing Linux Cluster, as well as allocations on NPACI supercomputers and the TeraGrid. The CME system provides access to SCEC community geophysical models such as the Community Velocity Model, Community Fault Model, Community Crustal Motion Model, and the Community Block Model. The organizations that develop these models often provide access to them, so it is not necessary to use the CME system to access these models. However, in some cases, the CME system supplements the SCEC community models with utility codes that make it easier to use or access these models. In some cases, the CME system also provides alternatives to the SCEC community models. The CME system hosts a collection of community geophysical software codes. These codes include seismic hazard analysis (SHA) programs developed by the SCEC/USGS OpenSHA group. Also, the CME system hosts anelastic wave propagation codes, including Kim Olsen's finite difference code and Carnegie Mellon's Hercules finite element tool chain. The CME system can execute a workflow, that is, a series of geophysical computations using the output of one processing step as the input to a subsequent step. Our workflow capability utilizes grid-based computing software that can submit calculations to a pool of computing resources, as well as data management tools that help us maintain an association between data files and metadata descriptions of those files. The CME system maintains, and provides access to, a collection of valuable geophysical data sets. The current CME Digital Library holdings include a collection of 60 ground motion simulation results calculated by a SCEC/PEER working group and a collection of Green's functions calculated for 33 TriNet broadband receiver sites in the Los Angeles area.
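
    The workflow capability reduces, at its core, to feeding each step's output into the next step. A minimal sketch of that pattern follows; the step names and numbers are invented placeholders, and the real CME system chains full geophysical codes with grid-based workflow and data management tools rather than a toy in-process runner.

        # Minimal sketch of a linear workflow: each step consumes the output of
        # the previous step. All functions and values are illustrative stand-ins.
        def rupture_model(_):
            return {"slip": 1.5}                    # stand-in rupture description

        def wave_propagation(rupture):
            return {"pgv": 0.4 * rupture["slip"]}   # stand-in ground motion

        def hazard_curve(waves):
            return {"exceedance_prob": min(1.0, waves["pgv"] / 2.0)}

        workflow = [rupture_model, wave_propagation, hazard_curve]
        result = None
        for step in workflow:
            result = step(result)                   # one step's output feeds the next
        print(result)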

  5. Real-time human collaboration monitoring and intervention

    DOEpatents

    Merkle, Peter B.; Johnson, Curtis M.; Jones, Wendell B.; Yonas, Gerold; Doser, Adele B.; Warner, David J.

    2010-07-13

    A method of and apparatus for monitoring and intervening in, in real time, a collaboration between a plurality of subjects, comprising: measuring indicia of physiological and cognitive states of each of the plurality of subjects; communicating the indicia to a monitoring computer system; with the monitoring computer system, comparing the indicia with one or more models of previous collaborative performance of one or more of the plurality of subjects; and, with the monitoring computer system, employing the results of the comparison to communicate commands or suggestions to one or more of the plurality of subjects.

  6. Collaborative testing as a learning strategy in nursing education.

    PubMed

    Sandahl, Sheryl S

    2010-01-01

    A primary goal of nursing education is to prepare nurses to work collaboratively as members of interprofessional health care teams on behalf of patients. Collaborative testing is a collaborative learning strategy used to foster knowledge development, critical thinking in decision making, and group processing skills. This study incorporated a quasi-experimental design with a comparison group to examine the effect of collaborative testing as a learning strategy on student learning and retention of course content as well as group process skills and student perceptions of their learning and anxiety. The setting was a baccalaureate nursing program; the sample consisted of two groups of senior students enrolled in Medical-Surgical Nursing II. Student learning, as measured by unit examination scores, was greater for students taking examinations collaboratively compared to individually. Retention of course content, as measured by final examination scores, was not greater for students taking examinations collaboratively compared to individually. Student perceptions were overwhelmingly positive, with students reporting increased learning as a result of the collaborative testing experiences. Despite the lack of data to support increased retention, collaborative testing may be a learning strategy worth implementing in nursing education. Students reported more positive interactions and collaboration with their peers, skills required by the professional nurse.

  7. Supporting Emerging Disciplines with e-Communities: Needs and Benefits

    PubMed Central

    Butler, Brian S; Schleyer, Titus K; Weiss, Patricia M; Wang, Xiaoqing; Thyvalikakath, Thankam P; Hatala, Courtney L; Naderi, Reza A

    2008-01-01

    Background Science has developed from a solitary pursuit into a team-based collaborative activity and, more recently, into a multidisciplinary research enterprise. The increasingly collaborative character of science, mandated by complex research questions and problems that require many competencies, requires that researchers lower the barriers to the creation of collaborative networks of experts, such as communities of practice (CoPs). Objectives The aim was to assess the information needs of prospective members of a CoP in an emerging field, dental informatics, and to evaluate their expectations of an e-community in order to design a suitable electronic infrastructure. Methods A Web-based survey instrument was designed and administered to 2768 members of the target audience. Benefit expectations were analyzed for their relationship to (1) the respondents’ willingness to participate in the CoP and (2) their involvement in funded research. Two raters coded the respondents’ answers regarding expected benefits using a 14-category coding scheme (Kappa = 0.834). Results The 256 respondents (11.1% response rate) preferred electronic resources over traditional print material to satisfy their information needs. The most frequently expected benefits from participation in the CoP were general information (85% of respondents), peer networking (31.1%), and identification of potential collaborators and/or research opportunities (23.2%). Conclusions The competitive social-information environment in which CoPs are embedded presents both threats to sustainability and opportunities for greater integration and impact. CoP planners seeking to support the development of emerging biomedical science disciplines should blend information resources, social search and filtering, and visibility mechanisms to provide a portfolio of social and information benefits. Assessing benefit expectations and alternatives provides useful information for CoP planners seeking to prioritize community infrastructure development and encourage participation. PMID:18653443

  8. Collaboration, negotiation, and coalescence for interagency-collaborative teams to scale-up evidence-based practice.

    PubMed

    Aarons, Gregory A; Fettes, Danielle L; Hurlburt, Michael S; Palinkas, Lawrence A; Gunderson, Lara; Willging, Cathleen E; Chaffin, Mark J

    2014-01-01

    Implementation and scale-up of evidence-based practices (EBPs) is often portrayed as involving multiple stakeholders collaborating harmoniously in the service of a shared vision. In practice, however, collaboration is a more complex process that may involve shared and competing interests and agendas, and negotiation. The present study examined the scale-up of an EBP across an entire service system using the Interagency Collaborative Team approach. Participants were key stakeholders in a large-scale county-wide implementation of an EBP to reduce child neglect, SafeCare. Semistructured interviews and/or focus groups were conducted with 54 individuals representing diverse constituents in the service system, followed by an iterative approach to coding and analysis of transcripts. The study was conceptualized using the Exploration, Preparation, Implementation, and Sustainment framework. Although community stakeholders eventually coalesced around implementation of SafeCare, several challenges affected the implementation process. These challenges included differing organizational cultures, strategies, and approaches to collaboration; competing priorities across levels of leadership; power struggles; and role ambiguity. Each of the factors identified influenced how stakeholders approached the EBP implementation process. System-wide scale-up of EBPs involves multiple stakeholders operating in a nexus of differing agendas, priorities, leadership styles, and negotiation strategies. The term collaboration may oversimplify the multifaceted nature of the scale-up process. Implementation efforts should openly acknowledge and consider this nexus when individual stakeholders and organizations enter into EBP implementation through collaborative processes.

  9. Collaboration, Negotiation, and Coalescence for Interagency-Collaborative Teams to Scale-up Evidence-Based Practice

    PubMed Central

    Aarons, Gregory A.; Fettes, Danielle; Hurlburt, Michael; Palinkas, Lawrence; Gunderson, Lara; Willging, Cathleen; Chaffin, Mark

    2014-01-01

    Objective Implementation and scale-up of evidence-based practices (EBPs) is often portrayed as involving multiple stakeholders collaborating harmoniously in the service of a shared vision. In practice, however, collaboration is a more complex process that may involve shared and competing interests and agendas, and negotiation. The present study examined the scale-up of an EBP across an entire service system using the Interagency Collaborative Team (ICT) approach. Methods Participants were key stakeholders in a large-scale county-wide implementation of an EBP to reduce child neglect, SafeCare®. Semi-structured interviews and/or focus groups were conducted with 54 individuals representing diverse constituents in the service system, followed by an iterative approach to coding and analysis of transcripts. The study was conceptualized using the Exploration, Preparation, Implementation, and Sustainment (EPIS) framework. Results Although community stakeholders eventually coalesced around implementation of SafeCare, several challenges affected the implementation process. These challenges included differing organizational cultures, strategies, and approaches to collaboration, competing priorities across levels of leadership, power struggles, and role ambiguity. Each of the factors identified influenced how stakeholders approached the EBP implementation process. Conclusions System wide scale-up of EBPs involves multiple stakeholders operating in a nexus of differing agendas, priorities, leadership styles, and negotiation strategies. The term collaboration may oversimplify the multifaceted nature of the scale-up process. Implementation efforts should openly acknowledge and consider this nexus when individual stakeholders and organizations enter into EBP implementation through collaborative processes. PMID:24611580

  10. Understanding Motivators and Challenges to Involving Urban Parents as Collaborators in HIV Prevention Research Efforts.

    PubMed

    McKay, Mary M; Pinto, Rogério M; Bannon, William M; Guilamo-Ramos, Vincent

    2007-01-01

    This study was designed to explore the experiences of urban parents in their role as Collaborative Board members as part of the CHAMP (Collaborative HIV prevention and Adolescent Mental health Project) Family Program Study. The CHAMP Collaborative Board is comprised of urban parents, representatives from schools and community-based agencies and university-based researchers and is charged with overseeing the design, delivery and testing of a family-based HIV prevention program for pre and early adolescent youth. The current qualitative study, guided by the Theory of Unified Behavior Change, is meant to elucidate: (1) pathways to involvement by urban parents; (2) benefits and costs of participating in this collaborative HIV prevention research effort; and (3) the role of social relationships in influencing initial and ongoing participation by parent participants. Twenty-nine parent Collaborative Board members were interviewed for this study. In-depth interviews were audio recorded and ranged from 30 to 90 minutes in length. Transcripts were coded and analyzed using NUD*IST, computerized software used for examining narratives. Findings include community parent members identifying social support and learning opportunities as major reasons for involvement with the Collaborative Board. Prior involvement with other community-based projects and knowledge of at least one other person on the Board also influenced members to join the Board and remain involved over time. Further, recommendations for future collaborative partnerships are made. Findings have direct implication for participatory HIV prevention research activities.

  11. Collaborative WorkBench (cwb): Enabling Experiment Execution, Analysis and Visualization with Increased Scientific Productivity

    NASA Astrophysics Data System (ADS)

    Maskey, Manil; Ramachandran, Rahul; Kuo, Kwo-Sen

    2015-04-01

    The Collaborative WorkBench (CWB) has been successfully developed to support collaborative science algorithm development. It incorporates many features that enable and enhance science collaboration, including the support for both asynchronous and synchronous modes of interactions in collaborations. With the former, members in a team can share a full range of research artifacts, e.g. data, code, visualizations, and even virtual machine images. With the latter, they can engage in dynamic interactions such as notification, instant messaging, file exchange, and, most notably, collaborative programming. CWB also implements behind-the-scene provenance capture as well as version control to relieve scientists of these chores. Furthermore, it has achieved a seamless integration between researchers' local compute environments and those of the Cloud. CWB has also been successfully extended to support instrument verification and validation. Adopted by almost every researcher, the current practice of downloading data to local compute resources for analysis results in much duplication and inefficiency. CWB leverages Cloud infrastructure to provide a central location for data used by an entire science team, thereby eliminating much of this duplication and waste. Furthermore, use of CWB in concert with this same Cloud infrastructure enables co-located analysis with data where opportunities of data-parallelism can be better exploited, thereby further improving efficiency. With its collaboration-enabling features apposite to steps throughout the scientific process, we expect CWB to fundamentally transform research collaboration and realize maximum science productivity.

  12. C3: A Collaborative Web Framework for NASA Earth Exchange

    NASA Astrophysics Data System (ADS)

    Foughty, E.; Fattarsi, C.; Hardoyo, C.; Kluck, D.; Wang, L.; Matthews, B.; Das, K.; Srivastava, A.; Votava, P.; Nemani, R. R.

    2010-12-01

    The NASA Earth Exchange (NEX) is a new collaboration platform for the Earth science community that provides a mechanism for scientific collaboration and knowledge sharing. NEX combines NASA advanced supercomputing resources, Earth system modeling, workflow management, NASA remote sensing data archives, and a collaborative communication platform to deliver a complete work environment in which users can explore and analyze large datasets, run modeling codes, collaborate on new or existing projects, and quickly share results among the Earth science communities. NEX is designed primarily for use by the NASA Earth science community to address scientific grand challenges. The NEX web portal component provides an on-line collaborative environment for sharing of Earth science models, data, analysis tools and scientific results by researchers. In addition, the NEX portal also serves as a knowledge network that allows researchers to connect and collaborate based on the research they are involved in, specific geographic area of interest, field of study, etc. Features of the NEX web portal include: member profiles, resource sharing (data sets, algorithms, models, publications), communication tools (commenting, messaging, social tagging), project tools (wikis, blogs) and more. The NEX web portal is built on the proven technologies and policies of DASHlink.arc.nasa.gov (one of NASA's first science social media websites). The core component of the web portal is the C3 framework, which was built using Django and which is being deployed as a common framework for a number of collaborative sites throughout NASA.

  13. Secondary-to-Tertiary Comparison through the Lens of Ways of Doing Mathematics in Relation to Functions: A Study in Collaboration with Teachers

    ERIC Educational Resources Information Center

    Corriveau, Claudia

    2017-01-01

    This article addresses the issue of transition from secondary to post-secondary education through collaborative research with teachers from both levels. It takes into account implicit elements in this transition. Research on the transition in mathematics education tends to focus more on the tertiary level, studying difficulties encountered by…

  14. Teachers' Perceptions of the Home-School Collaboration: Enhancing Learning for Children with Autism

    ERIC Educational Resources Information Center

    Josilowski, Chana

    2017-01-01

    The topic of this study was how teachers of children with autism perceive the home-school collaboration and its impact on learning. This research addressed a gap in the literature on the performance gap between children with autism and their age-equivalent peers. The research question was, "How do teachers of children…

  15. A Comparison of Telecollaborative Classes between Japan and Asian-Pacific Countries--Asian-Pacific Exchange Collaboration (APEC) Project

    ERIC Educational Resources Information Center

    Shimizu, Yoshihiko; Pack, Dwayne; Kano, Mikio; Okazaki, Hiroyuki; Yamamura, Hiroto

    2016-01-01

    The purpose of this report is to compare the effects of "telecollaborative classes" between students in Japan and those in Asian-Pacific countries such as Taiwan, Thailand, and the United States (Hawaii). The telecollaborative classes are part of the Asian-Pacific Exchange Collaboration (APEC) project, a 4-year project involving students…

  16. Comparison of GLIMPS and HFAST Stirling engine code predictions with experimental data

    NASA Technical Reports Server (NTRS)

    Geng, Steven M.; Tew, Roy C.

    1992-01-01

    Predictions from GLIMPS and HFAST design codes are compared with experimental data for the RE-1000 and SPRE free piston Stirling engines. Engine performance and available power loss predictions are compared. Differences exist between GLIMPS and HFAST loss predictions. Both codes require engine specific calibration to bring predictions and experimental data into agreement.

  17. Validation of the Electromagnetic Code FACETS for Numerical Simulation of Radar Target Images

    DTIC Science & Technology

    2009-12-01

    Validation of the electromagnetic code FACETS for numerical simulation of radar target images. S. Wong, DRDC Ottawa. ...for simulating radar images of a target is obtained, through direct simulation-to-measurement comparisons. A 3-dimensional computer-aided design

  18. Creating Catalytic Collaborations between Theater Artists, Scientists, and Research Institutions

    NASA Astrophysics Data System (ADS)

    Wise, Debra

    2012-02-01

    Catalyst Collaborative@MIT (CC@MIT) is a collaboration between MIT and Underground Railway Theater (URT), a company with 30 years experience creating theater through interdisciplinary inquiry and engaging community. CC@MIT is dedicated to creating and presenting plays that deepen public understanding about science, while simultaneously providing artistic and emotional experiences not available in other forms of dialogue about science. CC@MIT engages audiences in thinking about themes in science of social and ethical concern; provides insight into the culture of science and the impact of that culture on society; and examines the human condition through the lens of science that intersects our lives and the lives of scientists. Original productions range from Einstein's Dreams to From Orchids to Octopi -- an evolutionary love story; classics re-framed include The Life of Galileo and Breaking the Code (about Alan Turing). CC@MIT commissions playwrights and scientists to create plays; engages audiences with scientists; performs at MIT and a professional venue near the campus; collaborates with the Cambridge Science Festival and MIT Museum; engages MIT students, as well as youth and children. Artistic Director Debra Wise will address how the collaboration developed, what opportunities are provided by collaborations between theaters and scientific research institutions, and lessons learned of value to the field.

  19. An Evolving Ecosystem for Natural Language Processing in Department of Veterans Affairs.

    PubMed

    Garvin, Jennifer H; Kalsy, Megha; Brandt, Cynthia; Luther, Stephen L; Divita, Guy; Coronado, Gregory; Redd, Doug; Christensen, Carrie; Hill, Brent; Kelly, Natalie; Treitler, Qing Zeng

    2017-02-01

    In an ideal clinical Natural Language Processing (NLP) ecosystem, researchers and developers would be able to collaborate with others, undertake validation of NLP systems, components, and related resources, and disseminate them. We captured requirements and formative evaluation data from the Veterans Affairs (VA) Clinical NLP Ecosystem stakeholders using semi-structured interviews and meeting discussions. We developed a coding rubric to code interviews. We assessed inter-coder reliability using percent agreement and the kappa statistic. We undertook 15 interviews and held two workshop discussions. The main areas of requirements related to design and functionality, resources, and information. Stakeholders also confirmed the vision of the second generation of the Ecosystem, and recommendations included adding mechanisms to better understand terms, measuring collaboration to demonstrate value, and datasets/tools to navigate spelling errors in consumer language, among others. Stakeholders also recommended the capability to communicate with developers working on the next version of the VA electronic health record (VistA Evolution), a mechanism to automatically monitor downloads of tools, and an automatic summary of the downloads for Ecosystem contributors and funders. After three rounds of coding and discussion, we determined the percent agreement of the two coders to be 97.2% and the kappa to be 0.7851. The vision of the VA Clinical NLP Ecosystem met stakeholder needs. The interviews and discussions provided key requirements that inform the design of the VA Clinical NLP Ecosystem.
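
    For readers unfamiliar with the two reliability statistics quoted above (97.2% agreement, kappa 0.7851), here is a minimal sketch of how percent agreement and Cohen's kappa are computed for two coders; the category labels and toy label vectors are invented for illustration.

        # Minimal sketch: percent agreement and Cohen's kappa for two coders.
        from collections import Counter

        def percent_agreement(a, b):
            return sum(x == y for x, y in zip(a, b)) / len(a)

        def cohens_kappa(a, b):
            po = percent_agreement(a, b)                  # observed agreement
            ca, cb, n = Counter(a), Counter(b), len(a)
            pe = sum(ca[k] * cb[k] for k in set(a) | set(b)) / n ** 2  # chance
            return (po - pe) / (1 - pe)

        coder1 = ["design", "design", "resources", "info", "info", "design"]
        coder2 = ["design", "design", "resources", "info", "design", "design"]
        print(percent_agreement(coder1, coder2))   # 0.833...
        print(cohens_kappa(coder1, coder2))        # ~0.714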

  20. Technology Innovation for the CTBT, the National Laboratory Contribution

    NASA Astrophysics Data System (ADS)

    Goldstein, W. H.

    2016-12-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT) and its Protocol are the result of a long history of scientific engagement and international technical collaboration. The U.S. Department of Energy National Laboratories have been conducting nuclear explosive test-ban research for over 50 years and have made significant contributions to this legacy. Recent examples include the RSTT (regional seismic travel time) computer code and the Smart Sampler—both of these products are the result of collaborations among Livermore, Sandia, Los Alamos, and Pacific Northwest National Laboratories. The RSTT code enables fast and accurate seismic event locations using regional data. This code solves the long-standing problem of using teleseismic and regional seismic data together to locate events. The Smart Sampler is designed for use in On-site Inspections to sample soil gases to look for noble gas fission products from a potential underground nuclear explosive test. The Smart Sampler solves the long-standing problem of collecting soil gases without contaminating the sample with gases from the atmosphere by operating only during atmospheric low-pressure events. Both these products are being evaluated by the Preparatory Commission for the CTBT Organization and the international community. In addition to R&D, the National Laboratories provide experts to support U.S. policy makers in ongoing discussions such as CTBT Working Group B, which sets policy for the development of the CTBT monitoring and verification regime.

  1. Grid-Adapted FUN3D Computations for the Second High Lift Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Lee-Rausch, E. M.; Rumsey, C. L.; Park, M. A.

    2014-01-01

    Contributions of the unstructured Reynolds-averaged Navier-Stokes code FUN3D to the 2nd AIAA CFD High Lift Prediction Workshop are described, and detailed comparisons are made with experimental data. Using workshop-supplied grids, results for the clean wing configuration are compared with results from the structured code CFL3D. Using the same turbulence model, both codes compare reasonably well in terms of total forces and moments, and the maximum lift is similarly over-predicted by both codes compared to experiment. By including more representative geometry features such as slat and flap brackets and slat pressure tube bundles, FUN3D captures the general effects of the Reynolds number variation, but under-predicts maximum lift on workshop-supplied grids in comparison with the experimental data, due to excessive separation. However, when output-based, off-body grid adaptation in FUN3D is employed, results improve considerably. In particular, when the geometry includes both brackets and the pressure tube bundles, grid adaptation results in a more accurate prediction of lift near stall in comparison with the wind-tunnel data. Furthermore, a rotation-corrected turbulence model shows improved pressure predictions on the outboard span when using adapted grids.

  2. Retrieval of tropospheric profiles from IR emission spectra: preliminary results with the DBIS

    NASA Astrophysics Data System (ADS)

    Theriault, Jean-Marc; Anderson, Gail P.; Chetwynd, James H., Jr.; Murphy, Randall E.; Turner, Vernon; Cloutier, M.; Smith, A.; Moncet, Jean-Luc

    1993-11-01

    Recently, Smith and collaborators from the University of Wisconsin-Madison have clearly established the possibility of sounding tropospheric temperature and water vapor profiles with a ground-based uplooking interferometer. With the same perspective but for somewhat different applications, the Defence Research Establishment Valcartier (DREV) has initiated a project with the aim of exploring the many possible avenues of similar approaches. DREV, in collaboration with BOMEM (Quebec, Canada), has developed an instrument referred to as the Double Beam Interferometer Sounder (DBIS). This sounder has been conceived to match the needs encountered in many remote sensing scenarios: slant path capability, small field of view, very wide spectral coverage, and high spectral resolution. Preliminary tests with the DBIS have shown sufficient accuracy for remote sensing applications. In a series of field measurements, jointly organized by the Geophysics Directorate/PL, Hanscom AFB, and DREV, the instrument has been run in a wide variety of sky conditions. Several atmospheric emission spectra recorded with the sounder have been compared to calculations with the FASCODE and MODTRAN models. The quality of the measurement-model comparisons has prompted the development of an inversion algorithm based on these codes. The purpose of this paper is to report the recent progress achieved in this research. First, the design and operation of the instrument are reviewed. Second, recent field measurements of atmospheric emission spectra are analyzed and compared to model predictions. Finally, the simultaneous retrieval approach selected for the inversion of DBIS spectra to obtain temperature and water vapor profiles is described, and preliminary results are presented.

  3. Predictions of Supersonic Jet Mixing and Shock-Associated Noise Compared With Measured Far-Field Data

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.

    2010-01-01

    Codes for predicting supersonic jet mixing and broadband shock-associated noise were assessed using a database containing noise measurements of a jet issuing from a convergent nozzle. Two types of codes were used to make predictions. Fast running codes containing empirical models were used to compute both the mixing noise component and the shock-associated noise component of the jet noise spectrum. One Reynolds-averaged, Navier-Stokes-based code was used to compute only the shock-associated noise. To enable the comparisons of the predicted component spectra with data, the measured total jet noise spectra were separated into mixing noise and shock-associated noise components. Comparisons were made for 1/3-octave spectra and some power spectral densities using data from jets operating at 24 conditions covering essentially 6 fully expanded Mach numbers with 4 total temperature ratios.

  4. Digital data for quick response (QR) codes of alkalophilic Bacillus pumilus to identify and to compare bacilli isolated from Lonar Crater Lake, India.

    PubMed

    Rekadwad, Bhagwan N; Khobragade, Chandrahasya N

    2016-06-01

    Microbiologists are routinely engaged in the isolation, identification, and comparison of isolated bacteria to assess their novelty. 16S rRNA sequences of Bacillus pumilus were retrieved from the NCBI repository, and QR codes were generated for the sequences (FASTA format and full GenBank information). The 16S rRNA sequences were used to generate quick response (QR) codes for Bacillus pumilus isolated from Lonar Crater Lake (19° 58' N; 76° 31' E), India. Bacillus pumilus 16S rRNA gene sequences were also used to generate chaos game representations (CGR), frequency chaos game representations (FCGR), and principal component analyses (PCA), which can be used for visual comparison and evaluation, respectively. The hyperlinked QR codes, CGR, FCGR and PCA of all the isolates are made available to users on the portal https://sites.google.com/site/bhagwanrekadwad/. This digital data helps to evaluate and compare any Bacillus pumilus strain, minimizes laboratory effort, and avoids misinterpretation of the species.
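
    The QR-code step itself is easy to reproduce. Below is a minimal sketch using the third-party Python "qrcode" package (installed with Pillow, e.g. pip install qrcode pillow); the FASTA snippet is an invented placeholder, not one of the actual deposited Bacillus pumilus sequences.

        # Minimal sketch: encode a 16S rRNA FASTA record as a QR code image.
        # The sequence below is a made-up placeholder for illustration.
        import qrcode

        fasta = (">Bacillus_pumilus_isolate_X 16S ribosomal RNA, partial sequence\n"
                 "AGAGTTTGATCCTGGCTCAGGACGAACGCTGGCGGCGTGCCTAATACATGCAAGTCGAGCG")

        img = qrcode.make(fasta)           # encode the FASTA text as a QR symbol
        img.save("b_pumilus_16S_qr.png")   # scanning the image recovers the text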

  5. The influence of multiple trials and computer-mediated communication on collaborative and individual semantic recall.

    PubMed

    Hinds, Joanne M; Payne, Stephen J

    2018-04-01

    Collaborative inhibition is a phenomenon where collaborating groups experience a decrement in recall when interacting with others. Despite this, collaboration has been found to improve subsequent individual recall. We explore these effects in semantic recall, which is seldom studied in collaborative retrieval. We also examine "parallel CMC", a synchronous form of computer-mediated communication that has previously been found to improve collaborative recall [Hinds, J. M., & Payne, S. J. (2016). Collaborative inhibition and semantic recall: Improving collaboration through computer-mediated communication. Applied Cognitive Psychology, 30(4), 554-565]. Sixty-three triads completed a semantic recall task, which involved generating words beginning with "PO" or "HE" across three recall trials, in one of three retrieval conditions: Individual-Individual-Individual (III), Face-to-Face-Face-to-Face-Individual (FFI) and Parallel-Parallel-Individual (PPI). Collaborative inhibition was present across both collaborative conditions. Individual recall in Recall 3 was higher when participants had previously collaborated in comparison to recalling three times individually. There was no difference between face-to-face and parallel CMC recall; however, subsidiary analyses of instance repetitions and subjective organisation highlighted differences in group members' approaches to recall in terms of organisation and attention to others' contributions. We discuss the implications of these findings in relation to retrieval strategy disruption.

  6. Lessons learned from new construction utility demand side management programs and their implications for implementing building energy codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wise, B.K.; Hughes, K.R.; Danko, S.L.

    1994-07-01

    This report was prepared for the US Department of Energy (DOE) Office of Codes and Standards by the Pacific Northwest Laboratory (PNL) through its Building Energy Standards Program (BESP). The purpose of this task was to identify demand-side management (DSM) strategies for new construction that utilities have adopted or developed to promote energy-efficient design and construction. PNL conducted a survey of utilities and used the information gathered to extrapolate lessons learned and to identify evolving trends in utility new-construction DSM programs. The ultimate goal of the task is to identify opportunities where states might work collaboratively with utilities to promote the adoption, implementation, and enforcement of energy-efficient building energy codes.

  7. FastDart: a fast, accurate and friendly version of the DART code.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rest, J.; Taboada, H.

    2000-11-08

    A new enhanced, visual version of the DART code is presented. DART is a mechanistic-model-based code developed for the performance calculation and assessment of aluminum dispersion fuel. The major features of this new version are a new, time-saving calculation routine able to run on a PC, a friendly visual input interface, and a plotting facility. This version, available for silicide and U-Mo fuels, adds faster execution and visual interfaces to the classical accuracy of DART models for fuel performance prediction. It is part of a collaboration agreement between ANL and CNEA in the area of Low Enriched Uranium Advanced Fuels, held under the Implementation Arrangement for Technical Exchange and Cooperation in the Area of Peaceful Uses of Nuclear Energy.

  8. Understanding Accretion Disks through Three Dimensional Radiation MHD Simulations

    NASA Astrophysics Data System (ADS)

    Jiang, Yan-Fei

    I study the structures and thermal properties of black hole accretion disks in the radiation pressure dominated regime. Angular momentum transfer in the disk is provided by the turbulence generated by the magneto-rotational instability (MRI), which is calculated self-consistently with a recently developed 3D radiation magneto-hydrodynamics (MHD) code based on Athena. This code, developed by my collaborators and myself, couples both the radiation momentum and energy source terms with the ideal MHD equations by modifying the standard Godunov method to handle the stiff radiation source terms. We solve the two momentum equations of the radiation transfer equations with a variable Eddington tensor (VET), which is calculated with a time independent short characteristic module. This code is well tested and accurate in both optically thin and optically thick regimes. It is also accurate for both radiation pressure and gas pressure dominated flows. With this code, I find that when photon viscosity becomes significant, the ratio between Maxwell stress and Reynolds stress from the MRI turbulence can increase significantly with radiation pressure. The thermal instability of the radiation pressure dominated disk is then studied with vertically stratified shearing box simulations. Unlike the previous results claiming that the radiation pressure dominated disk with MRI turbulence can reach a steady state without showing any unstable behavior, I find that the radiation pressure dominated disks always either collapse or expand until we have to stop the simulations. During the thermal runaway, the heating and cooling rates from the simulations are consistent with the general criterion of thermal instability. However, details of the thermal runaway are different from the predictions of the standard alpha disk model, as many assumptions in that model are not satisfied in the simulations. We also identify the key reasons why previous simulations do not find the instability. The thermal instability has many important implications for understanding the observations of both X-ray binaries and Active Galactic Nuclei (AGNs). However, direct comparisons between observations and the simulations require global radiation MHD simulations, which will be the main focus of my future work.
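
    The stiffness that motivates the modified Godunov method can be seen in miniature in a zero-dimensional gas-radiation energy exchange. The sketch below, with toy numbers, shows an implicit (backward Euler) update of a stiff linear relaxation source term remaining stable even when the exchange rate times the timestep is huge; it is an illustration of the general idea, not the actual Athena-based scheme.

        # Minimal sketch: implicit update of a stiff energy-exchange source term,
        # de/dt = -k (e - E), dE/dt = +k (e - E), which conserves e + E.
        def implicit_exchange(e, E, k, dt):
            """Backward-Euler step; exact for this linear pair, stable for any k*dt."""
            total = e + E
            diff = (e - E) / (1.0 + 2.0 * k * dt)   # BE solution for the difference
            return (total + diff) / 2.0, (total - diff) / 2.0

        e, E = 10.0, 1.0                   # toy gas and radiation energies
        for _ in range(5):
            e, E = implicit_exchange(e, E, k=1e6, dt=1e-3)   # k*dt = 1000: stiff
        print(e, E)                        # both relax smoothly to 5.5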

  9. Potential end-to-end imaging information rate advantages of various alternative communication systems

    NASA Technical Reports Server (NTRS)

    Rice, R. F.

    1978-01-01

    Various communication systems were considered which are required to transmit both imaging and a typically error-sensitive class of data called general science/engineering (gse) over a Gaussian channel. The approach jointly treats the imaging and gse transmission problems, allowing comparisons of systems which include various channel coding and data compression alternatives. Actual system comparisons include an Advanced Imaging Communication System (AICS), which exhibits the rather significant potential advantages of sophisticated data compression coupled with powerful yet practical channel coding.

  10. BRISK--research-oriented storage kit for biology-related data.

    PubMed

    Tan, Alan; Tripp, Ben; Daley, Denise

    2011-09-01

    In genetic science, large-scale international research collaborations represent a growing trend. These collaborations have demanding and challenging database, storage, retrieval and communication needs. These studies typically involve demographic and clinical data, in addition to the results from numerous genomic studies (omics studies) such as gene expression, eQTL, genome-wide association and methylation studies, which present numerous challenges, thus the need for data integration platforms that can handle these complex data structures. Inefficient methods of data transfer and access control still plague research collaboration. As science becomes more and more collaborative in nature, the need for a system that adequately manages data sharing becomes paramount. Biology-Related Information Storage Kit (BRISK) is a package of several web-based data management tools that provide a cohesive data integration and management platform. It was specifically designed to provide the architecture necessary to promote collaboration and expedite data sharing between scientists. The software, documentation, Java source code and demo are available at http://genapha.icapture.ubc.ca/brisk/index.jsp. BRISK was developed in Java, and tested on an Apache Tomcat 6 server with a MySQL database. denise.daley@hli.ubc.ca.

  11. Task control and cognitive abilities of self and spouse in collaboration in middle-aged and older couples.

    PubMed

    Berg, Cynthia A; Smith, Timothy W; Ko, Kelly J; Beveridge, Ryan M; Story, Nathan; Henry, Nancy J M; Florsheim, Paul; Pearce, Gale; Uchino, Bert N; Skinner, Michelle A; Glazer, Kelly

    2007-09-01

    Collaborative problem solving may be used by older couples to optimize cognitive functioning, with some suggestion that older couples exhibit greater collaborative expertise. The study explored age differences in 2 aspects of collaborative expertise: spouses' knowledge of their own and their spouse's cognitive abilities and the ability to fit task control to these cognitive abilities. The participants were 300 middle-aged and older couples who completed a hypothetical errand task. The interactions were coded for control asserted by husbands and wives. Fluid intelligence was assessed, and spouses rated their own and their spouse's cognitive abilities. The results revealed no age differences in couple expertise, either in the ability to predict their own and their spouse's cognitive abilities or in the ability to fit task control to abilities. However, gender differences were found. Women fit task control to their own and their spouse's cognitive abilities; men only fit task control to their spouse's cognitive abilities. For women only, the fit between control and abilities was associated with better performance. The results indicate no age differences in couple expertise but point to gender as a factor in optimal collaboration.

  12. Product code optimization for determinate state LDPC decoding in robust image transmission.

    PubMed

    Thomos, Nikolaos; Boulgouris, Nikolaos V; Strintzis, Michael G

    2006-08-01

    We propose a novel scheme for error-resilient image transmission. The proposed scheme employs a product coder consisting of low-density parity check (LDPC) codes and Reed-Solomon codes in order to deal effectively with bit errors. The efficiency of the proposed scheme is based on the exploitation of determinate symbols in Tanner graph decoding of LDPC codes and a novel product code optimization technique based on error estimation. Experimental evaluation demonstrates the superiority of the proposed system in comparison to recent state-of-the-art techniques for image transmission.
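
    The product-code construction at the heart of the scheme can be illustrated compactly. The sketch below is a simplified stand-in, not the proposed coder: both dimensions use a toy single-parity check (where the paper uses LDPC rows and Reed-Solomon columns), which is enough to show how the intersection of a failing row check and a failing column check locates a corrupted symbol.

        # Minimal sketch of a product code with single-parity-check components.
        import numpy as np

        def product_encode(data):
            """Append an even-parity bit to every row, then to every column."""
            with_rows = np.hstack([data, data.sum(axis=1, keepdims=True) % 2])
            col_par = with_rows.sum(axis=0, keepdims=True) % 2
            return np.vstack([with_rows, col_par])

        def correct_single_error(block):
            """A single flip sits at the intersection of the failing checks."""
            bad_rows = np.flatnonzero(block.sum(axis=1) % 2)
            bad_cols = np.flatnonzero(block.sum(axis=0) % 2)
            if bad_rows.size == 1 and bad_cols.size == 1:
                block[bad_rows[0], bad_cols[0]] ^= 1
            return block

        data = np.array([[1, 0, 1], [0, 1, 1]])
        block = product_encode(data)
        block[1, 2] ^= 1                              # channel flips one bit
        print(correct_single_error(block)[:2, :3])    # original data recovered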

  13. Cracking Her Codes: Understanding Shared Technology Resources as Positioning Artifacts for Power and Status in CSCL Environments

    ERIC Educational Resources Information Center

    Simpson, Amber; Bannister, Nicole; Matthews, Gretchen

    2017-01-01

    There is a positive relationship between student participation in computer-supported collaborative learning (CSCL) environments and improved complex problem-solving strategies, increased learning gains, higher engagement in the thinking of their peers, and an enthusiastic disposition toward groupwork. However, student participation varies from…

  14. 3 CFR 8967 - Proclamation 8967 of April 30, 2013. National Building Safety Month, 2013

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... States of America A Proclamation When natural disasters and other hazards put American lives at risk... redevelopment. To get involved, visit www.Ready.gov. Time and again, devastating natural disasters have tested... stakeholders across our country to adopt disaster-resistant building codes and standards. We are collaborating...

  15. Dancing Robots: Integrating Art, Music, and Robotics in Singapore's Early Childhood Centers

    ERIC Educational Resources Information Center

    Sullivan, Amanda; Bers, Marina Umaschi

    2018-01-01

    In recent years, Singapore has increased its national emphasis on technology and engineering in early childhood education. Their newest initiative, the Playmaker Programme, has focused on teaching robotics and coding in preschool settings. Robotics offers a playful and collaborative way for children to engage with foundational technology and…

  16. Negotiation of Meaning and Codeswitching in Online Tandems.

    ERIC Educational Resources Information Center

    Kotter, Markus

    2003-01-01

    Analyzes negotiation of meaning and code switching in the discourse of 29 language students from classes at a German and a North American university, who teamed up with their peers to collaborate on projects whose results they had to present to the other groups in the MOO during the final weeks of the project. (VWL)

  17. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, running multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
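
    The core of sequential verification is a machine-precision comparison of outputs from consecutive code versions. A minimal sketch of such a harness is given below; the variable values and tolerance are hypothetical, not RELAP5-3D's actual output format or acceptance criteria.

    ```python
    import numpy as np

    def sequential_verify(prev, curr, rel_tol=1e-12):
        """Compare plot variables from two consecutive code versions.

        Sequential verification assumes version n was already verified; any
        difference beyond round-off in version n+1 flags an unintended change.
        """
        prev, curr = np.asarray(prev, float), np.asarray(curr, float)
        if prev.shape != curr.shape:
            return False, "output shapes differ"
        denom = np.maximum(np.abs(prev), 1e-300)      # guard divide-by-zero
        rel_err = float(np.max(np.abs(curr - prev) / denom))
        return rel_err <= rel_tol, f"max relative difference {rel_err:.3e}"

    # Hypothetical plot variables (e.g. pressures) from versions n and n+1.
    v_n = np.array([15.50e6, 15.43e6, 15.37e6, 15.30e6])
    v_n1 = v_n * (1.0 + 3.0e-16)                      # round-off-level change
    ok, msg = sequential_verify(v_n, v_n1)
    print("PASS" if ok else "FAIL", "-", msg)
    ```

    The same comparison would be repeated for the restart, repeated-timestep, multiple-case, and coupled/uncoupled runs the abstract lists.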

  18. Monte Carlo calculation of dose rate conversion factors for external exposure to photon emitters in soil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clovas, A.; Zanthos, S.; Antonopoulos-Domis, M.

    2000-03-01

    The dose rate conversion factors D-dot_CF (absorbed dose rate in air per unit activity per unit of soil mass, nGy h^-1 per Bq kg^-1) are calculated 1 m above ground for photon emitters of natural radionuclides uniformly distributed in the soil. Three Monte Carlo codes are used: (1) the MCNP code of Los Alamos; (2) the GEANT code of CERN; and (3) a Monte Carlo code developed in the Nuclear Technology Laboratory of the Aristotle University of Thessaloniki. The accuracy of the Monte Carlo results is tested by comparing the unscattered flux obtained by the three Monte Carlo codes with an independent straightforward calculation. All codes, and particularly MCNP, calculate the absorbed dose rate in air due to the unscattered radiation accurately. For the total radiation (unscattered plus scattered), the D-dot_CF values calculated by the three codes are in very good agreement with one another. The comparison between these results and results deduced previously by other authors indicates good agreement (less than 15% difference) for photon energies above 1,500 keV. By contrast, the agreement is poorer (differences of 20-30%) for low-energy photons.
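
    The "independent straightforward calculation" of unscattered flux reduces to a double integral over the soil half-space, attenuated along the soil and air segments of each ray. A numerical-quadrature sketch is given below; the attenuation coefficients, soil density, and truncation limits are illustrative round numbers, not the paper's input data.

    ```python
    import numpy as np
    from scipy import integrate

    # Illustrative constants (round numbers, not the paper's input data).
    MU_SOIL = 8.3      # linear attenuation coefficient of soil, 1/m
    MU_AIR = 6.8e-3    # linear attenuation coefficient of air, 1/m
    RHO_SOIL = 1.6e3   # soil density, kg/m3
    H = 1.0            # detector height above ground, m

    def integrand(rho, z):
        """Unscattered-flux kernel for a source ring at depth z, radius rho."""
        r = np.hypot(rho, H + z)            # source-to-detector distance
        r_soil = r * z / (H + z)            # path length inside soil
        r_air = r * H / (H + z)             # path length inside air
        return (RHO_SOIL / (4.0 * np.pi * r**2)
                * np.exp(-MU_SOIL * r_soil - MU_AIR * r_air)
                * 2.0 * np.pi * rho)        # ring source element 2*pi*rho

    # Flux per unit specific activity (1 Bq/kg, one photon per decay).
    # Truncation depths/radii are chosen so the exponentials make the
    # remainder negligible for these coefficients.
    flux, err = integrate.dblquad(integrand, 0.0, 1.0,   # depth 0..1 m
                                  0.0, 500.0)            # radius 0..500 m
    print(f"unscattered flux: {flux:.4e} photons/m^2/s per Bq/kg (+/- {err:.1e})")
    ```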

  19. Validation of OpenFoam for heavy gas dispersion applications.

    PubMed

    Mack, A; Spruijt, M P N

    2013-11-15

    In the present paper, heavy gas dispersion calculations were performed with OpenFoam. For a wind tunnel test case, the numerical results were validated against experiments. For a full-scale numerical experiment, a code-to-code comparison was performed against numerical results obtained from Fluent. The validation was performed in a gravity-driven environment (slope), where the heavy gas induced the turbulence. For the code-to-code comparison, a hypothetical heavy gas release into a strongly turbulent atmospheric boundary layer including terrain effects was selected. The investigations were performed for SF6 and CO2 as heavy gases, applying the standard k-ɛ turbulence model. A strong interaction of the heavy gas with the turbulence is present, which results in strong damping of the turbulence and therefore reduced heavy gas mixing. This interaction, based on buoyancy effects, was studied in particular to ensure that the turbulence-buoyancy coupling, and not the global behaviour of the turbulence modelling, is the main driver of the reduced mixing. For both test cases, comparisons were performed between OpenFoam and Fluent solutions, which were mainly in good agreement with each other. Besides steady-state solutions, time accuracy was investigated. In the low-turbulence environment (wind tunnel test), the laminar solutions of both codes were in good agreement with each other and with the experimental data, and the turbulent solutions of OpenFoam were in much better agreement with the experimental results than the Fluent solutions. In the strong-turbulence environment, both codes showed excellent comparability. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. An exploration of collaborative scientific production at MIT through spatial organization and institutional affiliation.

    PubMed

    Claudel, Matthew; Massaro, Emanuele; Santi, Paolo; Murray, Fiona; Ratti, Carlo

    2017-01-01

    Academic research is increasingly cross-disciplinary and collaborative, between and within institutions. In this context, what is the role and relevance of an individual's spatial position on a campus? We examine the collaboration patterns of faculty at the Massachusetts Institute of Technology, through their academic output (papers and patents), and their organizational structures (institutional affiliation and spatial configuration) over a 10-year time span. An initial comparison of output types reveals: 1. diverging trends in the composition of collaborative teams over time (size, faculty versus non-faculty, etc.); and 2. substantively different patterns of cross-building and cross-disciplinary collaboration. We then construct a multi-layered network of authors, and find two significant features of collaboration on campus: 1. a network topology and community structure that reveals spatial versus institutional collaboration bias; and 2. a persistent relationship between proximity and collaboration, well fit with an exponential decay model. This relationship is consistent for both papers and patents, and present also in exclusively cross-disciplinary work. These insights contribute an architectural dimension to the field of scientometrics, and take a first step toward empirical space-planning policy that supports collaboration within institutions.
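
    The proximity-collaboration relationship reported here can be captured with a two-parameter exponential decay fit. The sketch below fits p(d) = p0 * exp(-d/d0) to hypothetical binned data (the numbers are invented, not the study's measurements):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Hypothetical observations: collaboration probability binned by the
    # walking distance between two faculty members' offices (meters).
    distance = np.array([25., 75., 125., 175., 250., 350., 500., 700.])
    prob = np.array([0.042, 0.031, 0.024, 0.018, 0.012, 0.008, 0.005, 0.003])

    def decay(d, p0, d0):
        """Exponential decay model: collaboration probability vs. distance."""
        return p0 * np.exp(-d / d0)

    (p0, d0), cov = curve_fit(decay, distance, prob, p0=(0.05, 200.0))
    print(f"p0 = {p0:.4f}, characteristic distance d0 = {d0:.1f} m")
    ```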

  1. An exploration of collaborative scientific production at MIT through spatial organization and institutional affiliation

    PubMed Central

    Santi, Paolo; Murray, Fiona; Ratti, Carlo

    2017-01-01

    Academic research is increasingly cross-disciplinary and collaborative, between and within institutions. In this context, what is the role and relevance of an individual’s spatial position on a campus? We examine the collaboration patterns of faculty at the Massachusetts Institute of Technology, through their academic output (papers and patents), and their organizational structures (institutional affiliation and spatial configuration) over a 10-year time span. An initial comparison of output types reveals: 1. diverging trends in the composition of collaborative teams over time (size, faculty versus non-faculty, etc.); and 2. substantively different patterns of cross-building and cross-disciplinary collaboration. We then construct a multi-layered network of authors, and find two significant features of collaboration on campus: 1. a network topology and community structure that reveals spatial versus institutional collaboration bias; and 2. a persistent relationship between proximity and collaboration, well fit with an exponential decay model. This relationship is consistent for both papers and patents, and present also in exclusively cross-disciplinary work. These insights contribute an architectural dimension to the field of scientometrics, and take a first step toward empirical space-planning policy that supports collaboration within institutions. PMID:28640829

  2. ECHO: health care performance assessment in several European health systems.

    PubMed

    Bernal-Delgado, E; Christiansen, T; Bloor, K; Mateus, C; Yazbeck, A M; Munck, J; Bremner, J

    2015-02-01

    Strengthening health-care effectiveness, increasing accessibility and improving resilience are key goals in the upcoming European Union health-care agenda. European Collaboration for Health-Care Optimization (ECHO), an international research project on health-care performance assessment funded by the seventh framework programme, has provided evidence and methodology to support the attainment of those goals. This article aims at describing ECHO, analysing its main instruments and discussing some of the ECHO policy implications. Using patient-level administrative data, a series of observational studies (ecological and cross-sectional, with associated time-series analyses) were conducted to analyse population and patient exposure to health care. Operationally, several performance dimensions such as health-care inequalities, quality, safety and efficiency were analysed using a set of validated indicators. The main instruments in ECHO were: (i) building a homogeneous data infrastructure; (ii) constructing coding crosswalks to allow comparisons between countries; (iii) making geographical units of analysis comparable; and (iv) allowing comparisons through the use of common benchmarks. ECHO has provided some innovations in international comparisons of health-care performance, mainly derived from the massive pooling of patient-level data, and thus: (i) has expanded the usual approach based on average figures, providing insight into within- and across-country variation at various meaningful policy levels; (ii) has increased comparability through a major data-homogenization effort, strengthening stakeholders' reliance on the data and improving the acceptance of findings; and (iii) has been able to provide more flexible and reliable benchmarking, allowing stakeholders to make critical use of the evidence. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.
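
    A coding crosswalk of the kind ECHO built can be pictured as a lookup from country-specific codes to shared labels. The sketch below is a minimal illustration; the country codes, procedure codes, and labels are hypothetical, not ECHO's actual crosswalk tables.

    ```python
    # Minimal sketch of a coding crosswalk, one of the ECHO instruments:
    # country-specific procedure codes are mapped onto a shared label so
    # that utilisation can be compared across systems. All codes and
    # labels below are hypothetical.
    CROSSWALK = {
        ("DK", "KKNGD"): "hip_replacement",
        ("PT", "8151"):  "hip_replacement",
        ("SI", "5-820"): "hip_replacement",
        ("DK", "KFNG"):  "cabg",
        ("PT", "3610"):  "cabg",
    }

    def harmonise(records):
        """Map (country, native_code) pairs to the shared ECHO-style label."""
        for country, code in records:
            yield country, CROSSWALK.get((country, code), "unmapped")

    admissions = [("DK", "KKNGD"), ("PT", "8151"), ("SI", "9999")]
    for country, label in harmonise(admissions):
        print(country, label)
    ```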

  3. Verbal Play as an Interactional Discourse Resource in Early Stage Alzheimer’s Disease

    PubMed Central

    Shune, Samantha; Duff, Melissa C.

    2012-01-01

    Background Verbal play, the creative and playful use of language to make puns, rhyme words, and tease, is a pervasive and enjoyable component of social communication and serves important interpersonal functions. The current study examines the use of verbal play in the communicative interactions of individuals with Alzheimer’s disease as part of a broader program of research on language-and-memory-in-use. Aims To document the frequency of verbal play in the communicative interactions of individuals with very mild Alzheimer’s disease (AD) and their familiar communication partners. To characterize the interactional forms, resources, and functions of playful episodes. Methods Using quantitative group comparisons and detailed discourse analysis, we analyzed verbal play in the interactional discourse of five participants with very mild AD and five healthy (demographically matched) comparison participants. Each participant interacted with a familiar partner while completing a collaborative referencing task, and with a researcher between task trials. Results A total of 1,098 verbal play episodes were coded. Despite being in the early stages of AD, all the AD participants used verbal play. There were no significant group differences in the frequency of verbal play episodes or in the interactional forms, resources, or functions of those playful episodes between AD and healthy comparison pair sessions. Conclusions The successful use of verbal play in the interactions of individuals with very mild AD and their partners highlights an area of preserved social communication. These findings represent an important step, both clinically and for research, in documenting the rich ways that individuals with early stage AD orchestrate interactionally meaningful communication with their partners through the use of interactional discourse resources like verbal play. This work also offers a promising clinical tool for tracking and targeting verbal play across disease progression. PMID:23129879

  4. "SEN's Completely Different Now": Critical Discourse Analysis of Three "Codes of Practice for Special Educational Needs" (1994, 2001, 2015)

    ERIC Educational Resources Information Center

    Lehane, Teresa

    2017-01-01

    Regardless of the differing shades of neo-liberalism, successive governments have claimed to champion the cause of "special educational needs and/or disability" (SEND) through official Codes of Practice in 1994, 2001 and 2015. This analysis and comparison of the three Codes of Practice aims to contribute to the debate by exploring…

  5. A Comparison of Inter-Professional Education Programs in Preparing Prospective Teachers and Speech and Language Pathologists for Collaborative Language-Literacy Instruction

    ERIC Educational Resources Information Center

    Wilson, Leanne; McNeill, Brigid; Gillon, Gail T.

    2016-01-01

    Ensuring teacher and speech and language pathology graduates are prepared to work collaboratively together to meet the diverse language literacy learning needs of children is an important goal. This study investigated the efficacy of a 3-h inter-professional education program focused on explicit instruction in the language skills that underpin…

  6. Acceptance of Cloud Services in Face-to-Face Computer-Supported Collaborative Learning: A Comparison between Single-User Mode and Multi-User Mode

    ERIC Educational Resources Information Center

    Wang, Chia-Sui; Huang, Yong-Ming

    2016-01-01

    Face-to-face computer-supported collaborative learning (CSCL) was used extensively to facilitate learning in classrooms. Cloud services not only allow a single user to edit a document, but they also enable multiple users to simultaneously edit a shared document. However, few researchers have compared student acceptance of such services in…

  7. Advancing Collaboration between School- and Agency-Employed School-Based Social Workers: A Mixed-Methods Comparison of Competencies and Preparedness

    ERIC Educational Resources Information Center

    Bronstein, Laura R.; Ball, Annahita; Mellin, Elizabeth A.; Wade-Mdivanian, Rebecca; Anderson-Butcher, Dawn

    2011-01-01

    The purpose of this article is to share results of a mixed-methods research study designed to shed light on similarities and differences between school-employed and agency-employed school-based social workers' preparation and practice as a precursor for collaboration in expanded school mental health. Online survey data from a national sample of…

  8. An overview on ethical considerations in stem cell research in Iran and ethical recommendations: A review.

    PubMed

    Farajkhoda, Tahmineh

    2017-02-01

    Conducting research on stem cell lines may bring considerable good to the public. Human stem cell (hSC) research has provided opportunities for scientific progress and new therapies, but some complex ethical matters must be addressed to ensure that stem cell research is carried out in an ethically appropriate manner. The aim of this review article is to discuss the importance of stem cell research, the code of ethics for stem cell research in Iran, and ethical recommendations. Generation of stem cells for research from human embryos or adult stem cells, and their storage, maintenance and use, are the main ethical, legal and jurisprudential concerns in Iran. Concerns regarding human reproduction or human cloning, breach of human dignity, genetic manipulation and the probability of tumorigenicity arise for adult/somatic stem cells. Destruction of an embryo to generate stem cells is an important matter in Iran. In this regard, obtaining stem cells from donated frozen embryos from infertility treatment that would otherwise be discarded is an acceptable solution in Iran for the generation of embryos for research. Ethical, legal and jurisprudential strategies for using adult/somatic stem cells include determining the ownership of stem cells, prohibiting trade in the human body, supervising biobanks and informing the Oversight Committee on Stem Cell Research. Recommendations for handling ethical issues in stem cell research are well-designed studies; compliance with codes of ethics in biomedical research (specifically codes of ethics on stem cell research, clinical trials and animal studies); appropriate collaboration with ethics committees; and respect for the rights of participants (both human and animal) in research. In addition, there is a need to extend global bioethics networks to strengthen communication within organizations at both the regional and international levels, strengthen legislative systems, and design and establish convenient collaborative educational courses at different levels.

  9. An overview on ethical considerations in stem cell research in Iran and ethical recommendations: A review

    PubMed Central

    Farajkhoda, Tahmineh

    2017-01-01

    Conducting research on stem cell lines may bring considerable good to the public. Human stem cell (hSC) research has provided opportunities for scientific progress and new therapies, but some complex ethical matters must be addressed to ensure that stem cell research is carried out in an ethically appropriate manner. The aim of this review article is to discuss the importance of stem cell research, the code of ethics for stem cell research in Iran, and ethical recommendations. Generation of stem cells for research from human embryos or adult stem cells, and their storage, maintenance and use, are the main ethical, legal and jurisprudential concerns in Iran. Concerns regarding human reproduction or human cloning, breach of human dignity, genetic manipulation and the probability of tumorigenicity arise for adult/somatic stem cells. Destruction of an embryo to generate stem cells is an important matter in Iran. In this regard, obtaining stem cells from donated frozen embryos from infertility treatment that would otherwise be discarded is an acceptable solution in Iran for the generation of embryos for research. Ethical, legal and jurisprudential strategies for using adult/somatic stem cells include determining the ownership of stem cells, prohibiting trade in the human body, supervising biobanks and informing the Oversight Committee on Stem Cell Research. Recommendations for handling ethical issues in stem cell research are well-designed studies; compliance with codes of ethics in biomedical research (specifically codes of ethics on stem cell research, clinical trials and animal studies); appropriate collaboration with ethics committees; and respect for the rights of participants (both human and animal) in research. In addition, there is a need to extend global bioethics networks to strengthen communication within organizations at both the regional and international levels, strengthen legislative systems, and design and establish convenient collaborative educational courses at different levels. PMID:28462397

  10. Comparison of Code Predictions to Test Measurements for Two Orifice Compensated Hydrostatic Bearings at High Reynolds Numbers

    NASA Technical Reports Server (NTRS)

    Keba, John E.

    1996-01-01

    Rotordynamic coefficients obtained from testing two different hydrostatic bearings are compared to values predicted by two different computer programs. The first set of test data is from a relatively long (L/D=1) orifice-compensated hydrostatic bearing tested in water by Texas A&M University (TAMU Bearing No. 9). The second bearing is shorter (L/D=0.37) and was tested in a lower-viscosity fluid by the Rocketdyne Division of Rockwell (Rocketdyne 'Generic' bearing) at similar rotating speeds and pressures. Computed predictions of bearing rotordynamic coefficients were obtained from the cylindrical seal code ICYL, one of the industrial seal codes developed for NASA-LeRC by Mechanical Technology Inc., and from the hydrodynamic bearing code HYDROPAD. The comparison highlights the effect the bearing has on the accuracy of the predictions. The TAMU Bearing No. 9 test data are closely matched by the predictions of the HYDROPAD code (except for the added mass terms), whereas significant differences exist between the data from the Rocketdyne 'Generic' bearing and the code predictions. The results suggest that some aspects of the fluid behavior in the shorter, higher-Reynolds-number 'Generic' bearing may not be modeled accurately in the codes. The ICYL code predictions for flowrate and direct stiffness approximately equal those of HYDROPAD; significant differences in cross-coupled stiffness and the damping terms were obtained relative to HYDROPAD and both sets of test data. Several observations are included concerning application of the ICYL code.
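
    For readers unfamiliar with rotordynamic coefficients: both codes and both test programs characterize the bearing through the linearized reaction-force model -F = K x + C x' + M x'', whose off-diagonal (cross-coupled) stiffness terms drive the destabilizing tangential force that direct damping must overcome. The sketch below evaluates that model over one circular whirl orbit with illustrative coefficient values (not the TAMU or Rocketdyne data):

    ```python
    import numpy as np

    # Linearized bearing reaction model: -F = K x + C x_dot + M x_ddot.
    # All coefficient values are illustrative, not test results.
    K = np.array([[2.0e7, 1.5e7],
                  [-1.5e7, 2.0e7]])   # direct (diagonal) and cross-coupled
                                      # (off-diagonal) stiffness, N/m
    C = np.array([[4.0e4, 5.0e3],
                  [-5.0e3, 4.0e4]])   # damping, N*s/m
    M = np.array([[10.0, 0.0],
                  [0.0, 10.0]])       # added mass, kg

    omega, radius = 2 * np.pi * 300, 2.5e-5   # whirl speed (rad/s), orbit (m)
    t = np.linspace(0.0, 2 * np.pi / omega, 200)
    x = radius * np.vstack([np.cos(omega * t), np.sin(omega * t)])
    v = radius * omega * np.vstack([-np.sin(omega * t), np.cos(omega * t)])
    a = -omega**2 * x

    F = -(K @ x + C @ v + M @ a)      # bearing reaction over one whirl orbit

    # Tangential component: cross-coupled stiffness pushes the whirl forward
    # (destabilizing); direct damping opposes it and dominates here.
    tangential = F[0] * (-np.sin(omega * t)) + F[1] * np.cos(omega * t)
    print(f"mean tangential force: {tangential.mean():+.1f} N")
    ```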

  11. Parametric study of potential early commercial power plants Task 3-A MHD cost analysis

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The development of costs for an MHD power plant and the comparison of these costs to those of a conventional coal-fired power plant are reported. The program is divided into three activities: (1) code-of-accounts review; (2) MHD/pulverized-coal power plant cost comparison; and (3) operating and maintenance cost estimates. The scope of each NASA code-of-accounts item was defined to assure that the recently completed Task 3 capital cost estimates are consistent with the code-of-accounts scope. Improving confidence in MHD plant capital cost estimates by establishing comparability with conventional pulverized-coal-fired (PCF) power plant systems is undertaken. The basis for estimating the MHD plant operating and maintenance costs of electricity is verified.

  12. Code comparison for accelerator design and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parsa, Z.

    1988-01-01

    We present a comparison between results obtained from standard accelerator physics codes used for the design and analysis of synchrotrons and storage rings, with the programs SYNCH, MAD, HARMON, PATRICIA, PATPET, BETA, DIMAD, MARYLIE and RACE-TRACK. In our analysis we have considered five lattices of various sizes, with large and small bend angles, including the AGS Booster (10° bend), RHIC (2.24°), SXLS, XLS (XUV ring with 45° bend) and X-RAY rings. The differences in the integration methods used and in the treatment of the fringe fields in these codes can lead to different results. The inclusion of nonlinear (e.g., dipole) terms may be necessary in these calculations, especially for a small ring. 12 refs., 6 figs., 10 tabs.

  13. Comparison of FDNS liquid rocket engine plume computations with SPF/2

    NASA Technical Reports Server (NTRS)

    Kumar, G. N.; Griffith, D. O., II; Warsi, S. A.; Seaford, C. M.

    1993-01-01

    Prediction of a plume's shape and structure is essential to the evaluation of base region environments. The JANNAF standard plume flowfield analysis code SPF/2 predicts plumes well, but cannot analyze base regions. Full Navier-Stokes CFD codes can calculate both zones; however, before they can be used, they must be validated. The CFD code FDNS3D (Finite Difference Navier-Stokes Solver) was used to analyze the single plume of a Space Transportation Main Engine (STME) and comparisons were made with SPF/2 computations. Both frozen and finite rate chemistry models were employed as well as two turbulence models in SPF/2. The results indicate that FDNS3D plume computations agree well with SPF/2 predictions for liquid rocket engine plumes.

  14. ImageJS: Personalized, participated, pervasive, and reproducible image bioinformatics in the web browser

    PubMed Central

    Almeida, Jonas S.; Iriabho, Egiebade E.; Gorrepati, Vijaya L.; Wilkinson, Sean R.; Grüneberg, Alexander; Robbins, David E.; Hackney, James R.

    2012-01-01

    Background: Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by themselves or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. Materials and Methods: ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Results: Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. Conclusions: The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without local “download and installation”. PMID:22934238

  15. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  16. ImageJS: Personalized, participated, pervasive, and reproducible image bioinformatics in the web browser.

    PubMed

    Almeida, Jonas S; Iriabho, Egiebade E; Gorrepati, Vijaya L; Wilkinson, Sean R; Grüneberg, Alexander; Robbins, David E; Hackney, James R

    2012-01-01

    Image bioinformatics infrastructure typically relies on a combination of server-side high-performance computing and client desktop applications tailored for graphic rendering. On the server side, matrix manipulation environments are often used as the back-end where deployment of specialized analytical workflows takes place. However, neither the server-side nor the client-side desktop solution, by themselves or combined, is conducive to the emergence of open, collaborative, computational ecosystems for image analysis that are both self-sustained and user driven. ImageJS was developed as a browser-based webApp, untethered from a server-side backend, by making use of recent advances in the modern web browser such as a very efficient compiler, high-end graphical rendering capabilities, and I/O tailored for code migration. Multiple versioned code hosting services were used to develop distinct ImageJS modules to illustrate its amenability to collaborative deployment without compromise of reproducibility or provenance. The illustrative examples include modules for image segmentation, feature extraction, and filtering. The deployment of image analysis by code migration is in sharp contrast with the more conventional, heavier, and less safe reliance on data transfer. Accordingly, code and data are loaded into the browser by exactly the same script tag loading mechanism, which offers a number of interesting applications that would be hard to attain with more conventional platforms, such as NIH's popular ImageJ application. The modern web browser was found to be advantageous for image bioinformatics in both the research and clinical environments. This conclusion reflects advantages in deployment scalability and analysis reproducibility, as well as the critical ability to deliver advanced computational statistical procedures to machines where access to sensitive data is controlled, that is, without local "download and installation".

  17. Final Report for "Implementation and Evaluation of Multigrid Linear Solvers into Extended Magnetohydrodynamic Codes for Petascale Computing"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Srinath Vadlamani; Scott Kruger; Travis Austin

    Extended magnetohydrodynamic (MHD) codes are used to model the large, slow-growing instabilities that are projected to limit the performance of the International Thermonuclear Experimental Reactor (ITER). The multiscale nature of the extended MHD equations requires an implicit approach. The current linear solvers needed for the implicit algorithm scale poorly because the resultant matrices are so ill-conditioned. A new solver is needed, especially one that scales to the petascale. The most successful scalable parallel solvers to date are multigrid solvers. Applying multigrid techniques to a set of equations whose fundamental modes are dispersive waves is a promising solution to CEMM problems. For Phase 1, we implemented multigrid preconditioners from the HYPRE project of the Center for Applied Scientific Computing at LLNL, via PETSc from the DOE SciDAC TOPS project, for the real matrix systems of the extended MHD code NIMROD, one of the primary modeling codes of the OFES-funded Center for Extended Magnetohydrodynamic Modeling (CEMM) SciDAC. We successfully applied the multigrid solvers to a fusion test problem that yields real matrix systems, and in the process learned about the details of NIMROD data structures and the difficulties of inverting NIMROD operators. The further success of this project will allow efficient use of future petascale computers at the National Leadership Facilities: Oak Ridge National Laboratory, Argonne National Laboratory, and the National Energy Research Scientific Computing Center. The project will be a collaborative effort between computational plasma physicists and applied mathematicians at Tech-X Corporation, applied mathematicians at Front Range Scientific Computations, Inc. (collaborators on the HYPRE project), and other computational plasma physicists involved with the CEMM project.
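
    In practice, the HYPRE-through-PETSc pairing described above amounts to selecting a hypre preconditioner on a PETSc Krylov solver. A minimal petsc4py sketch is shown below, assuming a PETSc build configured with hypre; the matrix is a toy 1D Laplacian, not one of NIMROD's ill-conditioned operators.

    ```python
    # Requires petsc4py and a PETSc build configured --with-hypre.
    from petsc4py import PETSc

    n = 1000
    A = PETSc.Mat().createAIJ([n, n], nnz=3)      # tridiagonal preallocation
    start, end = A.getOwnershipRange()
    for i in range(start, end):                   # assemble a 1D Laplacian
        A[i, i] = 2.0
        if i > 0:
            A[i, i - 1] = -1.0
        if i < n - 1:
            A[i, i + 1] = -1.0
    A.assemble()

    b = A.createVecRight(); b.set(1.0)
    x = A.createVecLeft()

    ksp = PETSc.KSP().create()
    ksp.setOperators(A)
    ksp.setType("gmres")
    pc = ksp.getPC()
    pc.setType("hypre")                                # hand the PC to HYPRE
    PETSc.Options()["pc_hypre_type"] = "boomeramg"     # algebraic multigrid
    ksp.setFromOptions()
    ksp.solve(b, x)
    print("iterations:", ksp.getIterationNumber())
    ```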

  18. Development and Verification of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    NASA Technical Reports Server (NTRS)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for surface-to-surface radiation exchange in complex geometries is critical. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute geometric view factors for radiation problems involving multiple surfaces. Verification of the code's radiation capabilities and results of a code-to-code comparison are presented. Finally, a demonstration case of a two-dimensional ablating cavity with enclosure radiation accounting for a changing geometry is shown.
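
    A geometric view factor can always be estimated by Monte Carlo ray casting, one of the standard approaches for enclosure radiation. The sketch below estimates the view factor between two coaxial parallel unit squares (geometry and sample count chosen for illustration; this is not CHAR's implementation) and can be checked against the tabulated analytic value of roughly 0.1998:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def view_factor_mc(side=1.0, gap=1.0, n=200_000):
        """Monte Carlo view factor between two coaxial parallel square plates.

        Rays leave random points on plate 1 (z=0) in cosine-weighted
        directions; the view factor is the fraction hitting plate 2 (z=gap).
        """
        # Uniform emission points on plate 1.
        p = rng.uniform(0.0, side, size=(n, 2))
        # Cosine-weighted hemisphere directions (sample a disk, project up).
        r = np.sqrt(rng.uniform(size=n))
        phi = rng.uniform(0.0, 2.0 * np.pi, size=n)
        dx, dy = r * np.cos(phi), r * np.sin(phi)
        dz = np.sqrt(1.0 - r**2)
        # Intersection with the plane z = gap.
        t = gap / dz
        hit_x = p[:, 0] + t * dx
        hit_y = p[:, 1] + t * dy
        hits = ((0.0 <= hit_x) & (hit_x <= side) &
                (0.0 <= hit_y) & (hit_y <= side))
        return hits.mean()

    print(f"F12 ~= {view_factor_mc():.4f}")   # analytic value ~0.1998
    ```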

  19. Towards seamless workflows in agile data science

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Robertson, J.

    2017-12-01

    Agile workflows are a response to projects with requirements that may change over time. They prioritise rapid and flexible responses to change, preferring to adapt to changes in requirements rather than predict them before a project starts. This suits the needs of research very well because research is inherently agile in its methodology. The adoption of agile methods has made collaborative data analysis much easier in a research environment fragmented across institutional data stores, HPC, personal and lab computers and more recently cloud environments. Agile workflows use tools that share a common worldview: in an agile environment, there may be more than one valid version of data, code or environment in play at any given time. All of these versions need references and identifiers. For example, a team of developers following the git-flow conventions (github.com/nvie/gitflow) may have several active branches, one for each strand of development. These workflows allow rapid and parallel iteration while maintaining identifiers pointing to individual snapshots of data and code and allowing rapid switching between strands. In contrast, the current focus of versioning in research data management is geared towards managing data for reproducibility and long-term preservation of the record of science. While both are important goals in the persistent curation domain of the institutional research data infrastructure, current tools emphasise planning over adaptation and can introduce unwanted rigidity by insisting on a single valid version or point of truth. In the collaborative curation domain of a research project, things are more fluid. However, there is no equivalent to the "versioning iso-surface" of the git protocol for the management and versioning of research data. At CSIRO we are developing concepts and tools for the agile management of software code and research data for virtual research environments, based on our experiences of actual data analytics projects in the geosciences. We use code management that allows researchers to interact with the code through tools like Jupyter Notebooks while data are held in an object store. Our aim is an architecture allowing seamless integration of code development, data management, and data processing in virtual research environments.

  20. On the collaborative design and simulation of space camera: STOP (structural/thermal/optical) analysis

    NASA Astrophysics Data System (ADS)

    Duan, Pengfei; Lei, Wenping

    2017-11-01

    A number of disciplines (mechanics, structures, thermal, and optics) are needed to design and build a space camera. Separate design models are normally constructed with each discipline's CAD/CAE tools. Design and analysis are conducted largely in parallel, subject to the requirements levied on each discipline, and technical interaction between the different disciplines is limited and infrequent. As a result, a unified view of the space camera design across discipline boundaries is not directly possible in this approach, and generating one would require a large, manual, and error-prone process. A collaborative environment built on an abstract model and performance templates allows engineering data and CAD/CAE results to be shared across these discipline boundaries within a common interface, so that it can support rapid multidisciplinary design and direct evaluation of optical performance under environmental loadings. A small interdisciplinary engineering team from the Beijing Institute of Space Mechanics and Electricity has recently conducted a structural/thermal/optical (STOP) analysis of a space camera with this collaborative environment. STOP analysis evaluates the changes in image quality that arise from structural deformations as the thermal environment of the camera changes throughout its orbit. STOP analyses were conducted for four different test conditions applied during final thermal vacuum (TVAC) testing of the payload on the ground. The STOP simulation process begins with importing an integrated CAD model of the camera geometry into the collaborative environment, within which: 1. independent thermal and structural meshes are generated; 2. the thermal mesh and relevant engineering data for material properties and thermal boundary conditions are used to compute temperature distributions at nodal points in both the thermal and structural meshes with Thermal Desktop, a COTS thermal design and analysis code; 3. thermally induced structural deformations of the camera are then evaluated in Nastran, an industry-standard code for structural design and analysis; 4. thermal and structural results are next imported into SigFit, another COTS tool that computes deformation and best-fit rigid-body displacements for the optical surfaces; 5. SigFit creates a modified optical prescription that is imported into CODE V for evaluation of optical performance impacts. The integrated STOP analysis was validated using TVAC test data. For the four TVAC tests, the relative errors between simulated and measured temperatures at the monitoring points were around 5%, and in some test conditions as low as 1%. For image quality (MTF), the relative error between simulation and test was 8.3% in the worst condition and below 5% in all others. This validation shows that the collaborative design and simulation environment can carry out an integrated STOP analysis of a space camera efficiently. Furthermore, the collaborative environment allows an interdisciplinary analysis that formerly might take several months to be completed in two or three weeks, which is well suited to concept evaluation in the early stages of a project.
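
    The essence of the STOP handoff is that a thermal result feeds a structural result, which feeds an optical result. The toy chain below makes that data flow concrete with invented numbers (CTE, spacer length, temperature change, and secondary magnification are illustrative, not the camera's values); the despace-to-defocus sensitivity (m^2 + 1) is the standard two-mirror telescope approximation.

    ```python
    # Toy STOP handoff: a thermal result (delta-T) drives a structural
    # result (spacer growth), which drives an optical result (defocus).
    alpha = 2.3e-5      # spacer CTE, 1/K (aluminium-like, illustrative)
    length = 0.50       # spacer between primary and secondary mirrors, m
    delta_t = 3.0       # bulk temperature change from the thermal model, K

    # Structural step: thermal expansion of the mirror spacing.
    dz = alpha * delta_t * length                 # despace, m

    # Optical step: for a simple two-mirror telescope, a secondary despace
    # dz moves the focal plane by roughly (m2_mag**2 + 1) * dz.
    m2_mag = 4.0                                  # secondary magnification
    defocus = (m2_mag**2 + 1.0) * dz
    print(f"despace {dz*1e6:.1f} um -> focal-plane shift {defocus*1e3:.3f} mm")
    ```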

  1. POPCORN: A comparison of binary population synthesis codes

    NASA Astrophysics Data System (ADS)

    Claeys, J. S. W.; Toonen, S.; Mennekens, N.

    2013-01-01

    We compare the results of three binary population synthesis codes to understand the differences in their results. As a first result we find that, when the assumptions are equalized, the results are similar. The main differences arise from differing physical inputs.

  2. Applicability of a Bonner Sphere technique for pulsed neutrons in a 120 GeV proton facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sanami, T.; Hagiwara, M.; Iwase, H.

    2008-02-01

    The data on neutron spectra and intensity behind shielding are important for the radiation safety design of high-energy accelerators, since neutrons are capable of penetrating thick shielding and activating materials. Corresponding particle transport codes--which involve physics models of neutron and other particle production, transport, and interaction--have been developed and are used world-wide [1-8]. The results of these codes have been validated through numerous comparisons with experimental results taken in simple geometries. For neutron generation and transport, several related experiments have been performed to measure neutron spectra, attenuation lengths and reaction rates behind shielding walls of various thicknesses and materials at energies up to several hundred MeV [9-11]. The data have been used to benchmark--and modify if needed--the simulation models and parameters in the codes, and serve as reference data for radiation safety design. To obtain such data above several hundred MeV, a Japan-Fermi National Accelerator Laboratory (FNAL) collaboration for shielding experiments was started in 2007, based on a suggestion from the specialist meeting on shielding, Shielding Aspects of Accelerators, Targets and Irradiation Facilities (SATIF), because very limited data are available in the high-energy region (see, for example, [12]). As part of this shielding experiment, a set of Bonner spheres (BS) was tested at the antiproton production target facility (pbar target station) at FNAL to obtain neutron spectra induced by a 120-GeV proton beam in concrete and iron shielding. Generally, use of an active detector around high-energy accelerators requires an improved readout to cope with the burst of secondary radiation, since the accelerator delivers an intense beam to a target in a short period after a relatively long acceleration period. In this paper, we employ BS for a spectrum measurement of neutrons that penetrate the shielding wall of the pbar target station at FNAL.

  3. The HydroServer Platform for Sharing Hydrologic Data

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Horsburgh, J. S.; Schreuders, K.; Maidment, D. R.; Zaslavsky, I.; Valentine, D. W.

    2010-12-01

    The CUAHSI Hydrologic Information System (HIS) is an internet-based system that supports sharing of hydrologic data. HIS consists of databases connected using the Internet through Web services, as well as software for data discovery, access, and publication. The HIS system architecture is comprised of servers for publishing and sharing data, a centralized catalog to support cross-server data discovery and a desktop client to access and analyze data. This paper focuses on HydroServer, the component developed for sharing and publishing space-time hydrologic datasets. A HydroServer is a computer server that contains a collection of databases, web services, tools, and software applications that allow data producers to store, publish, and manage the data from an experimental watershed or project site. HydroServer is designed to permit publication of data as part of a distributed national/international system, while still locally managing access to the data. We describe the HydroServer architecture and software stack, including tools for managing and publishing time series data for fixed-point monitoring sites as well as spatially distributed GIS datasets that describe a particular study area, watershed, or region. HydroServer adopts a standards-based approach to data publication, relying on accepted and emerging standards for data storage and transfer. The CUAHSI-developed HydroServer code is free, with community code development managed through the CodePlex open-source code repository and development system. There is some reliance on widely used commercial software for general-purpose and standard data publication capability. The sharing of data in a common format is one way to stimulate interdisciplinary research and collaboration. It is anticipated that the growing, distributed network of HydroServers will facilitate cross-site comparisons and large-scale studies that synthesize information from diverse settings, making the network as a whole greater than the sum of its parts in advancing hydrologic research. Details of the CUAHSI HIS can be found at http://his.cuahsi.org, and the HydroServer CodePlex site at http://hydroserver.codeplex.com.

  4. Distributed polar-coded OFDM based on Plotkin's construction for half duplex wireless communication

    NASA Astrophysics Data System (ADS)

    Umar, Rahim; Yang, Fengfan; Mughal, Shoaib; Xu, HongJun

    2018-07-01

    A Plotkin-based polar-coded orthogonal frequency division multiplexing (P-PC-OFDM) scheme is proposed and its bit error rate (BER) performance over additive white Gaussian noise (AWGN), frequency-selective Rayleigh, Rician and Nakagami-m fading channels is evaluated. The considered Plotkin construction possesses a parallel split in its structure, which motivated us to extend the proposed P-PC-OFDM scheme to a coded cooperative scenario. As the relay's effective collaboration has always been pivotal in the design of cooperative communication, an efficient criterion for selecting the information bits is incorporated at the relay node. To assess the BER performance of the proposed cooperative scheme, we also upgrade the conventional polar-coded cooperative scheme to OFDM as an appropriate benchmark. The Monte Carlo simulation results reveal that the proposed Plotkin-based polar-coded cooperative OFDM scheme convincingly outperforms the conventional polar-coded cooperative OFDM scheme by 0.5-0.6 dB over the AWGN channel. This gain in BER performance is made possible by the bit-selection criterion and the joint successive cancellation decoding adopted at the relay and destination nodes, respectively. Furthermore, the proposed coded cooperative schemes outperform their corresponding non-cooperative schemes by a gain of 1 dB under identical conditions.
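
    Plotkin's construction combines two codes of length n into one of length 2n by forming (u, u+v); applied recursively it yields the polar-code transform, which is where the parallel split mentioned above comes from. A minimal sketch with toy component codes (chosen for illustration):

    ```python
    import numpy as np

    def plotkin(u_code, v_code):
        """Plotkin's (u, u+v) construction: combine every codeword u of C1
        with every codeword v of C2 into (u, u XOR v), doubling the length.
        Polar codes arise from applying this recursively."""
        return np.array([np.concatenate([u, (u + v) % 2])
                         for u in u_code for v in v_code])

    # Length-2 component codes: C1 = all pairs (rate 1), C2 = repetition.
    c1 = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
    c2 = np.array([[0, 0], [1, 1]])
    c3 = plotkin(c1, c2)
    print(c3)       # 8 codewords of length 4, min distance min(2*d1, d2) = 2
    ```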

  5. Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory

    PubMed Central

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-01-01

    Objective To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625

  6. Qualitative data analysis for health services research: developing taxonomy, themes, and theory.

    PubMed

    Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J

    2007-08-01

    To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. DATA SOURCES AND DESIGN: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines.

  7. Implementation of a Blowing Boundary Condition in the LAURA Code

    NASA Technical Reports Server (NTRS)

    Thompson, Richard A.; Gnoffo, Peter A.

    2008-01-01

    Preliminary steps toward modeling a coupled ablation problem using a finite-volume Navier-Stokes code (LAURA) are presented in this paper. Implementation of a surface boundary condition with mass transfer (blowing) is described followed by verification and validation through comparisons with analytic results and experimental data. Application of the code to a carbon-nosetip ablation problem is demonstrated and the results are compared with previously published data. It is concluded that the code and coupled procedure are suitable to support further ablation analyses and studies.
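
    A blowing boundary condition of the general kind described here imposes a wall-normal injection velocity derived from the prescribed mass flux. The ghost-cell sketch below is a minimal illustration of that idea under common finite-volume conventions; it is not LAURA's actual implementation, and the numbers in the example are invented.

    ```python
    def blowing_ghost_state(rho_wall, mdot_per_area, interior_v_n, interior_v_t):
        """Ghost-cell velocities for a mass-transfer (blowing) wall.

        A minimal sketch, not LAURA's implementation: the prescribed
        ablation mass flux fixes the wall-normal injection speed, and the
        ghost state is chosen so the face average matches the wall values.
        """
        v_n_wall = mdot_per_area / rho_wall           # injection speed, m/s
        ghost_v_n = 2.0 * v_n_wall - interior_v_n     # face average = wall value
        ghost_v_t = -interior_v_t                     # no-slip tangentially
        return ghost_v_n, ghost_v_t

    # Example: char ablation at 0.05 kg/(m^2 s) into low-density wall gas.
    print(blowing_ghost_state(rho_wall=0.02, mdot_per_area=0.05,
                              interior_v_n=2.4, interior_v_t=150.0))
    ```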

  8. A Comparison of MOOC Development and Delivery Approaches

    ERIC Educational Resources Information Center

    Smith, Neil; Caldwell, Helen; Richards, Mike; Bandara, Arosha

    2017-01-01

    Purpose: The purpose of this paper is to present a comparison of two ways of developing and delivering massive open online courses (MOOCs). One was developed by The Open University in collaboration with FutureLearn; the other was developed independently by a small team at the University of Northampton. Design/methodology/approach: The different…

  9. The Role of Collaborations in Sustaining an Evidence-Based Intervention to Reduce Child Neglect

    PubMed Central

    Green, Amy E.; Trott, Elise; Willging, Cathleen E.; Finn, Natalie K.; Ehrhart, Mark G.; Aarons, Gregory A.

    2016-01-01

    Child neglect is the most prevalent form of child maltreatment and represents 79.5% of open child-welfare cases. A recent study found the evidence-based intervention (EBI) SafeCare® (SC) to significantly reduce child neglect recidivism rates. To fully capitalize on the effectiveness of such EBIs, service systems must engage in successful implementation and sustainment; however, little is known regarding what factors influence EBI sustainment. Collaborations among stakeholders are suggested as a means for facilitating EBI implementation and sustainment. This study combines descriptive quantitative survey data with qualitative interview and focus group findings to examine the role of collaboration within the context of public-private partnerships in 11 child welfare systems implementing SC. Participants included administrators of government child welfare systems and community-based organizations, as well as supervisors, coaches, and home visitors of the SC program. Sites were classified as fully-, partially-, and non-sustaining based on implementation fidelity. One-way analysis of variance was used to examine differences in stakeholder reported Effective Collaboration scores across fully-sustaining, partially-sustaining, and non-sustaining sites. Qualitative transcripts were analyzed via open and focused coding to identify the commonality, diversity, and complexity of collaborations involved in implementing and sustaining SC. Fully-sustaining sites reported significantly greater levels of effective collaboration than non-sustaining sites. Key themes described by SC stakeholders included shared vision, building on existing relationships, academic support, problem solving and resource sharing, and maintaining collaborations over time. Both quantitative and qualitative results converge in highlighting the importance of effective collaboration in EBI sustainment in child welfare service systems. PMID:26712422

  10. Viewpoint: a comparison of cause-of-injury coding in U.S. military and civilian hospitals.

    PubMed

    Amoroso, P J; Bell, N S; Smith, G S; Senier, L; Pickett, D

    2000-04-01

    Complete and accurate coding of injury causes is essential to the understanding of injury etiology and to the development and evaluation of injury-prevention strategies. While civilian hospitals use ICD-9-CM external cause-of-injury codes, military hospitals use codes derived from the NATO Standardization Agreement (STANAG) 2050. The STANAG uses two separate variables to code injury cause. The Trauma code uses a single digit with 10 possible values to identify the general class of injury as battle injury, intentionally inflicted nonbattle injury, or unintentional injury. The Injury code is used to identify cause or activity at the time of the injury. For a subset of the Injury codes, the last digit is modified to indicate place of occurrence. This simple system contains fewer than 300 basic codes, including many that are specific to battle- and sports-related injuries not coded well by either the ICD-9-CM or the draft ICD-10-CM. However, while falls, poisonings, and injuries due to machinery and tools are common causes of injury hospitalizations in the military, few STANAG codes correspond to these events. Intentional injuries in general and sexual assaults in particular are also not well represented in the STANAG. Because the STANAG does not map directly to the ICD-9-CM system, quantitative comparisons between military and civilian data are difficult. The ICD-10-CM, which will be implemented in the United States sometime after 2001, expands considerably on its predecessor, ICD-9-CM, and provides more specificity and detail than the STANAG. With slight modification, it might become a suitable replacement for the STANAG.

  11. Use of the Collaborative Optimization Architecture for Launch Vehicle Design

    NASA Technical Reports Server (NTRS)

    Braun, R. D.; Moore, A. A.; Kroo, I. M.

    1996-01-01

    Collaborative optimization is a new design architecture specifically created for large-scale distributed-analysis applications. In this approach, the problem is decomposed into a user-defined number of subspace optimization problems that are driven towards interdisciplinary compatibility and the appropriate solution by a system-level coordination process. This decentralized design strategy allows domain-specific issues to be accommodated by disciplinary analysts, while requiring interdisciplinary decisions to be reached by consensus. The present investigation focuses on application of the collaborative optimization architecture to the multidisciplinary design of a single-stage-to-orbit launch vehicle. Vehicle design, trajectory, and cost issues are directly modeled. Posed to suit the collaborative architecture, the design problem is characterized by 5 design variables and 16 constraints. Numerous collaborative solutions are obtained. Comparison of these solutions demonstrates the influence which an a priori ascent-abort criterion has on development cost. Similarly, objective-function selection is discussed, demonstrating the difference between minimum weight and minimum cost concepts. The operational advantages of the collaborative optimization…
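
    The bi-level structure described above can be illustrated with a toy sketch (an illustration only, not the launch-vehicle problem): a system-level optimizer proposes shared design targets z, each disciplinary subproblem minimizes its squared discrepancy from those targets subject to its own local constraint, and the system level drives every discrepancy J_i(z) to zero while minimizing the objective. The two disciplines and the objective below are made up.

      # Toy sketch of the collaborative optimization architecture.
      import numpy as np
      from scipy.optimize import minimize

      def subspace_discrepancy(z_target, local_constraint):
          """Disciplinary optimizer: match the system targets as closely as
          the local constraint allows; return the residual discrepancy."""
          res = minimize(lambda x: np.sum((x - z_target) ** 2),
                         x0=np.array(z_target, dtype=float),
                         constraints=[{"type": "ineq", "fun": local_constraint}])
          return res.fun  # J_i(z) = 0 iff the targets are locally feasible

      # Two hypothetical disciplines with different feasibility requirements.
      disciplines = [
          lambda x: x[0] - 1.0,         # discipline 1 requires z1 >= 1
          lambda x: 3.0 - x[0] - x[1],  # discipline 2 requires z1 + z2 <= 3
      ]

      solution = minimize(
          lambda z: np.sum(z ** 2),     # stand-in system objective (e.g. weight)
          x0=np.array([2.0, 2.0]),
          constraints=[{"type": "eq",
                        "fun": lambda z, c=c: subspace_discrepancy(z, c)}
                       for c in disciplines],
      )
      print(solution.x)  # expected near [1, 0]: targets both disciplines accept

    The key design feature of the architecture shows up in the constraint list: the system level never sees the disciplines' internals, only their discrepancy values.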

  12. EVALUATION OF AN INDIVIDUALLY PACED COURSE FOR AIRBORNE RADIO CODE OPERATORS. FINAL REPORT.

    ERIC Educational Resources Information Center

    Baldwin, Robert O.; Johnson, Kirk A.

    In this study, comparisons were made between an individually paced version of the Airborne Radio Code Operator (ARCO) course and two versions of the course in which the students progressed at a fixed pace. The ARCO course is a Class C school in which the student learns to send and receive military messages using the International Morse Code. The…

  13. The Contract Management Body of Knowledge: A Comparison of Contracting Competencies

    DTIC Science & Technology

    2013-12-01

    Glossary excerpt: SME, subject matter expert; SOW, statement of work; TINA, Truth in Negotiations Act; UCC, Uniform Commercial Code; WBS, work breakdown structure. Text excerpts: "…documents whose terms and conditions are legally enforceable. Sources of law and guidance covered include the Uniform Commercial Code (UCC) and the Federal Acquisition Regulation (FAR), as well as various other laws pertaining to both…"

  14. Integrated Modeling of the Battlespace Environment

    DTIC Science & Technology

    2010-10-01

    Report excerpts: the ESMF components include the Hakamada-Akasofu-Fry version 2 (HAFv2) solar wind model and the Global Assimilation of Ionospheric Measurements (GAIM) forecast model; ground-truth measurements are used for comparison with the solar wind predictions. The GAIMv2.3 effort…

  15. Life in a large scientific collaboration

    NASA Astrophysics Data System (ADS)

    Pravahan, Rishiraj

    2011-03-01

    I will be talking about life in a large scientific collaboration. The dynamics of dealing with many groups, collaborating with people from various linguistic and cultural origins, can be a daunting experience. However, it is exactly this diversity of culture and learning that can make it an invigorating journey. You need to find your place in terms of professional contributions as well as personal liaisons to be productive and innovative in a large work culture. Scientific problems today are not solved by one person hunched over an old notebook. They are solved by sharing computer codes, experimental infrastructure, and your questions over coffee with your colleagues. A willingness to take in and impart healthy criticism is a must for productive work. I will discuss all these aspects, as well as issues that may arise from adjusting to a new country, its customs, food, transportation, or health-care system. The purpose of the talk is to familiarize you with what I have learned through my past five years at CERN working in the ATLAS collaboration.

  16. Interprofessional education day - an evaluation of an introductory experience for first-year students.

    PubMed

    Singer, Zachary; Fung, Kevin; Lillie, Elaine; McLeod, Jennifer; Scott, Grace; You, Peng; Helleman, Krista

    2018-05-01

    Interprofessional health care teams have been shown to improve patient safety and reduce medical errors, among other benefits. Introducing interprofessional concepts to students in full day events is an established model that allows students to learn together. Our group developed an academic day for first-year students devoted to an introductory interprofessional education (IPE) experience, 'IPE Day'. In total, 438 students representing medicine, dentistry, pharmacy and optometry gathered together, along with 25 facilitators, for IPE Day. Following the day's program, students completed the evaluation consisting of the Interprofessional Collaborative Competencies Attainment Survey and open-ended questions. Narrative responses were analyzed for content and coded using the Canadian Interprofessional Health Collaborative competency domains. Three hundred and eight evaluations were completed. Students reported increased self-ratings of competency across all 20 items (p < 0.05). Their comments were organized into the six domains: interprofessional communication, collaborative leadership, role clarification, patient-centred care, conflict resolution, and team functioning. Based on these findings, we suggest that this IPE activity may be useful for improving learner perceptions about their interprofessional collaborative practice competence.

  17. CDinFusion – Submission-Ready, On-Line Integration of Sequence and Contextual Data

    PubMed Central

    Hankeln, Wolfgang; Wendel, Norma Johanna; Gerken, Jan; Waldmann, Jost; Buttigieg, Pier Luigi; Kostadinov, Ivaylo; Kottmann, Renzo; Yilmaz, Pelin; Glöckner, Frank Oliver

    2011-01-01

    State of the art (DNA) sequencing methods applied in “Omics” studies grant insight into the ‘blueprints’ of organisms from all domains of life. Sequencing is carried out around the globe and the data is submitted to the public repositories of the International Nucleotide Sequence Database Collaboration. However, the context in which these studies are conducted often gets lost, because experimental data, as well as information about the environment are rarely submitted along with the sequence data. If these contextual or metadata are missing, key opportunities of comparison and analysis across studies and habitats are hampered or even impossible. To address this problem, the Genomic Standards Consortium (GSC) promotes checklists and standards to better describe our sequence data collection and to promote the capturing, exchange and integration of sequence data with contextual data. In a recent community effort the GSC has developed a series of recommendations for contextual data that should be submitted along with sequence data. To support the scientific community to significantly enhance the quality and quantity of contextual data in the public sequence data repositories, specialized software tools are needed. In this work we present CDinFusion, a web-based tool to integrate contextual and sequence data in (Multi)FASTA format prior to submission. The tool is open source and available under the Lesser GNU Public License 3. A public installation is hosted and maintained at the Max Planck Institute for Marine Microbiology at http://www.megx.net/cdinfusion. The tool may also be installed locally using the open source code available at http://code.google.com/p/cdinfusion. PMID:21935468

  18. Evaluation of H.264 and H.265 full motion video encoding for small UAS platforms

    NASA Astrophysics Data System (ADS)

    McGuinness, Christopher D.; Walker, David; Taylor, Clark; Hill, Kerry; Hoffman, Marc

    2016-05-01

    Of all the steps in the image acquisition and formation pipeline, compression is the only process that degrades image quality. A selected compression algorithm succeeds or fails to provide sufficient quality at the requested compression rate depending on how well the algorithm is suited to the input data. Applying an algorithm designed for one type of data to a different type often results in poor compression performance. This is mostly the case when comparing the performance of H.264, designed for standard definition data, to HEVC (High Efficiency Video Coding), which the Joint Collaborative Team on Video Coding (JCT-VC) designed for high-definition data. This study focuses on evaluating how HEVC compares to H.264 when compressing data from small UAS platforms. To compare the standards directly, we assess two open-source traditional software solutions: x264 and x265. These software-only comparisons allow us to establish a baseline of how much improvement can generally be expected of HEVC over H.264. Then, specific solutions leveraging different types of hardware are selected to understand the limitations of commercial-off-the-shelf (COTS) options. Algorithmically, regardless of the implementation, HEVC is found to provide similar quality video as H.264 at 40% lower data rates for video resolutions greater than 1280x720, roughly 1 Megapixel (MPx). For resolutions less than 1MPx, H.264 is an adequate solution though a small (roughly 20%) compression boost is earned by employing HEVC. New low cost, size, weight, and power (CSWAP) HEVC implementations are being developed and will be ideal for small UAS systems.
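
    As a rough way to reproduce the flavor of this comparison (a sketch only: it assumes an ffmpeg build with libx264 and libx265, and the file name and CRF value are illustrative, not the study's settings):

      # Encode the same clip with x264 and x265, then compare file sizes.
      import os
      import subprocess

      SRC = "uas_clip.mp4"  # hypothetical source clip

      for codec, out in (("libx264", "clip_h264.mp4"),
                         ("libx265", "clip_h265.mp4")):
          subprocess.run(
              ["ffmpeg", "-y", "-i", SRC, "-c:v", codec, "-crf", "28", out],
              check=True,
          )

      size_264 = os.path.getsize("clip_h264.mp4")
      size_265 = os.path.getsize("clip_h265.mp4")
      print(f"HEVC/H.264 size ratio at matched CRF: {size_265 / size_264:.2f}")

    Note that a fixed CRF is not a matched-quality comparison across codecs; a study like this one matches quality (for example by PSNR or SSIM) and then compares the resulting bit rates.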

  19. University-Industry Collaboration in China and the USA: A Bibliometric Comparison.

    PubMed

    Zhou, Ping; Tijssen, Robert; Leydesdorff, Loet

    2016-01-01

    In this study, university-industry collaborations in China and the USA are analyzed in terms of co-authored publications indexed in the Web of Science (WoS). Results show a wide gap between China and the USA: Chinese universities are much less active in collaborations with industry in terms of either publication productivity or collaboration intensity. In selecting local and foreign industrial partners, however, more variation exists among Chinese universities than among US universities. The US system is domestically oriented more than that of China. In the USA, the intensity of university-industry collaboration is determined by research quality, whereas in China this is not the case. In both China and the USA, distance is not critical for the establishment of domestic university-industry collaboration. A high correlation is found between productivity indicators including total publications and university-industry co-authored publications. However, the productivity indicators are less correlated with the intensity of university-industry collaboration. Large research universities with strong ties to domestic industry play critical roles in both national publication systems.

  20. University-Industry Collaboration in China and the USA: A Bibliometric Comparison

    PubMed Central

    2016-01-01

    In this study, university-industry collaborations in China and the USA are analyzed in terms of co-authored publications indexed in the Web of Science (WoS). Results show a wide gap between China and the USA: Chinese universities are much less active in collaborations with industry in terms of either publication productivity or collaboration intensity. In selecting local and foreign industrial partners, however, more variation exists among Chinese universities than among US universities. The US system is domestically oriented more than that of China. In the USA, the intensity of university-industry collaboration is determined by research quality, whereas in China this is not the case. In both China and the USA, distance is not critical for the establishment of domestic university-industry collaboration. A high correlation is found between productivity indicators including total publications and university-industry co-authored publications. However, the productivity indicators are less correlated with the intensity of university-industry collaboration. Large research universities with strong ties to domestic industry play critical roles in both national publication systems. PMID:27832084

  1. Web Service Model for Plasma Simulations with Automatic Post Processing and Generation of Visual Diagnostics*

    NASA Astrophysics Data System (ADS)

    Exby, J.; Busby, R.; Dimitrov, D. A.; Bruhwiler, D.; Cary, J. R.

    2003-10-01

    We present our design and initial implementation of a web service model for running particle-in-cell (PIC) codes remotely from a web browser interface. PIC codes have grown significantly in complexity and now often require parallel execution on multiprocessor computers, which in turn requires sophisticated post-processing and data analysis. A significant amount of time and effort is required for a physicist to develop all the necessary skills, at the expense of actually doing research. Moreover, parameter studies with a computationally intensive code justify the systematic management of results with an efficient way to communicate them among a group of remotely located collaborators. Our initial implementation uses the OOPIC Pro code [1], Linux, Apache, MySQL, Python, and PHP. The Interactive Data Language is used for visualization. [1] D.L. Bruhwiler et al., Phys. Rev. ST-AB 4, 101302 (2001). * This work is supported by DOE grant # DE-FG02-03ER83857 and by Tech-X Corp. ** Also University of Colorado.

  2. Progress with the COGENT Edge Kinetic Code: Implementing the Fokker-Planck Collision Operator

    DOE PAGES

    Dorf, M. A.; Cohen, R. H.; Dorr, M.; ...

    2014-06-20

    Here, COGENT is a continuum gyrokinetic code for edge plasma simulations being developed by the Edge Simulation Laboratory collaboration. The code is distinguished by application of a fourth-order finite-volume (conservative) discretization, and mapped multiblock grid technology to handle the geometric complexity of the tokamak edge. The distribution function F is discretized in v∥ – μ (parallel velocity – magnetic moment) velocity coordinates, and the code presently solves an axisymmetric full-f gyro-kinetic equation coupled to the long-wavelength limit of the gyro-Poisson equation. COGENT capabilities are extended by implementing the fully nonlinear Fokker-Planck operator to model Coulomb collisions in magnetized edge plasmas. The corresponding Rosenbluth potentials are computed by making use of a finite-difference scheme and multipole-expansion boundary conditions. Details of the numerical algorithms and results of the initial verification studies are discussed. (© 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
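
    For reference, the Rosenbluth potentials mentioned above are the standard velocity-space integrals of the Rosenbluth-MacDonald-Judd form of the Fokker-Planck operator (standard definitions, with species labels and constants suppressed):

      \[
      H(\mathbf{v}) = \int \frac{f(\mathbf{v}')}{|\mathbf{v}-\mathbf{v}'|}\,d^3v', \qquad
      G(\mathbf{v}) = \int f(\mathbf{v}')\,|\mathbf{v}-\mathbf{v}'|\,d^3v',
      \]
      \[
      \nabla_v^2 H = -4\pi f, \qquad \nabla_v^2 G = 2H.
      \]

    The Poisson-type relations in the second line are what make the finite-difference solve with multipole-expansion boundary conditions described in the abstract a natural approach.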

  3. Comparison of a 3-D GPU-Assisted Maxwell Code and Ray Tracing for Reflectometry on ITER

    NASA Astrophysics Data System (ADS)

    Gady, Sarah; Kubota, Shigeyuki; Johnson, Irena

    2015-11-01

    Electromagnetic wave propagation and scattering in magnetized plasmas are important diagnostics for high temperature plasmas. 1-D and 2-D full-wave codes are standard tools for measurements of the electron density profile and fluctuations; however, ray tracing results have shown that beam propagation in tokamak plasmas is inherently a 3-D problem. The GPU-Assisted Maxwell Code utilizes the FDTD (Finite-Difference Time-Domain) method for solving the Maxwell equations with the cold plasma approximation in a 3-D geometry. Parallel processing with GPGPU (General-Purpose computing on Graphics Processing Units) is used to accelerate the computation. Previously, we reported on initial comparisons of the code results to 1-D numerical and analytical solutions, where the size of the computational grid was limited by the on-board memory of the GPU. In the current study, this limitation is overcome by using domain decomposition and an additional GPU. As a practical application, this code is used to study the current design of the ITER Low Field Side Reflectometer (LSFR) for the Equatorial Port Plug 11 (EPP11). A detailed examination of Gaussian beam propagation in the ITER edge plasma will be presented, as well as comparisons with ray tracing. This work was made possible by funding from the Department of Energy for the Summer Undergraduate Laboratory Internship (SULI) program. This work is supported by the US DOE Contract No.DE-AC02-09CH11466 and DE-FG02-99-ER54527.

  4. A comparison of IBC with 1997 UBC for modal response spectrum analysis in standard-occupancy buildings

    NASA Astrophysics Data System (ADS)

    Nahhas, Tariq M.

    2011-03-01

    This paper presents a comparison of the seismic forces generated from a Modal Response Spectrum Analysis (MRSA) by applying the provisions of two building codes, the 1997 Uniform Building Code (UBC) and the 2000-2009 International Building Code (IBC), to the most common ordinary residential buildings of standard occupancy. Considering IBC as the state of the art benchmark code, the primary concern is the safety of buildings designed using the UBC as compared to those designed using the IBC. A sample of four buildings with different layouts and heights was used for this comparison. Each of these buildings was assumed to be located at four different geographical sample locations arbitrarily selected to represent various earthquake zones on a seismic map of the USA, and was subjected to code-compliant response spectrum analyses for all sample locations and for five different soil types at each location. Response spectrum analysis was performed using the ETABS software package. For all the cases investigated, the UBC was found to be significantly more conservative than the IBC. The UBC design response spectra have higher spectral accelerations, and as a result, the response spectrum analysis provided a much higher base shear and moment in the structural members as compared to the IBC. The conclusion is that ordinary office and residential buildings designed using UBC 1997 are considered to be overdesigned, and therefore they are quite safe even according to the IBC provisions.
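
    For context, the IBC design spectral accelerations underlying such comparisons are built from mapped accelerations and site coefficients (standard ASCE 7/IBC notation; the UBC 1997 counterparts are defined differently):

      \[
      S_{MS} = F_a S_S, \qquad S_{M1} = F_v S_1, \qquad
      S_{DS} = \tfrac{2}{3}\,S_{MS}, \qquad S_{D1} = \tfrac{2}{3}\,S_{M1},
      \]

    where S_S and S_1 are the mapped short-period and 1-second spectral accelerations and F_a, F_v are the site (soil) coefficients; this dependence is why the comparison must be repeated across locations and soil types.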

  5. A Deterministic Transport Code for Space Environment Electrons

    NASA Technical Reports Server (NTRS)

    Nealy, John E.; Chang, C. K.; Norman, Ryan B.; Blattnig, Steve R.; Badavi, Francis F.; Adamczyk, Anne M.

    2010-01-01

    A deterministic computational procedure has been developed to describe transport of space environment electrons in various shield media. This code is an upgrade and extension of an earlier electron code. Whereas the former code was formulated on the basis of parametric functions derived from limited laboratory data, the present code utilizes well established theoretical representations to describe the relevant interactions and transport processes. The shield material specification has been made more general, as have the pertinent cross sections. A combined mean free path and average trajectory approach has been used in the transport formalism. Comparisons with Monte Carlo calculations are presented.

  6. Overview of the H.264/AVC video coding standard

    NASA Astrophysics Data System (ADS)

    Luthra, Ajay; Topiwala, Pankaj N.

    2003-11-01

    H.264/MPEG-4 AVC is the latest coding standard jointly developed by the Video Coding Experts Group (VCEG) of ITU-T and Moving Picture Experts Group (MPEG) of ISO/IEC. It uses state of the art coding tools and provides enhanced coding efficiency for a wide range of applications including video telephony, video conferencing, TV, storage (DVD and/or hard disk based), streaming video, digital video creation, digital cinema and others. In this paper an overview of this standard is provided. Some comparisons with the existing standards, MPEG-2 and MPEG-4 Part 2, are also provided.

  7. Comparison of USDA Forest Service and stakeholder motivations and experiences in collaborative federal forest governance in the Western United States

    Treesearch

    Emily Jane Davis; Eric M. White; Lee K. Cerveny; David Seesholtz; Meagan L. Nuss; Donald R. Ulrich

    2017-01-01

    In the United States, over 191 million acres of land is managed by the United States Department of Agriculture Forest Service, a federal government agency. In several western U.S. states, organized collaborative groups have become a de facto governance approach to providing sustained input on management decisions on much public land. This is most extensive in Oregon,...

  8. A Comparison of Correctional Adult Educators and Formal Adult Educators in Terms of Their Expressed Beliefs in the Collaborative Teaching Mode. Theory and Methods of Adult Education.

    ERIC Educational Resources Information Center

    Sua, Dangbe Wuo

    A study compared correctional adult educators and formal adult educators in terms of their expressed beliefs in the collaborative teaching mode as measured by the Principles of Adult Learning Scale. The sample consisted of 8 correctional adult educators from the Lake Correctional Institution and 10 adult education teachers from the Manatee Area…

  9. High Temperature Ultrasonic Transducers : Material Selection and Testing

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph; Bruno, Alessandro

    2012-01-01

    The task of my two-month internship was to test different materials for building a high-temperature transducer, to develop prototypes and test their performance, to assess the reliability of commercial products rated for such temperatures, and to collaborate in developing the signal-processing code to measure condensed water levels.

  10. The Role of Scaffolding in CSCL in General and in Specific Environments

    ERIC Educational Resources Information Center

    Verdú, N.; Sanuy, J.

    2014-01-01

    This paper aims to analyse if virtual forums set up in an environment specifically designed to improve collaborative learning can effectively influence students' discourse quality and learning when compared with those forums set up in a general environment. Following a coding schema based upon the set of scaffolds offered in the Knowledge…

  11. Aquaporin 2 of Rhipicephalus (Boophilus) microplus as a potential target to control ticks and tick-borne parasites

    USDA-ARS?s Scientific Manuscript database

    In a collaboration with Washington State University and ARS-Pullman, WA researchers, we identified and sequenced a 1,059 base pair Rhipicephalus microplus transcript that contained the coding region for a water channel protein, Aquaporin 2 (RmAQP2). The clone sequencing resulted in the production of...

  12. NREL: International Activities - U.S.-China Renewable Energy Partnership

    Science.gov Websites

    Website excerpts: Solar PV and TC88 Wind working groups. Collaboration on innovative business models and financing solutions for solar PV deployment. Current projects: recommendations for photovoltaic (PV) and wind grid code updates; micrositing; new energy…

  13. The Code Red Project: Engaging Communities in Health System Change in Hamilton, Canada

    ERIC Educational Resources Information Center

    DeLuca, Patrick F.; Buist, Steve; Johnston, Neil

    2012-01-01

    The communication of determinants of health and health outcomes, normally executed through academic channels, often fails to reach lay audiences. In April of 2010, the results of a collaboration between academe and mass media were published in the Hamilton Spectator, one of Canada's 10 largest English-language daily newspapers, as a 7-day series. The…

  14. Multiobjective Collaborative Optimization of Systems of Systems

    DTIC Science & Technology

    2005-06-01

    Front-matter excerpts: Appendix K, HSC Model and Optimization Description; Appendix L, HSC Optimization Code; Table 6, System Variables of FPF Data Set Showing Minimal HSC Impact; chapter outline covering data analysis, the ITS model (App. I, J), the HSC model (App. K, L, M, N), conclusions, and future work.

  15. Acetylcholinesterase 1 in populations of organophosphate resistant North American strains of the cattle tick, Rhipicephalus microplus (Acari: Ixodidae)

    USDA-ARS?s Scientific Manuscript database

    In a collaboration with Purdue University researchers, we sequenced a 143,606 base pair Rhipicephalus microplus BAC library clone that contained the coding region for acetylcholinesterase 1 (AChE1). Sequencing was by Sanger protocols and the final assembly resulted in 15 contigs of varying length, e...

  16. Collaborative Research Program on Advanced Metals and Ceramics for Armor and Anti-Armor Applications Dynamic Behavior of Non-Crystalline and Crystalline Metallic Systems

    DTIC Science & Technology

    2006-09-01

    Report excerpts: "…compression, including real-time cinematography of failure under dynamic compression, was evaluated. The results (figure 10) clearly show that the failure…" "…state of the art of simulations of dynamic failure and damage mechanisms. An explicit dynamic parallel code has been developed to track damage mechanisms in the…"

  17. Asynchronous Training in Pharmaceutical Manufacturing: A Model for University and Industrial Collaboration

    ERIC Educational Resources Information Center

    Elliot, Norbert; Haggerty, Blake; Foster, Mary; Spak, Gale

    2008-01-01

    The present study documents the results of a 17-month program to train Cardinal Health Pharmaceutical Technology Services (PTS) employees in an innovative model that combines investigative and writing techniques. Designed to address the Code of Federal Regulations (CFR) for the United States Food and Drug Administration (FDA), the program is a…

  18. The Pupil Nondiscrimination Guidelines for Athletics. Implementing Section 118.13 of the Wisconsin Statutes and PI 9 of the Wisconsin Administrative Code.

    ERIC Educational Resources Information Center

    Wisconsin Interscholastic Athletic Association.

    These guidelines explaining state pupil nondiscrimination requirements in interscholastic athletics are the result of a collaboration between the Wisconsin Department of Public Instruction and the Wisconsin Interscholastic Athletic Association (WIAA). The guide is designed to help schools fully implement Wisconsin's pupil nondiscrimination…

  19. Teacher Professional Development through a Collaborative Curriculum Project--An Example of TPACK in Maine

    ERIC Educational Resources Information Center

    Allan, Walter C.; Erickson, Jeryl L.; Brookhouse, Phil; Johnson, Judith L.

    2010-01-01

    Maine's one-to-one laptop program provides an ideal opportunity to explore conditions that optimize teacher integration of technology-focused curriculum into the classroom. EcoScienceWorks (ESW) is an ecology curriculum that includes targeted simulations and a code block programming challenge developed through an NSF-ITEST grant. The project was…

  20. A Boundary Condition for Simulation of Flow Over Porous Surfaces

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Bonhaus, Daryl L.; Vatsa, Veer N.; Bauer, Steven X. S.; Tinetti, Ana F.

    2001-01-01

    A new boundary condition is presented for simulating the flow over passively porous surfaces. The model builds on the prior work of R.H. Bush to eliminate the need for constructing a grid within an underlying plenum, thereby simplifying the numerical modeling of passively porous flow control systems and reducing computation cost. Code experts for two structured-grid flow solvers, TLNS3D and CFL3D, and one unstructured solver, USM3Dns, collaborated with an experimental porosity expert to develop the model and implement it into their respective codes. Results presented for the three codes on a slender forebody with circumferential porosity and a wing with leading-edge porosity demonstrate good agreement with experimental data and a remarkable ability to predict the aggregate aerodynamic effects of surface porosity with a simple boundary condition.

  1. Random Variable Read Me File

    NASA Technical Reports Server (NTRS)

    Teubert, Christopher; Sankararaman, Shankar; Cullo, Aiden

    2017-01-01

    Readme for the Random Variable Toolbox, hosted on GitHub. GitHub is a web-based Git version control repository hosting service, used mostly for computer code. It offers all of the distributed version control and source code management (SCM) functionality of Git while adding its own features, providing access control and several collaboration features such as bug tracking, feature requests, task management, and wikis for every project. GitHub offers plans for both private and free repositories on the same account, which are commonly used to host open-source software projects. As of April 2017, GitHub reports having almost 20 million users and 57 million repositories, making it the largest host of source code in the world. GitHub has a mascot called Octocat, a cat with five tentacles and a human-like face.

  2. Computing Legacy Software Behavior to Understand Functionality and Security Properties: An IBM/370 Demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linger, Richard C; Pleszkoch, Mark G; Prowell, Stacy J

    Organizations maintaining mainframe legacy software can benefit from code modernization and incorporation of security capabilities to address the current threat environment. Oak Ridge National Laboratory is developing the Hyperion system to compute the behavior of software as a means to gain understanding of software functionality and security properties. Computation of functionality is critical to revealing security attributes, which are in fact specialized functional behaviors of software. Oak Ridge is collaborating with MITRE Corporation to conduct a demonstration project to compute behavior of legacy IBM Assembly Language code for a federal agency. The ultimate goal is to understand functionality and security vulnerabilities as a basis for code modernization. This paper reports on the first phase, to define functional semantics for IBM Assembly instructions and conduct behavior computation experiments.

  3. Progress with the COGENT Edge Kinetic Code: Collision operator options

    DOE PAGES

    Dorf, M. A.; Cohen, R. H.; Compton, J. C.; ...

    2012-06-27

    In this study, COGENT is a continuum gyrokinetic code for edge plasmas being developed by the Edge Simulation Laboratory collaboration. The code is distinguished by application of a fourth-order conservative discretization, and mapped multiblock grid technology to handle the geometric complexity of the tokamak edge. It is written in v∥-μ (parallel velocity – magnetic moment) velocity coordinates, and makes use of the gyrokinetic Poisson equation for the calculation of a self-consistent electric potential. In the present manuscript we report on the implementation and initial testing of a succession of increasingly detailed collision operator options, including a simple drag-diffusion operator in the parallel velocity space, Lorentz collisions, and a linearized model Fokker-Planck collision operator conserving momentum and energy. (© 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)

  4. TRAC posttest calculations of Semiscale Test S-06-3. [PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ireland, J.R.; Bleiweis, P.B.

    A comparison of Transient Reactor Analysis Code (TRAC) steady-state and transient results with Semiscale Test S-06-3 (US Standard Problem 8) experimental data is discussed. The TRAC model used employs fewer mesh cells than normal data comparison models so that TRAC's ability to obtain reasonable results with less computer time can be assessed. In general, the TRAC results are in good agreement with the data and the major phenomena found in the experiment are reproduced by the code with a substantial reduction in computing times.

  5. Comparisons for ESTA-Task3: ASTEC, CESAM and CLÉS

    NASA Astrophysics Data System (ADS)

    Christensen-Dalsgaard, J.

    The ESTA activity under the CoRoT project aims at testing the tools for computing stellar models and oscillation frequencies that will be used in the analysis of asteroseismic data from CoRoT and other large-scale upcoming asteroseismic projects. Here I report results of comparisons between calculations using the Aarhus code (ASTEC) and two other codes, for models that include diffusion and settling. It is found that there are likely deficiencies, requiring further study, in the ASTEC computation of models including convective cores.

  6. Thermodynamic equilibrium-air correlations for flowfield applications

    NASA Technical Reports Server (NTRS)

    Zoby, E. V.; Moss, J. N.

    1981-01-01

    Equilibrium-air thermodynamic correlations have been developed for flowfield calculation procedures. Agreement between the postshock results computed by the correlation equations and detailed chemistry calculations is very good. The thermodynamic correlations are incorporated in an approximate inviscid flowfield code with a convective heating capability for the purpose of defining the thermodynamic environment through the shock layer. Heating rates computed by the approximate code agree well with those from a viscous-shock-layer method. In addition to presenting the thermodynamic correlations, the impact of several viscosity models on the convective heat transfer is demonstrated.

  7. Tailored Codes for Small Quantum Memories

    NASA Astrophysics Data System (ADS)

    Robertson, Alan; Granade, Christopher; Bartlett, Stephen D.; Flammia, Steven T.

    2017-12-01

    We demonstrate that small quantum memories, realized via quantum error correction in multiqubit devices, can benefit substantially by choosing a quantum code that is tailored to the relevant error model of the system. For a biased noise model, with independent bit and phase flips occurring at different rates, we show that a single code greatly outperforms the well-studied Steane code across the full range of parameters of the noise model, including for unbiased noise. In fact, this tailored code performs almost optimally when compared with 10 000 randomly selected stabilizer codes of comparable experimental complexity. Tailored codes can even outperform the Steane code with realistic experimental noise, and without any increase in the experimental complexity, as we demonstrate by comparison in the observed error model in a recent seven-qubit trapped ion experiment.
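
    One common parameterization of such biased Pauli noise (a standard convention in the biased-noise literature, not necessarily the paper's exact one) is:

      \[
      \mathcal{E}(\rho) = (1-p)\,\rho + p_x\,X\rho X + p_y\,Y\rho Y + p_z\,Z\rho Z,
      \qquad p = p_x + p_y + p_z, \qquad \eta = \frac{p_z}{p_x + p_y},
      \]

    so that \(\eta = 1/2\) recovers unbiased depolarizing noise and \(\eta \to \infty\) approaches pure dephasing.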

  8. Ex-Vessel Core Melt Modeling Comparison between MELTSPREAD-CORQUENCH and MELCOR 2.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robb, Kevin R.; Farmer, Mitchell; Francis, Matthew W.

    System-level code analyses by both United States and international researchers predict major core melting, bottom head failure, and corium-concrete interaction for Fukushima Daiichi Unit 1 (1F1). Although system codes such as MELCOR and MAAP are capable of capturing a wide range of accident phenomena, they currently do not contain detailed models for evaluating some ex-vessel core melt behavior. However, specialized codes containing more detailed modeling are available for melt spreading such as MELTSPREAD as well as long-term molten corium-concrete interaction (MCCI) and debris coolability such as CORQUENCH. In a preceding study, Enhanced Ex-Vessel Analysis for Fukushima Daiichi Unit 1: Melt Spreading and Core-Concrete Interaction Analyses with MELTSPREAD and CORQUENCH, the MELTSPREAD-CORQUENCH codes predicted that the 1F1 core melt readily cooled, in contrast to predictions by MELCOR. The user community has taken notice and is in the process of updating their systems codes, specifically MAAP and MELCOR, to improve and reduce conservatism in their ex-vessel core melt models. This report investigates why the MELCOR v2.1 code, compared to the MELTSPREAD and CORQUENCH 3.03 codes, yields differing predictions of ex-vessel melt progression. To accomplish this, the differences in the treatment of the ex-vessel melt with respect to melt spreading and long-term coolability are examined. The differences in modeling approaches are summarized, and a comparison of example code predictions is provided.

  9. [Coding Causes of Death with IRIS Software. Impact in Navarre Mortality Statistic].

    PubMed

    Floristán Floristán, Yugo; Delfrade Osinaga, Josu; Carrillo Prieto, Jesus; Aguirre Perez, Jesus; Moreno-Iribas, Conchi

    2016-08-02

    There are few studies that analyze changes in mortality statistics derived from the use of IRIS software, an automatic system for coding multiple causes of death and for selecting the underlying cause of death, compared to manual coding. This study evaluated the impact of the use of IRIS on the Navarre mortality statistics. We double-coded 5,060 death certificates corresponding to residents of Navarra in 2014. We calculated the coincidence between the two codings for ICD10 chapters and for the cause list of the Spanish National Statistics Institute (INE-102), and we estimated the change in mortality rates. IRIS automatically coded 90% of death certificates. Coincidence at the 4-character level and within the same ICD10 chapter was 79.1% and 92.0%, respectively. Coincidence with the short INE-102 list was 88.3%. Higher agreement was found for death certificates of people under 65 years. In comparison with manual coding, there was an increase in deaths from endocrine diseases (31%), mental disorders (19%), and diseases of the nervous system (9%), while a decrease in genitourinary system diseases was observed (21%). Coincidence at the ICD10 chapter level between IRIS and manual coding was 9 out of 10 deaths, similar to what is observed in other studies. The implementation of IRIS has led to an increase in deaths coded to endocrine diseases, especially diabetes and hyperlipidaemia, and to mental disorders, especially dementias.

  10. Evaluation of Spanwise Variable Impedance Liners with Three-Dimensional Aeroacoustics Propagation Codes

    NASA Technical Reports Server (NTRS)

    Jones, M. G.; Watson, W. R.; Nark, D. M.; Schiller, N. H.

    2017-01-01

    Three perforate-over-honeycomb liner configurations, one uniform and two with spanwise variable impedance, are evaluated based on tests conducted in the NASA Grazing Flow Impedance Tube (GFIT) with a plane-wave source. Although the GFIT is only 2" wide, spanwise impedance variability clearly affects the measured acoustic pressure field, such that three-dimensional (3D) propagation codes are required to properly predict this acoustic pressure field. Three 3D propagation codes (CHE3D, COMSOL, and CDL) are used to predict the sound pressure level and phase at eighty-seven microphones flush-mounted in the GFIT (distributed along all four walls). The CHE3D and COMSOL codes compare favorably with the measured data, regardless of whether an exit acoustic pressure or anechoic boundary condition is employed. Except for those frequencies where the attenuation is large, the CDL code also provides acceptable estimates of the measured acoustic pressure profile. The CHE3D and COMSOL predictions diverge slightly from the measured data for frequencies away from resonance, where the attenuation is noticeably reduced, particularly when an exit acoustic pressure boundary condition is used. For these conditions, the CDL code actually provides slightly more favorable comparison with the measured data. Overall, the comparisons of predicted and measured data suggest that any of these codes can be used to understand data trends associated with spanwise variable-impedance liners.

  11. Comparison of Space Shuttle Hot Gas Manifold analysis to air flow data

    NASA Technical Reports Server (NTRS)

    Mcconnaughey, P. K.

    1988-01-01

    This paper summarizes several recent analyses of the Space Shuttle Main Engine Hot Gas Manifold and compares predicted flow environments to air flow data. Codes used in these analyses include INS3D, PAGE, PHOENICS, and VAST. Both laminar (Re = 250, M = 0.30) and turbulent (Re = 1.9 million, M = 0.30) results are discussed, with the latter being compared to data for system losses, outer wall static pressures, and manifold exit Mach number profiles. Comparison of predicted results for the turbulent case to air flow data shows that the analysis using INS3D predicted system losses within 1 percent error, while the PHOENICS, PAGE, and VAST codes erred by 31, 35, and 47 percent, respectively. The INS3D, PHOENICS, and PAGE codes did a reasonable job of predicting outer wall static pressure, while the PHOENICS code predicted exit Mach number profiles with acceptable accuracy. INS3D was approximately an order of magnitude more efficient than the other codes in terms of code speed and memory requirements. In general, it is seen that complex internal flows in manifold-like geometries can be predicted with a limited degree of confidence, and further development is necessary to improve both efficiency and accuracy of codes if they are to be used as design tools for complex three-dimensional geometries.

  12. Comparisons of anomalous and collisional radial transport with a continuum kinetic edge code

    NASA Astrophysics Data System (ADS)

    Bodi, K.; Krasheninnikov, S.; Cohen, R.; Rognlien, T.

    2009-05-01

    Modeling of anomalous (turbulence-driven) radial transport in controlled-fusion plasmas is necessary for long-time transport simulations. Here the focus is on continuum kinetic edge codes such as the (2-D, 2-V) transport version of TEMPEST, NEO, and the code being developed by the Edge Simulation Laboratory, but the model also has wider application. Our previously developed anomalous diagonal transport matrix model with velocity-dependent convection and diffusion coefficients allows contact with typical fluid transport models (e.g., UEDGE). Results are presented that combine the anomalous transport model and collisional transport owing to ion drift orbits utilizing a Krook collision operator that conserves density and energy. Comparison is made of the relative magnitudes and possible synergistic effects of the two processes for typical tokamak device parameters.
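
    Schematically, the diagonal transport-matrix model referred to above writes the anomalous radial flux of the distribution function F with velocity-dependent coefficients (our notation, as a sketch of the model's structure):

      \[
      \Gamma_r(v) = -D(v)\,\frac{\partial F}{\partial r} + V(v)\,F,
      \]

    where D(v) is an anomalous diffusion coefficient and V(v) a convection (pinch) velocity; the velocity dependence of D and V is what lets the kinetic model make contact with typical fluid transport models when integrated over velocity space.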

  13. Modeling the source of GW150914 with targeted numerical-relativity simulations

    NASA Astrophysics Data System (ADS)

    Lovelace, Geoffrey; Lousto, Carlos O.; Healy, James; Scheel, Mark A.; Garcia, Alyssa; O'Shaughnessy, Richard; Boyle, Michael; Campanelli, Manuela; Hemberger, Daniel A.; Kidder, Lawrence E.; Pfeiffer, Harald P.; Szilágyi, Béla; Teukolsky, Saul A.; Zlochower, Yosef

    2016-12-01

    In fall of 2015, the two LIGO detectors measured the gravitational wave signal GW150914, which originated from a pair of merging black holes (Abbott et al Virgo, LIGO Scientific 2016 Phys. Rev. Lett. 116 061102). In the final 0.2 s (about 8 gravitational-wave cycles) before the amplitude reached its maximum, the observed signal swept up in amplitude and frequency, from 35 Hz to 150 Hz. The theoretical gravitational-wave signal for merging black holes, as predicted by general relativity, can be computed only by full numerical relativity, because analytic approximations fail near the time of merger. Moreover, the nearly-equal masses, moderate spins, and small number of orbits of GW150914 are especially straightforward and efficient to simulate with modern numerical-relativity codes. In this paper, we report the modeling of GW150914 with numerical-relativity simulations, using black-hole masses and spins consistent with those inferred from LIGO’s measurement (Abbott et al LIGO Scientific Collaboration, Virgo Collaboration 2016 Phys. Rev. Lett. 116 241102). In particular, we employ two independent numerical-relativity codes that use completely different analytical and numerical methods to model the same merging black holes and to compute the emitted gravitational waveform; we find excellent agreement between the waveforms produced by the two independent codes. These results demonstrate the validity, impact, and potential of current and future studies using rapid-response, targeted numerical-relativity simulations for better understanding gravitational-wave observations.

  14. EO/IR scene generation open source initiative for real-time hardware-in-the-loop and all-digital simulation

    NASA Astrophysics Data System (ADS)

    Morris, Joseph W.; Lowry, Mac; Boren, Brett; Towers, James B.; Trimble, Darian E.; Bunfield, Dennis H.

    2011-06-01

    The US Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) and the Redstone Test Center (RTC) has formed the Scene Generation Development Center (SGDC) to support the Department of Defense (DoD) open source EO/IR Scene Generation initiative for real-time hardware-in-the-loop and all-digital simulation. Various branches of the DoD have invested significant resources in the development of advanced scene and target signature generation codes. The SGDC goal is to maintain unlimited government rights and controlled access to government open source scene generation and signature codes. In addition, the SGDC provides development support to a multi-service community of test and evaluation (T&E) users, developers, and integrators in a collaborative environment. The SGDC has leveraged the DoD Defense Information Systems Agency (DISA) ProjectForge (https://Project.Forge.mil) which provides a collaborative development and distribution environment for the DoD community. The SGDC will develop and maintain several codes for tactical and strategic simulation, such as the Joint Signature Image Generator (JSIG), the Multi-spectral Advanced Volumetric Real-time Imaging Compositor (MAVRIC), and Office of the Secretary of Defense (OSD) Test and Evaluation Science and Technology (T&E/S&T) thermal modeling and atmospherics packages, such as EOView, CHARM, and STAR. Other utility packages included are the ContinuumCore for real-time messaging and data management and IGStudio for run-time visualization and scenario generation.

  15. Development and Implementation of Dynamic Scripts to Execute Cycled WRF/GSI Forecasts

    NASA Technical Reports Server (NTRS)

    Zavodsky, Bradley; Srikishen, Jayanthi; Berndt, Emily; Li, Quanli; Watson, Leela

    2014-01-01

    Automating the coupling of data assimilation (DA) and modeling systems is a unique challenge in the numerical weather prediction (NWP) research community. In recent years, the Development Testbed Center (DTC) has released well-documented tools such as the Weather Research and Forecasting (WRF) model and the Gridpoint Statistical Interpolation (GSI) DA system that can be easily downloaded, installed, and run by researchers on their local systems. However, developing a coupled system in which the various preprocessing, DA, model, and postprocessing capabilities are all integrated can be labor-intensive if one has little experience with any of these individual systems. Additionally, operational modeling entities generally have specific coupling methodologies that can take time to understand and develop code to implement properly. To better enable collaborating researchers to perform modeling and DA experiments with GSI, the Short-term Prediction Research and Transition (SPoRT) Center has developed a set of Perl scripts that couple GSI and WRF in a cycling methodology consistent with the use of real-time, regional observation data from the National Centers for Environmental Prediction (NCEP)/Environmental Modeling Center (EMC). Because Perl is open source, the code can be easily downloaded and executed regardless of the user's native shell environment. This paper will provide a description of this open-source code and descriptions of a number of the use cases that have been performed by SPoRT collaborators using the scripts on different computing systems.
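
    The cycling logic described above can be sketched in a few lines (the actual SPoRT scripts are written in Perl; the executable names, file names, and arguments below are hypothetical placeholders, not the real interfaces):

      # Minimal sketch of a cycled analysis-forecast loop.
      import subprocess
      from datetime import datetime, timedelta

      CYCLE_HOURS = 6
      cycle = datetime(2014, 1, 1, 0)

      for _ in range(4):  # four 6-hourly cycles
          stamp = cycle.strftime("%Y%m%d%H")
          # 1. Analysis: GSI blends the previous forecast (background) with obs.
          subprocess.run(["./run_gsi.sh", "--background", f"wrfout_{stamp}",
                          "--obs", f"obs_{stamp}"], check=True)
          # 2. Forecast: WRF advances the analysis to the next cycle time.
          subprocess.run(["./run_wrf.sh", "--init", f"analysis_{stamp}",
                          "--hours", str(CYCLE_HOURS)], check=True)
          cycle += timedelta(hours=CYCLE_HOURS)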

  16. Building A Community Focused Data and Modeling Collaborative platform with Hardware Virtualization Technology

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Wang, W.; Melton, F. S.; Votava, P.; Milesi, C.; Hashimoto, H.; Nemani, R. R.; Hiatt, S. H.

    2009-12-01

    As the length and diversity of the global earth observation data records grow, modeling and analyses of biospheric conditions increasingly require multiple terabytes of data from a diversity of models and sensors. With network bandwidth beginning to flatten, transmission of these data from centralized data archives presents an increasing challenge, and costs associated with local storage and management of data and compute resources are often significant for individual research and application development efforts. Sharing community-valued intermediary data sets, results, and codes from individual efforts with others that are not in direct funded collaboration can also be a challenge with respect to time, cost, and expertise. We propose a modeling, data, and knowledge center that houses NASA satellite data, climate data, and ancillary data, where a focused community may come together to share modeling and analysis codes, scientific results, knowledge, and expertise on a centralized platform, named the Ecosystem Modeling Center (EMC). With the recent development of new technologies for secure hardware virtualization, an opportunity exists to create specific modeling, analysis, and compute environments that are customizable, archivable, and transferable. Allowing users to instantiate such environments on large compute infrastructures that are directly connected to large data archives may significantly reduce the costs and time associated with scientific efforts by alleviating the need for users to redundantly retrieve and integrate data sets and build modeling analysis codes. The EMC platform also allows users to receive indirect assistance from experts through prefabricated compute environments, potentially reducing study "ramp up" times.

  17. Development of new two-dimensional spectral/spatial code based on dynamic cyclic shift code for OCDMA system

    NASA Astrophysics Data System (ADS)

    Jellali, Nabiha; Najjar, Monia; Ferchichi, Moez; Rezig, Houria

    2017-07-01

    In this paper, a new family of two-dimensional spectral/spatial codes, named two-dimensional dynamic cyclic shift (2D-DCS) codes, is introduced. The 2D-DCS codes are derived from the dynamic cyclic shift code for the spectral and spatial coding. The proposed system can fully eliminate the multiple access interference (MAI) by using the MAI cancellation property. The effects of shot noise, phase-induced intensity noise, and thermal noise are used to analyze the code performance. In comparison with existing two-dimensional (2D) codes, such as 2D perfect difference (2D-PD), 2D Extended Enhanced Double Weight (2D-Extended-EDW), and 2D hybrid (2D-FCC/MDW) codes, the numerical results show that our proposed codes have the best performance. By keeping the same code length and increasing the spatial code size, the performance of our 2D-DCS system is enhanced: it provides higher data rates while using lower transmitted power and a smaller spectral width.
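
    The actual 2D-DCS construction and parameters are defined in the paper; purely as a generic illustration of building a code family from cyclic shifts of a base word (all values below are made up):

      # Generic illustration of a cyclic-shift code family.
      import numpy as np

      def cyclic_shift_family(base):
          """Each user's spectral codeword is a cyclic shift of the base word."""
          base = np.asarray(base)
          return np.array([np.roll(base, k) for k in range(len(base))])

      spectral = cyclic_shift_family([1, 1, 0, 0, 0])  # weight-2 base word
      spatial = np.eye(3, dtype=int)                    # trivial spatial codes
      # A 2D codeword for (spatial user i, spectral user j) is an outer product:
      codeword = np.outer(spatial[1], spectral[2])
      print(codeword)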

  18. A Comprehensive High Performance Predictive Tool for Fusion Liquid Metal Hydromagnetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Peter; Chhabra, Rupanshi; Munipalli, Ramakanth

    In the Phase I SBIR project, HyPerComp and Texcel initiated the development of two induction-based MHD codes as a predictive tool for fusion hydro-magnetics. The newly-developed codes overcome the deficiency of other MHD codes based on the quasi-static approximation by defining a more general mathematical model that utilizes the induced magnetic field rather than the electric potential as the main electromagnetic variable. The UCLA code is a finite-difference staggered-mesh code that serves as a supplementary tool to the massively-parallel finite-volume code developed by HyPerComp. As there is no suitable experimental data under blanket-relevant conditions for code validation, code-to-code comparisons and comparisons against analytical solutions were successfully performed for three selected test cases: (1) lid-driven MHD flow, (2) flow in a rectangular duct in a transverse magnetic field, and (3) unsteady finite magnetic Reynolds number flow in a rectangular enclosure. The performed tests suggest that the developed codes are accurate and robust. Further work will focus on enhancing the code capabilities towards higher flow parameters and faster computations. At the conclusion of the current Phase-II Project we have completed the preliminary validation efforts in performing unsteady mixed-convection MHD flows (against limited data that is currently available in literature), and demonstrated flow behavior in large 3D channels including important geometrical features. Code enhancements such as periodic boundary conditions, unmatched mesh structures are also ready. As proposed, we have built upon these strengths and explored a much increased range of Grashof numbers and Hartmann numbers under various flow conditions, ranging from flows in a rectangular duct to prototypic blanket modules and liquid metal PFC. Parametric studies, numerical and physical model improvements to expand the scope of simulations, code demonstration, and continued validation activities have also been completed.

  19. Towards a measurement of internalization of collaboration scripts in the medical context - results of a pilot study.

    PubMed

    Kiesewetter, Jan; Gluza, Martin; Holzer, Matthias; Saravo, Barbara; Hammitzsch, Laura; Fischer, Martin R

    2015-01-01

    Collaboration as a key qualification in medical education and everyday routine in clinical care can substantially contribute to improving patient safety. Internal collaboration scripts are conceptualized as organized - yet adaptive - knowledge that can be used in specific situations in professional everyday life. This study examines the level of internalization of collaboration scripts in medicine. Internalization is understood as fast retrieval of script information. The goals of the current study were the assessment of collaborative information, which is part of collaboration scripts, and the development of a methodology for measuring the level of internalization of collaboration scripts in medicine. For the contrastive comparison of internal collaboration scripts, 20 collaborative novices (medical students in their final year) and 20 collaborative experts (physicians with specialist degrees in internal medicine or anesthesiology) were included in the study. Eight typical medical collaborative situations as shown on a photo or video were presented to the participants for five seconds each. Afterwards, the participants were asked to describe what they saw on the photo or video. Based on the answers, the amount of information belonging to a collaboration script (script-information) was determined and the time each participant needed for answering was measured. In order to measure the level of internalization, script-information per recall time was calculated. As expected, collaborative experts stated significantly more script-information than collaborative novices. As well, collaborative experts showed a significantly higher level of internalization. Based on the findings of this research, we conclude that our instrument can discriminate between collaboration novices and experts. It therefore can be used to analyze measures to foster subject-specific competency in medical education.
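
    In other words, the level of internalization reported here is script information retrieved per unit recall time:

      \[
      \text{internalization} = \frac{N_{\text{script}}}{t_{\text{recall}}},
      \]

    with \(N_{\text{script}}\) the amount of script-information in a participant's answer and \(t_{\text{recall}}\) the time needed to answer.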

  20. Dual Coding and Bilingual Memory.

    ERIC Educational Resources Information Center

    Paivio, Allan; Lambert, Wallace

    1981-01-01

    Describes a study which tested a dual coding approach to bilingual memory using tasks that permit comparison of the effects of bilingual encoding with those of verbal-nonverbal dual encoding of items. Results provide strong support for a version of the independent or separate stores view of bilingual memory. (Author/BK)

  1. Capabilities of LEWICE 1.6 and Comparison With Experimental Data

    DOT National Transportation Integrated Search

    1996-01-01

    A research project is underway at NASA Lewis to produce a computer code which can accurately predict ice growth under any meteorological conditions for any aircraft surface. The most recent release of this code is LEWICE 1.6. This paper will demonstr...

  2. Further Validation of a CFD Code for Calculating the Performance of Two-Stage Light Gas Guns

    NASA Technical Reports Server (NTRS)

    Bogdanoff, David W.

    2017-01-01

    Earlier validations of a higher-order Godunov code for modeling the performance of two-stage light gas guns are reviewed. These validation comparisons were made between code predictions and experimental data from the NASA Ames 1.5" and 0.28" guns and covered muzzle velocities of 6.5 to 7.2 km/s. In the present report, five more series of code validation comparisons involving experimental data from the Ames 0.22" (1.28" pump tube diameter), 0.28", 0.50", 1.00" and 1.50" guns are presented. The total muzzle velocity range of the validation data presented herein is 3 to 11.3 km/s. The agreement between the experimental data and CFD results is judged to be very good. Muzzle velocities were predicted within 0.35 km/s for 74% of the cases studied, with maximum differences of 0.5 km/s and, for 4 out of 50 cases, 0.5-0.7 km/s.

  3. Digital data for quick response (QR) codes of alkalophilic Bacillus pumilus to identify and to compare bacilli isolated from Lonar Crater Lake, India

    PubMed Central

    Rekadwad, Bhagwan N.; Khobragade, Chandrahasya N.

    2016-01-01

    Microbiologists are routinely engaged in the isolation, identification, and comparison of isolated bacteria to assess their novelty. 16S rRNA sequences of Bacillus pumilus strains isolated from Lonar Crater Lake (19° 58′ N; 76° 31′ E), India, were retrieved from the NCBI repository, and quick response (QR) codes were generated for the sequences (in FASTA format and with the full GenBank information). The 16S rRNA gene sequences were also used to generate chaos game representations (CGR), frequency chaos game representations (FCGR), and principal component analysis (PCA) plots, which can be used for visual comparison and evaluation. The hyperlinked QR codes, CGR, FCGR, and PCA of all the isolates are made available to users on a portal: https://sites.google.com/site/bhagwanrekadwad/. This digital data helps users evaluate and compare any Bacillus pumilus strain, minimizes laboratory effort, and avoids misinterpretation of the species. PMID:27141529
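
    As an illustration of the kind of encoding the authors describe, the sketch below generates a QR code from a 16S rRNA FASTA record using the open-source Python qrcode package. The header and sequence fragment are made up, and this is not the authors' actual pipeline:

      # Illustrative sketch: encode a FASTA record as a QR image.
      # Requires the "qrcode" package (pip install qrcode[pil]).
      import qrcode

      fasta_record = (">B_pumilus_isolate_X 16S rRNA, Lonar Crater Lake\n"
                      "AGAGTTTGATCCTGGCTCAGGACGAACGCTGGCGGC")  # invented fragment
      img = qrcode.make(fasta_record)   # returns a PIL image of the QR code
      img.save("b_pumilus_16s_qr.png")  # scanning recovers the sequence text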

  4. Decoding a Decision Process in the Neuronal Population of Dorsal Premotor Cortex.

    PubMed

    Rossi-Pool, Román; Zainos, Antonio; Alvarez, Manuel; Zizumbo, Jerónimo; Vergara, José; Romo, Ranulfo

    2017-12-20

    When trained monkeys discriminated the temporal structure of two sequential vibrotactile stimuli, dorsal premotor cortex (DPC) showed high heterogeneity among its neuronal responses. Notably, DPC neurons coded stimulus patterns as broader categories and signaled them during working memory, comparison, and postponed decision periods. Here, we show that such population activity can be condensed into two major coding components: one that persistently represented in working memory both the first stimulus identity and the postponed informed choice, and another that transiently coded the initial sensory information and the result of the comparison between the two stimuli. Additionally, we identified relevant signals that coded the timing of task events. These temporal and task-parameter readouts were shown to be strongly linked to the monkeys' behavior when contrasted with those obtained in a non-demanding cognitive control task and during error trials. These signals, hidden in the heterogeneity, were prominently represented by the DPC population response. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. International contributions to IAEA-NEA heat transfer databases for supercritical fluids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leung, L. K. H.; Yamada, K.

    2012-07-01

    An IAEA Coordinated Research Project on 'Heat Transfer Behaviour and Thermohydraulics Code Testing for SCWRs' is being conducted to facilitate collaboration and interaction among participants from 15 organizations. While the project covers several key technology areas relevant to the development of SCWR concepts, it focuses mainly on the heat transfer aspect, which has been identified as the most challenging. Through this collaborative effort, large heat-transfer databases have been compiled for supercritical water and surrogate fluids in tubes, annuli, and bundle subassemblies of various orientations over a wide range of flow conditions. Assessments of several supercritical heat-transfer correlations were performed using the compiled databases. The assessment results are presented. (authors)

  6. Accuracy of the new ICD-9-CM code for "drip-and-ship" thrombolytic treatment in patients with ischemic stroke.

    PubMed

    Tonarelli, Silvina B; Tibbs, Michael; Vazquez, Gabriela; Lakshminarayan, Kamakshi; Rodriguez, Gustavo J; Qureshi, Adnan I

    2012-02-01

    A new International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnosis code, V45.88, was approved by the Centers for Medicare and Medicaid Services (CMS) on October 1, 2008. This code identifies patients in whom intravenous (IV) recombinant tissue plasminogen activator (rt-PA) is initiated in one hospital's emergency department, followed by transfer within 24 hours to a comprehensive stroke center, a paradigm commonly referred to as "drip-and-ship." This study assessed the use and accuracy of the new V45.88 code for identifying ischemic stroke patients who meet the criteria for drip-and-ship at 2 advanced certified primary stroke centers. Consecutive patients over a 12-month period were identified by primary ICD-9-CM diagnosis codes related to ischemic stroke. The accuracy of V45.88 code utilization using administrative data provided by Health Information Management Services was assessed through a comparison with data collected in prospective stroke registries maintained at each hospital by a trained abstractor. Out of a total of 428 patients discharged from both hospitals with a diagnosis of ischemic stroke, 37 patients were given ICD-9-CM code V45.88. The internally validated data from the prospective stroke database demonstrated that a total of 40 patients met the criteria for drip-and-ship. A concurrent comparison found that 92% (sensitivity) of the patients treated with drip-and-ship were coded with V45.88. None of the non-drip-and-ship stroke cases received the V45.88 code (100% specificity). The new ICD-9-CM code for drip-and-ship appears to have high specificity and sensitivity, allowing effective data collection by the CMS. Copyright © 2012 National Stroke Association. Published by Elsevier Inc. All rights reserved.
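
    The reported accuracy figures can be reproduced directly from the counts in the abstract. A worked check (assuming, as the reported 100% specificity implies, that every V45.88-coded patient was a true drip-and-ship case):

      # Worked check of the reported sensitivity and specificity,
      # using only the counts given in the abstract above.
      true_positive = 37                    # drip-and-ship cases coded V45.88
      false_negative = 40 - true_positive   # drip-and-ship cases the code missed
      false_positive = 0                    # non-drip-and-ship cases coded V45.88
      true_negative = 428 - 40 - false_positive

      sensitivity = true_positive / (true_positive + false_negative)
      specificity = true_negative / (true_negative + false_positive)
      print(f"sensitivity = {sensitivity:.1%}")   # 92.5%, reported as 92%
      print(f"specificity = {specificity:.0%}")   # 100%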

  7. Performance Comparison of Orthogonal and Quasi-orthogonal Codes in Quasi-Synchronous Cellular CDMA Communication

    NASA Astrophysics Data System (ADS)

    Jos, Sujit; Kumar, Preetam; Chakrabarti, Saswat

    Orthogonal and quasi-orthogonal codes are an integral part of any DS-CDMA-based cellular system. Orthogonal codes are ideal for use in a perfectly synchronous scenario such as downlink cellular communication. Quasi-orthogonal codes are preferred over orthogonal codes in uplink communication, where perfect synchronization cannot be achieved. In this paper, we attempt to compare orthogonal and quasi-orthogonal codes in the presence of timing synchronization error. This gives insight into the synchronization demands in DS-CDMA systems employing the two classes of sequences. The synchronization error considered is smaller than the chip duration. Monte Carlo simulations have been carried out to verify the analytical and numerical results.
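
    A minimal numerical sketch of the effect the paper studies: two Walsh-Hadamard codes are exactly orthogonal when aligned, but a timing error smaller than the chip duration introduces nonzero cross-correlation. The code length, rectangular pulse model, and offsets are illustrative assumptions, not the paper's exact setup:

      # Sketch: loss of orthogonality under sub-chip timing offsets,
      # simulated by oversampling rectangular chip pulses.
      import numpy as np
      from scipy.linalg import hadamard

      N, OS = 64, 100                   # code length in chips; samples per chip
      H = hadamard(N)
      a = np.repeat(H[5].astype(float), OS)   # two orthogonal Walsh codes,
      b = np.repeat(H[9].astype(float), OS)   # as rectangular chip waveforms

      for frac in (0.0, 0.1, 0.25, 0.5):      # offset as a fraction of a chip
          b_delayed = np.roll(b, int(frac * OS))
          rho = a @ b_delayed / (N * OS)      # normalized cross-correlation
          print(f"offset = {frac:0.2f} chip -> cross-correlation = {rho:+.3f}")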

  8. Does Robotic Telerounding Enhance Nurse-Physician Collaboration Satisfaction About Care Decisions?

    PubMed

    Bettinelli, Michele; Lei, Yuxiu; Beane, Matt; Mackey, Caleb; Liesching, Timothy N

    2015-08-01

    Delivering healthcare using remote robotic telepresence is an evolving practice in medical and surgical intensive care units and will likely have varied implications for work practices and working relationships in intensive care units. Our study assessed nurse-physician collaboration and satisfaction about care decisions among surgical intensive care nurses during remote robotic telepresence night rounds in comparison with conventional telephone night rounds. This study used a randomized trial to test whether robotic telerounding enhances nurse-physician collaboration and satisfaction about care decisions. A physician randomly used either the conventional telephone or the RP-7 robot (InTouch(®) Health, Santa Barbara, CA) to perform nighttime rounding in a surgical intensive care unit. The Collaboration and Satisfaction About Care Decisions (CSACD) survey instrument was used to measure nurse-physician collaboration. CSACD scores were compared using the signed-rank test, with significance at p≤0.05. From December 1, 2011 to December 13, 2012, 20 off-shift nurses submitted 106 surveys during telephone rounds and 108 surveys during robot rounds. The median score of surveys during robot rounds was slightly, but not significantly, higher than during telephone rounds (51.3 versus 50.5; p=0.3). However, the CSACD score increased significantly from baseline with robot rounds (51.3 versus 43.0; p=0.01) but not with telephone rounds (50.5 versus 43.0; p=0.09). The mediators, including age, working experience, and robot acceptance, were not significantly (p>0.1) correlated with the CSACD score difference (robot versus telephone). Robot rounding in the intensive care unit was comparable but not superior to the telephone with regard to nurse-physician collaboration and satisfaction about care decisions. Nurses' working experience and technology acceptance did not contribute to a preference for either night-shift rounding method with respect to collaboration with the physician about care decision-making.
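
    A sketch of the signed-rank comparison described above, using scipy. The score vectors are invented, and the study's actual pairing of the 106/108 surveys is not specified in the abstract, so equal-length paired samples are an assumption:

      # Sketch: Wilcoxon signed-rank test on hypothetical paired CSACD
      # scores (score scale assumed); not the study's data.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      telephone = rng.normal(50.5, 6.0, 106)            # invented scores
      robot = telephone + rng.normal(0.8, 4.0, 106)     # small assumed shift

      stat, p = stats.wilcoxon(robot, telephone)        # paired signed-rank test
      print(f"W = {stat:.0f}, p = {p:.2f}")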

  9. Revisiting Yasinsky and Henry's benchmark using modern nodal codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feltus, M.A.; Becker, M.W.

    1995-12-31

    The numerical experiments analyzed by Yasinsky and Henry are quite trivial by today's standards because they used the finite-difference code WIGLE for their benchmark. Also, this problem is a simple slab (one-dimensional) case with no feedback mechanisms. This research attempts to obtain STAR (Ref. 2) and NEM (Ref. 3) code results in order to produce a more modern kinetics benchmark with results comparable to WIGLE's.

  10. EMPIRE: A code for nuclear astrophysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palumbo, A.

    The nuclear reaction code EMPIRE is presented as a useful tool for nuclear astrophysics. EMPIRE combines a variety of reaction models with a comprehensive library of input parameters, providing a diversity of options for the user. With the exclusion of direct-semidirect capture, all reaction mechanisms relevant to the nuclear astrophysics energy range of interest are implemented in the code. Comparisons to experimental data show consistent agreement for all relevant channels.

  11. COCOA code for creating mock observations of star cluster models

    NASA Astrophysics Data System (ADS)

    Askar, Abbas; Giersz, Mirek; Pych, Wojciech; Dalessandro, Emanuele

    2018-04-01

    We introduce and present results from the COCOA (Cluster simulatiOn Comparison with ObservAtions) code, which has been developed to create idealized mock photometric observations using results from numerical simulations of star cluster evolution. COCOA is able to present the output of realistic numerical simulations of star clusters carried out using Monte Carlo or N-body codes in a way that is useful for direct comparison with photometric observations. In this paper, we describe the COCOA code and demonstrate its different applications by utilizing globular cluster (GC) models simulated with the MOCCA (MOnte Carlo Cluster simulAtor) code. COCOA is used to synthetically observe these different GC models with optical telescopes, perform point spread function photometry, and subsequently produce observed colour-magnitude diagrams. We also use COCOA to compare the results from synthetic observations of a cluster model that has the same age and metallicity as the Galactic GC NGC 2808 with observations of the same cluster carried out with a 2.2 m optical telescope. We find that COCOA can effectively simulate realistic observations and recover photometric data. COCOA has numerous scientific applications that may be helpful for both theoreticians and observers who work on star clusters. Plans for further improving and developing the code are also discussed in this paper.
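
    To illustrate the basic idea behind mock photometric observations (project the simulated stars onto a pixel grid, convolve with a point spread function, add background noise), here is a toy sketch. It is not COCOA's implementation, and every parameter is an assumption:

      # Toy mock-observation pipeline: projection + PSF + noise.
      # All cluster parameters below are invented for illustration.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      rng = np.random.default_rng(1)
      n_stars = 20000
      xy = rng.normal(0.0, 30.0, size=(n_stars, 2))   # projected positions (arcsec)
      lum = 10 ** rng.uniform(-1, 2, n_stars)         # star luminosities

      # Bin stars onto a CCD-like pixel grid, weighting by luminosity
      img, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=512,
                                 range=[[-100, 100]] * 2, weights=lum)
      img = gaussian_filter(img, sigma=2.0)           # seeing-limited Gaussian PSF
      img = img + rng.poisson(5.0, img.shape)         # sky background + shot noise
      print(f"mock frame: {img.shape}, peak = {img.max():.1f}")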

  12. Collaboration in Referential Communication: Comparison of Youth with Down Syndrome or Fragile X Syndrome

    ERIC Educational Resources Information Center

    Abbeduto, Leonard; Murphy, Melissa M.; Richmond, Erica K.; Amman, Adrienne; Beth, Patti; Weissman, Michelle D.; Kim, Jee-Seon; Cawthon, Stephanie W.; Karadottir, Selma

    2006-01-01

    Referential communication was examined in youth with Down syndrome or fragile X syndrome in comparison to each other and to MA-matched typically developing children. A non-face-to-face task was used in which the participant repeatedly described novel shapes to listeners. Several dimensions of referential communication were especially challenging…

  13. Factors That Impact the Success of Interorganizational Health Promotion Collaborations: A Scoping Review.

    PubMed

    Seaton, Cherisse L; Holm, Nikolai; Bottorff, Joan L; Jones-Bricker, Margaret; Errey, Sally; Caperchione, Cristina M; Lamont, Sonia; Johnson, Steven T; Healy, Theresa

    2018-05-01

    To explore published empirical literature in order to identify factors that facilitate or inhibit collaborative approaches for health promotion, using a scoping review methodology. A comprehensive search of MEDLINE, CINAHL, ScienceDirect, PsycINFO, and Academic Search Complete for articles published between January 2001 and October 2015 was conducted in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. To be included, studies had to be original research articles published in English, involve at least 2 organizations in a health promotion partnership, and identify factors contributing to or constraining the success of an established (or prior) partnership. Studies were excluded if they focused on primary care collaboration or on organizations jointly lobbying for a cause. Data extraction was completed by 2 members of the author team using a summary chart to extract information relevant to the factors that facilitated or constrained collaboration success. NVivo 10 was used to code article content into the thematic categories identified during data extraction. Twenty-five studies across 8 countries were identified. Several key factors contributed to collaborative effectiveness, including a shared vision, leadership, member characteristics, organizational commitment, available resources, clear roles/responsibilities, trust/clear communication, and engagement of the target population. In general, the findings were consistent with previous reviews; however, additional novel themes did emerge.

  14. Sociologists and energy engineers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verhapen, F.C.

    1982-07-01

    Contents include sociology and sociologists; sociologists and energy (history of sociological focus on energy; the sociological profession's official interest in energy; sociological specialties with contributions to the study and planning of energy; potential sociological contributions); functional areas of collaboration between energy sociologists and the energy establishment; obstacles to collaboration between consulting energy sociologists and their clients in the energy establishment; overcoming obstacles in the collaboration; advantages and disadvantages in the greater use of sociological resources by the energy establishment; and tables (e.g. comparison of cost-benefit analysis and social impact assessment; two policy science paradigms; selected technical values for electrical energy generation, transmission and distribution).

  15. Articulating the ideal: 50 years of interprofessional collaboration in Medical Education.

    PubMed

    Paradis, Elise; Pipher, Mandy; Cartmill, Carrie; Rangel, J Cristian; Whitehead, Cynthia R

    2017-08-01

    Health care delivery and the education of clinicians have changed immensely since the creation of the journal Medical Education. In this project, we seek to answer the following three questions: How has the concept of collaboration changed over the past 50 years in Medical Education? Have the participants involved in collaboration shifted over time? Has the idea of collaboration itself been transformed over the past 50 years? Starting from a constructionist view of scientific discourse, we used directed content analysis to sample, code and analyse 144 collaboration-related articles over the 50-year life span of Medical Education. We developed an analytical framework to identify the key components of varying articulations of 'collaboration', with a focus on shifts in language and terminology over time. Our sample was drawn from an archive of 1221 articles developed to celebrate the 50th anniversary of Medical Education. Interprofessional collaboration is conceptualised in three primary ways throughout our sample: as a psychometric property; as tasks or activities; and, more recently, as 'togetherness'. The first conceptualisation articulates collaboration as involving knowledge or skills that are teachable to individuals, the second as involving the education of teams to engage in structured meetings or task distribution, and the third as the building of networks of individuals who learn to form team identities. The 'leader' of collaboration is typically conceptualised as the doctor, who is consistently articulated by authors as the active agent of collaborative care. Other clinicians and students of other professions are, as the wording in this sentence suggests, usually positioned as 'others', and thus as more passive participants in, or even observers of, 'collaboration'. In order to meet the goal of meaningful collaboration leading to higher-quality care, it behoves us as a community of educators and researchers to heed the ways in which we teach, think and write about interprofessional collaboration, interrogating our own language and the assumptions that may be betraying and reproducing harmful care hierarchies. © 2017 John Wiley & Sons Ltd and The Association for the Study of Medical Education.

  16. SU-F-BRD-02: Application of ARCHER-RT -- A GPU-Based Monte Carlo Dose Engine for Radiation Therapy -- to TomoTherapy and Patient-Independent IMRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, L; Du, X; Liu, T

    Purpose: As a module of ARCHER -- Accelerated Radiation-transport Computations in Heterogeneous EnviRonments -- ARCHER-RT is designed for RadioTherapy (RT) dose calculation. This paper describes the application of ARCHER-RT to patient-dependent TomoTherapy and patient-independent IMRT. It also conducts a fair comparison of different GPUs and a multicore CPU. Methods: The source input used for patient-dependent TomoTherapy is a phase space file (PSF) generated from the optimized plan. For patient-independent IMRT, the open-field PSF is used for different cases. The intensity modulation is simulated by a fluence map. The GEANT4 code is used as the benchmark. DVH and gamma index tests are employed to evaluate the accuracy of the ARCHER-RT code. Some previous studies reported misleading speedups by comparing GPU code with serial CPU code. To perform a fairer comparison, we wrote multi-threaded code with OpenMP to fully exploit the computing potential of the CPU. The hardware involved in this study is a 6-core Intel E5-2620 CPU and 6 NVIDIA M2090 GPUs, a K20 GPU, and a K40 GPU. Results: Dosimetric results from ARCHER-RT and GEANT4 show good agreement. The 2%/2mm gamma test pass rates for different clinical cases are 97.2% to 99.7%. A single M2090 GPU needs 50-79 seconds for the simulation to achieve a statistical error of 1% in the PTV. The K40 card is about 1.7-1.8 times faster than the M2090 card. Using 6 M2090 cards, the simulation can be finished in about 10 seconds. For comparison, the Intel E5-2620 needs 507-879 seconds for the same simulation. Conclusion: We successfully applied ARCHER-RT to TomoTherapy and patient-independent IMRT, and conducted a fair comparison between GPU and CPU performance. The ARCHER-RT code is both accurate and efficient and may be used towards clinical applications.
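
    For readers unfamiliar with the 2%/2mm gamma test used above, the following simplified 1-D sketch shows how the metric is computed. Real gamma tests run on 3-D dose grids, and the dose profiles here are invented for illustration:

      # Simplified 1-D gamma index (2% dose / 2 mm distance criteria).
      import numpy as np

      def gamma_1d(ref, ev, x, dose_tol, dist_tol):
          # gamma(i) = min over j of sqrt((dx/dist_tol)^2 + (dD/dose_tol)^2);
          # a reference point passes the test when gamma <= 1
          dx = (x[None, :] - x[:, None]) / dist_tol
          dd = (ev[None, :] - ref[:, None]) / dose_tol
          return np.sqrt(dx**2 + dd**2).min(axis=1)

      x = np.linspace(0.0, 100.0, 201)                 # positions in mm
      ref = np.exp(-((x - 50.0) / 20.0) ** 2)          # "reference" dose profile
      ev = 1.01 * np.exp(-((x - 50.5) / 20.0) ** 2)    # "evaluated" dose profile

      g = gamma_1d(ref, ev, x, dose_tol=0.02 * ref.max(), dist_tol=2.0)
      print(f"2%/2mm gamma pass rate: {(g <= 1).mean():.1%}")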

  17. Validation and Performance Comparison of Numerical Codes for Tsunami Inundation

    NASA Astrophysics Data System (ADS)

    Velioglu, D.; Kian, R.; Yalciner, A. C.; Zaytsev, A.

    2015-12-01

    In inundation zones, tsunami motion turns from wave motion into a flow of water. Modelling this phenomenon is a complex problem, since many parameters affect the tsunami flow. In this respect, the performance of numerical codes that analyze tsunami inundation patterns becomes important. The computation of water surface elevation is not sufficient for proper analysis of tsunami behaviour in shallow-water zones and on land, and hence for the development of mitigation strategies. Velocity and velocity patterns are also crucial parameters and have to be computed with the highest accuracy. There are numerous numerical codes available for simulating tsunami inundation. In this study, the FLOW 3D and NAMI DANCE codes are selected for validation and performance comparison. FLOW 3D simulates linear and nonlinear propagating surface waves as well as long waves by solving the three-dimensional Navier-Stokes (3D-NS) equations, and is used specifically for flood problems. NAMI DANCE uses a finite-difference computational method to solve the linear and nonlinear forms of the shallow water equations (NSWE) in long wave problems, specifically tsunamis. In this study, these codes are validated and their performances are compared using two benchmark problems discussed at the 2015 National Tsunami Hazard Mitigation Program (NTHMP) Annual Meeting in Portland, USA. One of the problems is an experiment of a single long-period wave propagating up a piecewise linear slope and onto a small-scale model of the town of Seaside, Oregon. The other benchmark problem is an experiment of a single solitary wave propagating up a triangular-shaped shelf with an island feature located at the offshore point of the shelf. The computed water surface elevation and velocity data are compared with the measured data. The comparisons showed that both codes are in fairly good agreement with each other and with the benchmark data. All results are presented with discussions and comparisons. The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement No 603839 (Project ASTARTE - Assessment, Strategy and Risk Reduction for Tsunamis in Europe).
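
    As background on the shallow-water approach mentioned above, a minimal 1-D linear shallow-water solver on a staggered finite-difference grid is sketched below. The grid, depth, and initial wave are illustrative assumptions; production codes such as NAMI DANCE solve the nonlinear 2-D equations with real bathymetry:

      # Minimal 1-D linear shallow-water sketch: eta_t + h u_x = 0,
      # u_t + g eta_x = 0, leapfrogged on a staggered grid with
      # reflective walls. All parameters are invented for illustration.
      import numpy as np

      g, h = 9.81, 100.0                 # gravity (m/s^2), still-water depth (m)
      dx, nx = 1000.0, 400               # grid spacing (m), number of cells
      dt = 0.5 * dx / np.sqrt(g * h)     # CFL-limited time step

      eta = np.exp(-((np.arange(nx) * dx - 100e3) / 20e3) ** 2)  # initial hump (m)
      u = np.zeros(nx + 1)               # velocities live on cell faces

      for _ in range(600):
          u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])   # u_t = -g eta_x
          eta -= h * dt / dx * (u[1:] - u[:-1])           # eta_t = -h u_x
      print(f"max surface elevation after run: {eta.max():.3f} m")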

  18. Posttest calculation of the PBF LOC-11B and LOC-11C experiments using RELAP4/MOD6. [PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendrix, C.E.

    Comparisons between RELAP4/MOD6 Update 4 code calculations and measured experimental data are presented for the PBF LOC-11C and LOC-11B experiments. Independent code verification techniques are now being developed, and this study represents a preliminary effort applying structured criteria for developing computer models, selecting code input, and performing base-run analyses. Where deficiencies are indicated in the base-case representation of the experiment, methods of code and criteria improvement are developed and appropriate recommendations are made.

  19. Magnetohydrodynamic modelling of exploding foil initiators

    NASA Astrophysics Data System (ADS)

    Neal, William

    2015-06-01

    Magnetohydrodynamic (MHD) codes are currently being developed, and used, to predict the behaviour of electrically-driven flyer-plates. These codes are of particular interest to the design of exploding foil initiator (EFI) detonators but there is a distinct lack of comparison with high-fidelity experimental data. This study aims to compare a MHD code with a collection of temporally and spatially resolved diagnostics including PDV, dual-axis imaging and streak imaging. The results show the code's excellent representation of the flyer-plate launch and highlight features within the experiment that the model fails to capture.

  20. Multi-D Full Boltzmann Neutrino Hydrodynamic Simulations in Core Collapse Supernovae and their detailed comparison with Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Nagakura, Hiroki; Richers, Sherwood; Ott, Christian; Iwakami, Wakana; Furusawa, Shun; Sumiyoshi, Kohsuke; Yamada, Shoichi

    2017-01-01

    We have developed a multi-dimensional radiation-hydrodynamic code that solves the first-principles Boltzmann equation for neutrino transport. It is currently applied specifically to core-collapse supernovae (CCSNe), but we will extend its applicability to more extreme phenomena such as black hole formation and the coalescence of double neutron stars. In this meeting, I will discuss two things: (1) a detailed comparison with a Monte Carlo neutrino transport code, and (2) axisymmetric CCSNe simulations. Project (1) gives us confidence in our code. The Monte Carlo code has been developed by the Caltech group and is specialized to obtain a steady state. Within the CCSNe community, this is the first attempt to compare two different methods for multi-dimensional neutrino transport; I will show the results of this comparison. For project (2), I focus in particular on the properties of the neutrino distribution function in the semi-transparent region, where only a first-principles Boltzmann solver can appropriately handle the neutrino transport. In addition to these analyses, I will also discuss the 'explodability' by the neutrino heating mechanism.
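
    Schematically, and ignoring the relativistic and velocity-dependent terms such codes include, the equation solved for each neutrino species' distribution function f(t, x, n, epsilon) is the Boltzmann transport equation (standard background, not an equation quoted from the abstract):

      \[
      \frac{1}{c}\frac{\partial f}{\partial t} + \mathbf{n} \cdot \nabla f
        = \left( \frac{\delta f}{\delta t} \right)_{\mathrm{collision}},
      \]

    where the collision term gathers neutrino emission, absorption, and scattering. Monte Carlo transport samples this same equation stochastically, which is what makes the deterministic-versus-Monte-Carlo comparison described above meaningful.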
