Science.gov

Sample records for integrated codes hedric

  1. Parameters used in the environmental pathways and radiological dose modules (DESCARTES, CIDER, and CRD codes) of the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC)

    SciTech Connect

    Snyder, S.F.; Farris, W.T.; Napier, B.A.; Ikenberry, T.A.; Gilbert, R.O.

    1994-05-01

    This letter report is a description of work performed for the Hanford Environmental Dose Reconstruction (HEDR) Project. The HEDR Project was established to estimate the radiation doses to individuals resulting from releases of radionuclides from the Hanford Site during the period of 1944 to 1992. This work is being done by staff at Battelle, Pacific Northwest Laboratories under a contract with the Centers for Disease Control and Prevention with technical direction provided by an independent Technical Steering Panel (TSP).

  2. Parameters used in the environmental pathways (DESCARTES) and radiological dose (CIDER) modules of the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC) for the air pathway

    SciTech Connect

    Snyder, S.F.; Farris, W.T.; Napier, B.A.; Ikenberry, T.A.; Gilbert, R.O.

    1992-09-01

    This letter report is a description of work performed for the Hanford Environmental Dose Reconstruction (HEDR) Project. The HEDR Project was established to estimate the radiation doses to individuals resulting from releases of radionuclides from the Hanford Site since 1944. This work is being done by staff at Battelle, Pacific Northwest Laboratories (Battelle) under a contract with the Centers for Disease Control (CDC) with technical direction provided by an independent Technical Steering Panel (TSP). The objective of this report is to document the environmental accumulation and dose-assessment parameters that will be used to estimate the impacts of past Hanford Site airborne releases. During 1993, dose estimates made by staff at Battelle will be used by the Fred Hutchinson Cancer Research Center as part of the Hanford Thyroid Disease Study (HTDS). This document contains information on parameters that are specific to the airborne release of the radionuclide iodine-131. Future versions of this document will include parameter information pertinent to other pathways and radionuclides.

  3. Integrated Codes for Estimating Environmental Accumulation and Individual Dose from Past Hanford Atmospheric Releases: Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Ikenberry, T. A.; Burnett, R. A.; Napier, B. A.; Reitz, N. A.; Shipler, D. B.

    1992-02-01

    Preliminary radiation doses were estimated and reported during Phase I of the Hanford Environmental Dose Reconstruction (HEDR) Project. As the project has progressed, additional information regarding the magnitude and timing of past radioactive releases has been developed, and the general scope of the required calculations has been enhanced. The overall HEDR computational model for computing doses attributable to atmospheric releases from Hanford Site operations is called HEDRIC (Hanford Environmental Dose Reconstruction Integrated Codes). It consists of four interrelated models: source term, atmospheric transport, environmental accumulation, and individual dose. The source term and atmospheric transport models are documented elsewhere. This report describes the initial implementation of the design specifications for the environmental accumulation model and computer code, called DESCARTES (Dynamic EStimates of Concentrations and Accumulated Radionuclides in Terrestrial Environments), and the individual dose model and computer code, called CIDER (Calculation of Individual Doses from Environmental Radionuclides). The computations required of these models and the design specifications for their codes were documented in Napier et al. (1992). Revisions to the original specifications and the basis for modeling decisions are explained. This report is not the final code documentation but gives the status of the model and code development to date. Final code documentation is scheduled to be completed in FY 1994 following additional code upgrades and refinements. The user's guide included in this report describes the operation of the environmental accumulation and individual dose codes and associated pre- and post-processor programs. A programmer's guide describes the logical structure of the programs and their input and output files.

  4. Verification of FANTASTIC integrated code

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1987-01-01

    FANTASTIC is an acronym for Failure Analysis Nonlinear Thermal and Structural Integrated Code. This program was developed by Failure Analysis Associates, Palo Alto, Calif., for MSFC to improve the accuracy of solid rocket motor nozzle analysis. FANTASTIC has three modules: FACT - thermochemical analysis; FAHT - heat transfer analysis; and FAST - structural analysis. All modules have keywords for data input. Verification of the FAHT module is in progress and is carried out by running problems with known solutions through the module. The information obtained is used to identify problem areas of the code and is passed on to the developer for debugging. Failure Analysis Associates has revised the first version of the FANTASTIC code, and a new improved version has been released to the Thermal Systems Branch.

  5. CBP PHASE I CODE INTEGRATION

    SciTech Connect

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-09-30

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface was
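
    To picture the kind of code-to-DLL linkage being described, here is a rough, hypothetical ctypes sketch; the library name, exported function, and signature are invented for illustration and are not the actual CBP/GoldSim interface specification.

      import ctypes

      # Hypothetical partner-code DLL and export (illustration only; not
      # the CBP/GoldSim DLL spec).
      lib = ctypes.CDLL("cbp_partner.dll")

      # Assumed export: void run_step(double* in, int n_in,
      #                               double* out, int n_out)
      lib.run_step.argtypes = [ctypes.POINTER(ctypes.c_double), ctypes.c_int,
                               ctypes.POINTER(ctypes.c_double), ctypes.c_int]
      lib.run_step.restype = None

      inputs = (ctypes.c_double * 3)(1.0, 2.0, 3.0)  # e.g. time, pH, porosity
      outputs = (ctypes.c_double * 2)()
      lib.run_step(inputs, 3, outputs, 2)
      print(list(outputs))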

  6. The Fireball integrated code package

    SciTech Connect

    Dobranich, D.; Powers, D.A.; Harper, F.T.

    1997-07-01

    Many deep-space satellites contain a plutonium heat source. An explosion, during launch, of a rocket carrying such a satellite offers the potential for the release of some of the plutonium. The fireball following such an explosion exposes any released plutonium to a high-temperature chemically-reactive environment. Vaporization, condensation, and agglomeration processes can alter the distribution of plutonium-bearing particles. The Fireball code package simulates the integrated response of the physical and chemical processes occurring in a fireball and the effect these processes have on the plutonium-bearing particle distribution. This integrated treatment of multiple phenomena represents a significant improvement in the state of the art for fireball simulations. Preliminary simulations of launch-second scenarios indicate: (1) most plutonium vaporization occurs within the first second of the fireball; (2) large non-aerosol-sized particles contribute very little to plutonium vapor production; (3) vaporization and both homogeneous and heterogeneous condensation occur simultaneously; (4) homogeneous condensation transports plutonium down to the smallest-particle sizes; (5) heterogeneous condensation precludes homogeneous condensation if sufficient condensation sites are available; and (6) agglomeration produces larger-sized particles but slows rapidly as the fireball grows.

  7. Parameters used in the environmental pathways (DESCARTES) and radiological dose (CIDER) modules of the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC) for the air pathway. Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Snyder, S.F.; Farris, W.T.; Napier, B.A.; Ikenberry, T.A.; Gilbert, R.O.

    1992-09-01

    This letter report is a description of work performed for the Hanford Environmental Dose Reconstruction (HEDR) Project. The HEDR Project was established to estimate the radiation doses to individuals resulting from releases of radionuclides from the Hanford Site since 1944. This work is being done by staff at Battelle, Pacific Northwest Laboratories (Battelle) under a contract with the Centers for Disease Control (CDC) with technical direction provided by an independent Technical Steering Panel (TSP). The objective of this report is to-document the environmental accumulation and dose-assessment parameters that will be used to estimate the impacts of past Hanford Site airborne releases. During 1993, dose estimates made by staff at Battelle will be used by the Fred Hutchinson Cancer Research Center as part of the Hanford Thyroid Disease Study (HTDS). This document contains information on parameters that are specific to the airborne release of the radionuclide iodine-131. Future versions of this document will include parameter information pertinent to other pathways and radionuclides.

  8. The Integrated TIGER Series Codes

    SciTech Connect

    Kensek, Ronald P.; Franke, Brian C.; Laub, Thomas W.

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The easily applied makefile system combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  9. MINET (momentum integral network) code documentation

    SciTech Connect

    Van Tuyle, G J; Nepsee, T C; Guppy, J G

    1989-12-01

    The MINET computer code, developed for the transient analysis of fluid flow and heat transfer, is documented in this four-part reference. In Part 1, the MINET models, which are based on a momentum integral network method, are described. The various aspects of utilizing the MINET code are discussed in Part 2, The User's Manual. The third part is a code description, detailing the basic code structure and the various subroutines and functions that make up MINET. In Part 4, example input decks, as well as recent validation studies and applications of MINET are summarized. 32 refs., 36 figs., 47 tabs.

  10. An integrated radiation physics computer code system.

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Harris, D. W.

    1972-01-01

    An integrated computer code system for the semi-automatic and rapid analysis of experimental and analytic problems in gamma photon and fast neutron radiation physics is presented. Such problems as the design of optimum radiation shields and radioisotope power source configurations may be studied. The system codes allow for the unfolding of complex neutron and gamma photon experimental spectra. Monte Carlo and analytic techniques are used for the theoretical prediction of radiation transport. The system includes a multichannel pulse-height analyzer scintillation and semiconductor spectrometer coupled to an on-line digital computer with appropriate peripheral equipment. The system is geometry generalized as well as self-contained with respect to material nuclear cross sections and the determination of the spectrometer response functions. Input data may be either analytic or experimental.

  11. Integrating Proactive Discipline Practices into Codes of Conduct

    ERIC Educational Resources Information Center

    Fenning, Pamela; Theodos, Jennifer; Benner, Courtney; Bohanon-Edmonson, Hank

    2004-01-01

    The purpose of this article is to advocate for proactive content in discipline codes of conduct. Proactive discipline codes integrate Positive Behavior Support (PBS) strategies (Sugai & Horner, 2002) and attend to the academic needs of students (McEvoy & Welker, 2000). Proactive discipline codes of conduct have meaningful participation by key…

  12. The UNC Charlotte Code of Student Academic Integrity.

    ERIC Educational Resources Information Center

    Toenjes, Richard H.

    The University of North Carolina at Charlotte has developed a Code of Student Academic Integrity as a solution for problems of academic dishonesty in large, young, public, urban universities where traditional honor codes will not work. The Code defines violations and gives typical examples. It provides a friendly settlement procedure in which a…

  13. Definite Integrals, Some Involving Residue Theory Evaluated by Maple Code

    SciTech Connect

    Bowman, Kimiko o

    2010-01-01

    The calculus of residues is applied to evaluate certain integrals over the range $(-\infty, \infty)$ using the Maple symbolic code. These integrals are of the form $\int_{-\infty}^{\infty} \cos(x)\,/\,[(x^2 + a^2)(x^2 + b^2)(x^2 + c^2)]\,dx$ and similar extensions. The Maple code is also applied to expressions in maximum likelihood estimator moments when sampling from the negative binomial distribution. In general the Maple code approach to the integrals gives correct answers to specified decimal places, but the symbolic result may be extremely long and complex.
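
    As a quick numerical cross-check of the residue-theorem evaluation (an illustrative Python/SciPy sketch, not the Maple code from the report; the a, b, c values are arbitrary), the closed form obtained from the three upper-half-plane poles ia, ib, ic can be compared with direct quadrature:

      # Residue-theorem closed form vs. numerical quadrature for
      # I = int_{-inf}^{inf} cos(x) / [(x^2+a^2)(x^2+b^2)(x^2+c^2)] dx.
      import math
      from scipy.integrate import quad

      a, b, c = 1.0, 2.0, 3.0   # arbitrary distinct positive constants

      closed_form = math.pi * (
          math.exp(-a) / (a * (b**2 - a**2) * (c**2 - a**2))
          + math.exp(-b) / (b * (a**2 - b**2) * (c**2 - b**2))
          + math.exp(-c) / (c * (a**2 - c**2) * (b**2 - c**2))
      )

      numeric, _ = quad(
          lambda x: math.cos(x) / ((x*x + a*a) * (x*x + b*b) * (x*x + c*c)),
          -math.inf, math.inf)

      print(closed_form, numeric)  # both approx. 0.0353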

  14. Evaluation of CERT Secure Coding Rules through Integration with Source Code Analysis Tools

    DTIC Science & Technology

    2008-06-01

    Dewhurst, Stephen; Dougherty, Chad; Ito, Yurie; Keaton, David; Saks, Dan; Seacord, Robert C.; Svoboda, David; Chris

  15. Boundary layer integral matrix procedure code modifications and verifications

    NASA Technical Reports Server (NTRS)

    Evans, R. M.; Morse, H. L.

    1974-01-01

    A summary of modifications to Aerotherm's Boundary Layer Integral Matrix Procedure (BLIMP) code is presented. These modifications represent a preliminary effort to make BLIMP compatible with other JANNAF codes and to adjust the code for specific application to rocket nozzle flows. Results of the initial verification of the code for prediction of rocket nozzle type flows are discussed. For those cases in which measured free stream flow conditions were used as input to the code, the boundary layer predictions and measurements are in excellent agreement. In two cases, with free stream flow conditions calculated by another JANNAF code (TDK) for use as input to BLIMP, the predictions and the data were in fair agreement for one case and in poor agreement for the other case. The poor agreement is believed to result from failure of the turbulent model in BLIMP to account for laminarization of a turbulent flow. Recommendations for further code modifications and improvements are also presented.

  16. A new integrated symmetrical table for genetic codes.

    PubMed

    Shu, Jian-Jun

    2017-01-01

    Degeneracy is a salient feature of genetic codes, because there are more codons than amino acids. The conventional table for genetic codes suffers from an inability to illustrate the symmetrical nature of genetic base codes. In fact, because the conventional wisdom avoids the question, there is little agreement as to whether the symmetrical nature actually even exists. A better understanding of symmetry, and an appreciation for its essential role in genetic code formation, can improve our understanding of nature's coding processes. Thus, it is worth formulating a new integrated symmetrical table for genetic codes, which is presented in this paper. It could be very useful for understanding Nobel laureate Crick's wobble hypothesis - how one transfer ribonucleic acid can recognize two or more synonymous codons - which remains an unsolved fundamental question in biological science.
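
    As a small illustration of the degeneracy discussed above (a hand-entered subset of the standard genetic code, not the paper's symmetrical table):

      # Group codons by amino acid to show uneven degeneracy: in the standard
      # code, Leu and Ser each have six synonymous codons, Met and Trp one.
      CODON_TABLE = {
          "TTT": "Phe", "TTC": "Phe",
          "TTA": "Leu", "TTG": "Leu", "CTT": "Leu",
          "CTC": "Leu", "CTA": "Leu", "CTG": "Leu",
          "TCT": "Ser", "TCC": "Ser", "TCA": "Ser",
          "TCG": "Ser", "AGT": "Ser", "AGC": "Ser",
          "ATG": "Met", "TGG": "Trp",
      }

      by_aa = {}
      for codon, aa in CODON_TABLE.items():
          by_aa.setdefault(aa, []).append(codon)

      for aa, codons in sorted(by_aa.items()):
          print(aa, len(codons), sorted(codons))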

  17. Progress in Advanced Spray Combustion Code Integration

    NASA Technical Reports Server (NTRS)

    Liang, Pak-Yan

    1993-01-01

    A multiyear project to assemble a robust, multiphase spray combustion code is now underway and gradually building up to full speed. The overall effort involves several university and government research teams as well as Rocketdyne. The first part of this paper gives an overview of the respective roles of the participants, the master strategy, the evolutionary milestones, and an assessment of the state of the art of various key components. The second half highlights the progress made to date in extending the baseline Navier-Stokes solver to handle multiphase, multispecies, chemically reactive sub- to supersonic flows. The major hurdles to overcome in order to achieve significant speedups are delineated, and the approaches to overcoming them are discussed.

  18. Radiation hydrodynamics integrated in the PLUTO code

    NASA Astrophysics Data System (ADS)

    Kolb, Stefan M.; Stute, Matthias; Kley, Wilhelm; Mignone, Andrea

    2013-11-01

    Aims: The transport of energy through radiation is very important in many astrophysical phenomena. In dynamical problems the time-dependent equations of radiation hydrodynamics have to be solved. We present a newly developed radiation-hydrodynamics module specifically designed for the versatile magnetohydrodynamic (MHD) code PLUTO. Methods: The solver is based on the flux-limited diffusion approximation in the two-temperature approach. All equations are solved in the co-moving frame in the frequency-independent (gray) approximation. The hydrodynamics is solved by the different Godunov schemes implemented in PLUTO, and for the radiation transport we use a fully implicit scheme. The resulting system of linear equations is solved either using the successive over-relaxation (SOR) method (for testing purposes) or using matrix solvers that are available in the PETSc library. We state in detail the methodology and describe several test cases to verify the correctness of our implementation. The solver works in standard coordinate systems, such as Cartesian, cylindrical, and spherical, and also for non-equidistant grids. Results: We present a new radiation-hydrodynamics solver coupled to the MHD-code PLUTO that is a modern, versatile, and efficient new module for treating complex radiation hydrodynamical problems in astrophysics. As test cases, either purely radiative situations, or full radiation-hydrodynamical setups (including radiative shocks and convection in accretion disks) were successfully studied. The new module scales very well on parallel computers using MPI. For problems in star or planet formation, we added the possibility of irradiation by a central source.
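
    For reference, the gray flux-limited diffusion (FLD) equations in the two-temperature approach take the standard form below; the notation follows common textbook usage and may differ in detail from the paper's:

      \frac{\partial E_R}{\partial t}
        - \nabla\cdot\Big(\frac{c\,\lambda}{\kappa_R \rho}\,\nabla E_R\Big)
        = \kappa_P \rho c\,(a_R T^4 - E_R),
      \qquad
      \rho c_V \frac{\partial T}{\partial t}
        = -\kappa_P \rho c\,(a_R T^4 - E_R),

    where E_R is the radiation energy density, \lambda(R) the flux limiter, and \kappa_P, \kappa_R the Planck and Rosseland mean opacities.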

  19. Honor Codes: Evidence Based Strategies for Improving Academic Integrity

    ERIC Educational Resources Information Center

    Tatum, Holly; Schwartz, Beth M.

    2017-01-01

    Although there is evidence of cheating at all levels of education, institutions often do not implement or design integrity policies, such as honor codes, to prevent and adjudicate academic dishonesty. Further, faculty members rarely discuss academic integrity expectations or policies with their students. When cheating does occur, faculty members…

  1. Foundational development of an advanced nuclear reactor integrated safety code.

    SciTech Connect

    Clarno, Kevin; Lorber, Alfred Abraham; Pryor, Richard J.; Spotz, William F.; Schmidt, Rodney Cannon; Belcourt, Kenneth; Hooper, Russell Warren; Humphries, Larry LaRon

    2010-02-01

    This report describes the activities and results of a Sandia LDRD project whose objective was to develop and demonstrate foundational aspects of a next-generation nuclear reactor safety code that leverages advanced computational technology. The project scope was directed towards the systems-level modeling and simulation of an advanced, sodium cooled fast reactor, but the approach developed has a more general applicability. The major accomplishments of the LDRD are centered around the following two activities. (1) The development and testing of LIME, a Lightweight Integrating Multi-physics Environment for coupling codes that is designed to enable both 'legacy' and 'new' physics codes to be combined and strongly coupled using advanced nonlinear solution methods. (2) The development and initial demonstration of BRISC, a prototype next-generation nuclear reactor integrated safety code. BRISC leverages LIME to tightly couple the physics models in several different codes (written in a variety of languages) into one integrated package for simulating accident scenarios in a liquid sodium cooled 'burner' nuclear reactor. Other activities and accomplishments of the LDRD include (a) further development, application and demonstration of the 'non-linear elimination' strategy to enable physics codes that do not provide residuals to be incorporated into LIME, (b) significant extensions of the RIO CFD code capabilities, (c) complex 3D solid modeling and meshing of major fast reactor components and regions, and (d) an approach for multi-physics coupling across non-conformal mesh interfaces.
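
    The "strong coupling" idea can be pictured with a toy example: collect the residuals of two single-physics solvers into one nonlinear system and solve them simultaneously with a Newton-type method. The sketch below is a minimal stand-in under that assumption; the residual functions are invented, and none of this is the LIME or BRISC API.

      from scipy.optimize import root

      def thermal_residual(T, flow):
          # toy "code A" energy balance, given code B's flow state
          return T - 300.0 - 0.5 * flow

      def flow_residual(flow, T):
          # toy "code B" momentum balance, given code A's temperature
          return flow - 2.0 - 0.01 * T

      def coupled_residual(x):
          T, flow = x
          return [thermal_residual(T, flow), flow_residual(flow, T)]

      # Newton-type solve of both residuals at once ("strong" coupling),
      # rather than alternating the two codes until they stop changing.
      sol = root(coupled_residual, x0=[400.0, 1.0], method="hybr")
      print(sol.x)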

  2. Method of optical image coding by time integration

    NASA Astrophysics Data System (ADS)

    Evtikhiev, Nikolay N.; Starikov, Sergey N.; Cheryomkhin, Pavel A.; Krasnov, Vitaly V.; Rodin, Vladislav G.

    2012-06-01

    A method of optical image coding by time integration is proposed. Coding in the proposed method is accomplished by shifting the object image over the photosensor area of a digital camera during registration, which results in an optically calculated convolution of the original image with the shift trajectory. As opposed to optical coding methods based on the use of diffractive optical elements, the described coding method is feasible for implementation in totally incoherent light. The method was first tested using an LC monitor for image display and shifting: shifting of the object image is realized by displaying a video consisting of frames with the image to be encoded at different locations on the screen while registering it with the camera. Optical encoding and numerical decoding of test images were performed successfully. A more practical experimental implementation of the method, using the LCOS SLM Holoeye PLUTO VIS, was also realized. Object images to be encoded were formed in monochromatic spatially incoherent light. Shifting of the object image over the camera photosensor area was accomplished by displaying a video consisting of frames with blazed gratings on the LCOS SLM; each blazed grating deflects the light reflected from the SLM at a different angle. Results of optical image coding and of numerical restoration of the encoded images are presented, and the experimental results are compared with results of numerical modeling. Optical image coding with time integration could be used for accessible quality estimation of optical image coding using diffractive optical elements, or as an independent optical coding method that can be implemented in incoherent light.
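
    A minimal numerical analogue of the scheme, assuming (as the abstract states) that encoding amounts to convolving the image with the shift trajectory; the Wiener-style decoder below is a generic choice, not necessarily the authors':

      import numpy as np

      rng = np.random.default_rng(0)
      image = rng.random((64, 64))          # object image to encode

      kernel = np.zeros((64, 64))           # shift trajectory: short diagonal
      for i in range(8):
          kernel[i, i] = 1.0 / 8.0

      # Encode: the time-integrating sensor sums shifted copies, i.e. a
      # (circular, via FFT) convolution of image and trajectory kernel.
      K = np.fft.fft2(kernel)
      encoded = np.real(np.fft.ifft2(np.fft.fft2(image) * K))

      # Decode: regularized (Wiener-like) inverse filter; recovery is only
      # approximate near zeros of the kernel spectrum.
      eps = 1e-3
      decoded = np.real(np.fft.ifft2(np.fft.fft2(encoded) * np.conj(K)
                                     / (np.abs(K)**2 + eps)))
      print(np.max(np.abs(decoded - image)))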

  3. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    SciTech Connect

    Smith, F.; Flach, G.; BROWN, K.

    2013-06-01

    This report describes enhancements made to the code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing of standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) improved graphical display of model results; 2) improved error analysis and reporting; 3) an increase in the default maximum model mesh size from 301 to 501 nodes; and 4) the ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  4. Integration of channel coding and modulation in the ARQ protocols

    NASA Astrophysics Data System (ADS)

    Benelli, G.

    1988-12-01

    The integration of the modulation operation in the structure of some automatic-repeat-request protocols is analyzed. Continuous phase modulation schemes are considered together with block and convolutional codes. The optimum structure of channel coding and modulation is determined through a computer search. The results show that a net improvement in the throughput and error probability can be achieved by using this technique with respect to the classical protocols.
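
    The trade-off being optimized can be sketched with standard ARQ formulas (the parameters below are illustrative; the paper's actual schemes combine continuous phase modulation with block and convolutional codes):

      from math import comb

      def block_error_prob(n, t, p):
          """P(more than t bit errors in an n-bit block) -> retransmission."""
          return 1.0 - sum(comb(n, i) * p**i * (1 - p)**(n - i)
                           for i in range(t + 1))

      p = 1e-3          # channel bit-error rate (assumed)
      n, k = 127, 113   # block length and information bits (illustrative)
      for t in (0, 1, 2):   # t = 0: detection-only ARQ; t > 0: adds correction
          P = block_error_prob(n, t, p)
          print(f"t={t}: P(retx)={P:.4f}, "
                f"selective-repeat throughput={(k/n)*(1-P):.4f}")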

  5. Faculty and Academic Integrity: The Influence of Current Honor Codes and Past Honor Code Experiences.

    ERIC Educational Resources Information Center

    McCabe, Donald L.; Butterfield, Kenneth D.; Trevino, Linda Klebe

    2003-01-01

    Found that faculty at honor-code schools have more positive attitudes toward their schools' academic integrity policies and allow the system to take care of monitoring and disciplinary activities. Faculty in noncode institutions have less positive attitudes and are more likely to take personal actions designed to deal with cheaters. Faculty in…

  6. Integrating Renewable Energy Requirements Into Building Energy Codes

    SciTech Connect

    Kaufmann, John R.; Hand, James R.; Halverson, Mark A.

    2011-07-01

    This report evaluates how and when to best integrate renewable energy requirements into building energy codes. The basic goals were to: (1) provide a rough guide of where we’re going and how to get there; (2) identify key issues that need to be considered, including a discussion of various options with pros and cons, to help inform code deliberations; and (3) to help foster alignment among energy code-development organizations. The authors researched current approaches nationally and internationally, conducted a survey of key stakeholders to solicit input on various approaches, and evaluated the key issues related to integration of renewable energy requirements and various options to address those issues. The report concludes with recommendations and a plan to engage stakeholders. This report does not evaluate whether the use of renewable energy should be required on buildings; that question involves a political decision that is beyond the scope of this report.

  7. Beyond Honour Codes: Bringing Students into the Academic Integrity Equation

    ERIC Educational Resources Information Center

    Richards, Deborah; Saddiqui, Sonia; McGuigan, Nicholas; Homewood, Judi

    2016-01-01

    Honour codes represent a successful and unique, student-led, "bottom-up" approach to the promotion of academic integrity (AI). With increased flexibility, globalisation and distance or blended education options, most institutions operate in very different climates and cultures from the US institutions that have a long-established culture…

  8. Second Generation Integrated Composite Analyzer (ICAN) Computer Code

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Ginty, Carol A.; Sanfeliz, Jose G.

    1993-01-01

    This manual updates the original 1986 NASA TP-2515, Integrated Composite Analyzer (ICAN) Users and Programmers Manual. The various enhancements and newly added features are described to enable the user to prepare the appropriate input data to run this updated version of the ICAN code. For reference, the micromechanics equations are provided in an appendix and should be compared to those in the original manual for modifications. A complete output for a sample case is also provided in a separate appendix. The input to the code includes constituent material properties, factors reflecting the fabrication process, and laminate configuration. The code performs micromechanics, macromechanics, and laminate analyses, including the hygrothermal response of polymer-matrix-based fiber composites. The output includes the various ply and composite properties, the composite structural response, and the composite stress analysis results with details on failure. The code is written in FORTRAN 77 and can be used efficiently as a self-contained package (or as a module) in complex structural analysis programs. The input-output format has changed considerably from the original version of ICAN and is described extensively through the use of a sample problem.
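
    For orientation, the micromechanics step fills in ply-level properties from constituent properties and fiber volume fraction. The sketch below uses simple textbook rule-of-mixtures forms with made-up graphite/epoxy-like values; ICAN's own micromechanics equations (given in the report's appendix) are more refined:

      def ply_moduli(Ef1, Ef2, Em, Vf):
          """Longitudinal and transverse ply moduli from constituents."""
          Vm = 1.0 - Vf
          E11 = Ef1 * Vf + Em * Vm          # rule of mixtures (longitudinal)
          E22 = 1.0 / (Vf / Ef2 + Vm / Em)  # inverse rule (transverse)
          return E11, E22

      E11, E22 = ply_moduli(Ef1=230e9, Ef2=15e9, Em=3.5e9, Vf=0.6)
      print(f"E11 = {E11/1e9:.1f} GPa, E22 = {E22/1e9:.1f} GPa")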

  9. Hydrodynamic Instability, Integrated Code, Laboratory Astrophysics, and Astrophysics

    NASA Astrophysics Data System (ADS)

    Takabe, Hideaki

    This is an article for the memorial lecture of the Edward Teller Medal, presented at the IFSA03 conference held on September 12, 2003, in Monterey, CA. The author focuses on his main contributions to fusion science and its extension to astrophysics in the fields of theory and computation, picking up five topics. The first is the anomalous resistivity felt by hot electrons penetrating the over-dense region, caused by ion-wave turbulence driven by the return current that compensates the current flow of the hot electrons. It is concluded that a potential comparable to the average kinetic energy of the hot electrons is set up, preventing their penetration. The second is the ablative stabilization of the Rayleigh-Taylor instability at the ablation front and its dispersion relation, the so-called Takabe formula. This formula gave a principal guideline for stable target design. The author has developed an integrated code, ILESTA (1D & 2D), for analyses and design of laser-produced plasmas including implosion dynamics; it has also been applied to the design of high-gain targets. The third topic is the development of the integrated code ILESTA. The fourth is Laboratory Astrophysics with intense lasers; this consists of two parts: a review of its historical background, and a discussion of how laser plasma relates to wide-ranging astrophysics and of the purposes of promoting such research. In this connection, the author comments on anomalous transport of relativistic electrons in the Fast Ignition laser fusion scheme. Finally, recent activity on applying the author's experience to the development of an integrated code for studying extreme phenomena in astrophysics is briefly summarized.

  10. Hydrodynamic Instability, Integrated Code, Laboratory Astrophysics, and Astrophysics

    NASA Astrophysics Data System (ADS)

    Takabe, Hideaki

    2016-10-01

    This is an article for the memorial lecture of the Edward Teller Medal, presented at the IFSA03 conference held on September 12, 2003, in Monterey, CA. The author focuses on his main contributions to fusion science and its extension to astrophysics in the fields of theory and computation, picking up five topics. The first is the anomalous resistivity felt by hot electrons penetrating the over-dense region, caused by ion-wave turbulence driven by the return current that compensates the current flow of the hot electrons. It is concluded that a potential comparable to the average kinetic energy of the hot electrons is set up, preventing their penetration. The second is the ablative stabilization of the Rayleigh-Taylor instability at the ablation front and its dispersion relation, the so-called Takabe formula. This formula gave a principal guideline for stable target design. The author has developed an integrated code, ILESTA (1D & 2D), for analyses and design of laser-produced plasmas including implosion dynamics; it has also been applied to the design of high-gain targets. The third topic is the development of the integrated code ILESTA. The fourth is Laboratory Astrophysics with intense lasers; this consists of two parts: a review of its historical background, and a discussion of how laser plasma relates to wide-ranging astrophysics and of the purposes of promoting such research. In this connection, the author comments on anomalous transport of relativistic electrons in the Fast Ignition laser fusion scheme. Finally, recent activity on applying the author's experience to the development of an integrated code for studying extreme phenomena in astrophysics is briefly summarized.

  11. Automated Code Engine for Graphical Processing Units: Application to the Effective Core Potential Integrals and Gradients.

    PubMed

    Song, Chenchen; Wang, Lee-Ping; Martínez, Todd J

    2016-01-12

    We present an automated code engine (ACE) that automatically generates optimized kernels for computing integrals in electronic structure theory on a given graphical processing unit (GPU) computing platform. The code generator in ACE creates multiple code variants with different memory and floating point operation trade-offs. A graph representation is created as the foundation of the code generation, which allows the code generator to be extended to various types of integrals. The code optimizer in ACE determines the optimal code variant and GPU configurations for a given GPU computing platform by scanning over all possible code candidates and then choosing the best-performing code candidate for each kernel. We apply ACE to the optimization of effective core potential integrals and gradients. It is observed that the best code candidate varies with differing angular momentum, floating point precision, and type of GPU being used, which shows that the ACE may be a powerful tool in adapting to fast evolving GPU architectures.
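
    The selection step can be pictured with a tiny autotuning loop: generate candidate implementations, time each on the target platform, and keep the fastest. The sketch below times plain Python callables as stand-ins; real ACE kernels are generated GPU code, and nothing here is the actual ACE interface.

      import timeit

      def variant_listcomp(n):
          return [i * i for i in range(n)]

      def variant_map(n):
          return list(map(lambda i: i * i, range(n)))

      candidates = {"listcomp": variant_listcomp, "map": variant_map}

      # Scan all code variants and keep the best performer on this platform.
      timings = {name: timeit.timeit(lambda f=f: f(10_000), number=200)
                 for name, f in candidates.items()}
      best = min(timings, key=timings.get)
      print(timings, "-> selected:", best)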

  12. Interfacing modules for integrating discipline specific structural mechanics codes

    NASA Technical Reports Server (NTRS)

    Endres, Ned M.

    1989-01-01

    An outline of the organization and capabilities of the Engine Structures Computational Simulator (Simulator) at NASA Lewis Research Center is given. One of the goals of the research at Lewis is to integrate various discipline-specific structural mechanics codes into a software system which can be brought to bear effectively on a wide range of engineering problems. This system must be effective and efficient while remaining user friendly. The Simulator was initially designed for the finite element simulation of gas jet engine components. Currently, it is restricted to the analysis of high-pressure turbine blades and the accompanying rotor assembly, although the installation can be expanded for other applications. The Simulator assists the user throughout its procedures by performing information management tasks, executing external support tasks, organizing analysis modules, and executing these modules in the user-defined order while maintaining processing continuity.

  13. Committed to the Honor Code: An Investment Model Analysis of Academic Integrity

    ERIC Educational Resources Information Center

    Dix, Emily L.; Emery, Lydia F.; Le, Benjamin

    2014-01-01

    Educators worldwide face challenges surrounding academic integrity. The development of honor codes can promote academic integrity, but understanding how and why honor codes affect behavior is critical to their successful implementation. To date, research has not examined how students' "relationship" to an honor code predicts…

  14. Integration of the QMSFRG Database into the HZETRN Code

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Shavers, M. R.; Tripathi, R. K.; Wilson, J. W.

    2001-01-01

    Accurate nuclear interaction databases are needed for describing the transport of space radiation in matter, including spacecraft structures, atmospheres, and tissues. Transport models support the identification and development of new material concepts for human and electronic part protection. Quantum effects are manifested in nuclear reactions in several ways, including interference effects between terms in the multiple scattering series, the many-body nuclear wave functions (e.g., the roles of shell structure and Fermi momentum), and nuclear clustering. The quantum multiple scattering fragmentation model (QMSFRG) is a comprehensive model for generating nuclear interaction databases for galactic cosmic ray (GCR) transport. Other nuclear databases, including the NUCFRG model and Monte Carlo simulation codes such as FLUKA, LAHET, HETC, and GEANT, ignore quantum effects. These codes fail to describe many important features of nuclear reactions and are thus inaccurate for the evaluation of materials for radiation protection. Previously we have shown that quantum effects are manifested through constructive interference in forward production spectra, the effects of Fermi momentum on production spectra, cluster nuclei knockout, and the nuclear response function. Quantum effects are especially important for heavy ions with mass numbers less than 20 that dominate radiation transport in human tissues and for the materials that are expected to be superior in space radiation protection. We describe the integration of the QMSFRG model into the HZETRN transport code. Integration milestones include proper treatment of odd-even charge-mass effects in nuclear fragmentation and the momentum distribution of nucleon production from GCR primary heavy ions. We have also modified the two-body amplitudes in the model to include nuclear medium effects. In order to include a comprehensive description of the GCR isotopic composition in materials, we have described the isotopic composition
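
    For context, HZETRN solves the Boltzmann equation in the straight-ahead, continuous-slowing-down approximation, commonly written as follows (a standard form from the HZETRN literature; notation varies):

      \left[\frac{\partial}{\partial x}
            - \frac{1}{A_j}\frac{\partial}{\partial E}\,S_j(E)
            + \sigma_j(E)\right]\phi_j(x,E)
        = \sum_{k}\int_E^{\infty}\sigma_{jk}(E,E')\,\phi_k(x,E')\,dE',

    where \phi_j is the flux of ion species j, S_j the stopping power, \sigma_j the total cross section, and \sigma_{jk} the production cross sections; the \sigma_{jk} are the quantities the QMSFRG database supplies.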

  15. Code System to Calculate Integral Parameters with Reaction Rates from WIMS Output.

    SciTech Connect

    LESZCZYNSKI, FRANCISCO

    1994-10-25

    Version 00. REACTION calculates different integral parameters related to neutron reactions in reactor lattices from reaction rates calculated with the WIMSD4 code, and compares them with experimental values.

  16. HCPCS Coding: An Integral Part of Your Reimbursement Strategy

    PubMed Central

    Nusgart, Marcia

    2013-01-01

    The first step to a successful reimbursement strategy is to ensure that your wound care product has the most appropriate Healthcare Common Procedure Coding System (HCPCS) code (or billing) for your product. The correct HCPCS code plays an essential role in patient access to new and existing technologies. When devising a strategy to obtain a HCPCS code for its product, companies must consider a number of factors as follows: (1) Has the product gone through the Food and Drug Administration (FDA) regulatory process or does it need to do so? Will the FDA code designation impact which HCPCS code will be assigned to your product? (2) In what “site of service” do you intend to market your product? Where will your customers use the product? Which coding system (CPT® or HCPCS) applies to your product? (3) Does a HCPCS code for a similar product already exist? Does your product fit under the existing HCPCS code? (4) Does your product need a new HCPCS code? What is the linkage, if any, between coding, payment, and coverage for the product? Researchers and companies need to start early and place the same emphasis on a reimbursement strategy as it does on a regulatory strategy. Your reimbursement strategy staff should be involved early in the process, preferably during product research and development and clinical trial discussions. PMID:24761331

  17. HCPCS Coding: An Integral Part of Your Reimbursement Strategy.

    PubMed

    Nusgart, Marcia

    2013-12-01

    The first step to a successful reimbursement strategy is to ensure that your wound care product has the most appropriate Healthcare Common Procedure Coding System (HCPCS) code (or billing) for your product. The correct HCPCS code plays an essential role in patient access to new and existing technologies. When devising a strategy to obtain a HCPCS code for its product, companies must consider a number of factors as follows: (1) Has the product gone through the Food and Drug Administration (FDA) regulatory process or does it need to do so? Will the FDA code designation impact which HCPCS code will be assigned to your product? (2) In what "site of service" do you intend to market your product? Where will your customers use the product? Which coding system (CPT(®) or HCPCS) applies to your product? (3) Does a HCPCS code for a similar product already exist? Does your product fit under the existing HCPCS code? (4) Does your product need a new HCPCS code? What is the linkage, if any, between coding, payment, and coverage for the product? Researchers and companies need to start early and place the same emphasis on a reimbursement strategy as it does on a regulatory strategy. Your reimbursement strategy staff should be involved early in the process, preferably during product research and development and clinical trial discussions.

  18. [Usefulness of nuclear medicine extension code keeping the integrity with JJ1017].

    PubMed

    Shibutani, Takayuki; Tsushima, Hiroyuki; Shimizu, Keiji; Hanaoka, Kohei; Matsuda, Shigeo; Jinguji, Koji; Sakurai, Minoru; Katou, Seiji; Takeda, Satoru; Kuwano, Tadao; Fujisawa, Ichiro; Takehana, Kazuya; Oku, Shinya

    2013-02-01

    The working group on the JJ1017 nuclear medicine domain extension code in the Japanese Society of Nuclear Medicine has created nuclear medicine extension codes that keep integrity with JJ1017. The objective of this study was to investigate the usefulness of the nuclear medicine extension codes in real clinical settings. Nuclear medicine examinations of each institution were extracted from the examination master table, and the target subset of examinations to be coded with JJ1017 was identified. For this subset, a coding process was conducted, during which the conformity rate, the application rate of the representative frequently-used code set, and the compliance rate of the nuclear medicine extension codes were compared. Without the representative frequently-used code set, it was difficult to produce the same code for the same examination; with it, the same code expression could be produced for the same examination. Furthermore, by additionally using the nuclear medicine extension codes, examinations that could not be appropriately coded with the representative frequently-used code set alone could be coded. The nuclear medicine extension codes keeping integrity with JJ1017 proved useful for improving the accuracy of coding.

  19. A new 3-D integral code for computation of accelerator magnets

    SciTech Connect

    Turner, L.R.; Kettunen, L.

    1991-01-01

    For computing accelerator magnets, integral codes have several advantages over finite element codes: far-field boundaries are treated automatically, and computed fields in the bore region satisfy Maxwell's equations exactly. A new integral code employing edge elements rather than nodal elements has overcome the difficulties associated with earlier integral codes. By the use of field integrals (potential differences) as solution variables, the number of unknowns is reduced to one less than the number of nodes. Two examples, a hollow iron sphere and the dipole magnet of the Advanced Photon Source injector synchrotron, show the capability of the code. The CPU time requirements are comparable to those of three-dimensional (3-D) finite-element codes. Experiments show that in practice it can realize much of the potential CPU time saving that parallel processing makes possible. 8 refs., 4 figs., 1 tab.

  1. Pre-Service Teachers' Perception of Quick Response (QR) Code Integration in Classroom Activities

    ERIC Educational Resources Information Center

    Ali, Nagla; Santos, Ieda M.; Areepattamannil, Shaljan

    2017-01-01

    Quick Response (QR) codes have been discussed in the literature as adding value to teaching and learning. Despite their potential in education, more research is needed to inform practice and advance knowledge in this field. This paper investigated the integration of the QR code in classroom activities and the perceptions of the integration by…

  2. Boltzmann Transport Code Update: Parallelization and Integrated Design Updates

    NASA Technical Reports Server (NTRS)

    Heinbockel, J. H.; Nealy, J. E.; DeAngelis, G.; Feldman, G. A.; Chokshi, S.

    2003-01-01

    The ongoing effort at developing a web site for radiation analysis is expected to result in increased usage of the High Charge and Energy Transport Code HZETRN, so the requested calculations should be performed quickly and efficiently. Therefore the question arose, "Could the implementation of parallel processing speed up the calculations required?" To answer this question, two modifications of the HZETRN computer code were created. The first modification selected the shield materials Al(2219), then polyethylene, and then Al(2219); this modified Fortran code was labeled 1SSTRN.F. The second modification considered the shield materials CO2 and Martian regolith; this modified Fortran code was labeled MARSTRN.F.

  3. Neutronic calculation of fast reactors by the EUCLID/V1 integrated code

    NASA Astrophysics Data System (ADS)

    Koltashev, D. A.; Stakhanova, A. A.

    2017-01-01

    This article considers the neutronic calculation of the fast-neutron lead-cooled reactor BREST-OD-300 with the EUCLID/V1 integrated code. The main goal of the development and application of integrated codes is nuclear power plant safety justification. EUCLID/V1 is an integrated code designed for coupled neutronic, thermomechanical, and thermohydraulic fast reactor calculations under normal and abnormal operating conditions; it is being developed in the Nuclear Safety Institute of the Russian Academy of Sciences. The integrated code has a modular structure and consists of three main modules: the thermohydraulic module HYDRA-IBRAE/LM/V1, the thermomechanical module BERKUT, and the neutronic module DN3D. In addition, the integrated code includes databases with fuel, coolant, and structural material properties. The neutronic module DN3D provides full-scale simulation of neutronic processes in fast reactors: heat source distributions, control rod movement, reactivity changes, and other processes can be simulated. The neutron transport equation is solved in the multigroup diffusion approximation. This paper contains calculations implemented as part of EUCLID/V1 code validation: a BREST-OD-300 transient simulation (fuel assembly floating, decompression of a passive feedback system channel) and cross-validation against MCU-FR code results. The calculations demonstrate the application of the EUCLID/V1 code to BREST-OD-300 simulation and safety justification.
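
    For reference, the multigroup diffusion approximation mentioned above solves, for each group g = 1, ..., G (standard form; EUCLID/V1's exact discretization is not given in the abstract):

      -\nabla\cdot D_g\nabla\phi_g + \Sigma_{r,g}\,\phi_g
        = \sum_{g'\neq g}\Sigma_{s,\,g'\to g}\,\phi_{g'}
          + \frac{\chi_g}{k_{\mathrm{eff}}}\sum_{g'}\nu\Sigma_{f,g'}\,\phi_{g'},

    with D_g the diffusion coefficient, \Sigma_{r,g} the removal cross section, \Sigma_{s,g'\to g} the scattering matrix, \chi_g the fission spectrum, and k_{\mathrm{eff}} the effective multiplication factor.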

  4. A general multiblock Euler code for propulsion integration. Volume 3: User guide for the Euler code

    NASA Technical Reports Server (NTRS)

    Chen, H. C.; Su, T. Y.; Kao, T. J.

    1991-01-01

    This manual explains the procedures for using the general multiblock Euler (GMBE) code developed under NASA contract NAS1-18703. The code was developed for the aerodynamic analysis of geometrically complex configurations in either free air or wind tunnel environments (vol. 1). The complete flow field is divided into a number of topologically simple blocks within each of which surface fitted grids and efficient flow solution algorithms can easily be constructed. The multiblock field grid is generated with the BCON procedure described in volume 2. The GMBE utilizes a finite volume formulation with an explicit time stepping scheme to solve the Euler equations. A multiblock version of the multigrid method was developed to accelerate the convergence of the calculations. This user guide provides information on the GMBE code, including input data preparations with sample input files and a sample Unix script for program execution in the UNICOS environment.

  5. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    SciTech Connect

    Smith, L.M.; Hochstedler, R.D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
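
    The first acceleration technique named above, replacing linear searches with binary versions, is easy to illustrate (the energy grid and lookup below are invented for the example):

      import bisect

      energy_grid = [0.001 * 1.26**i for i in range(200)]  # sorted grid (MeV)

      def linear_index(grid, e):
          """O(n) scan: index of the last grid point <= e."""
          for i, g in enumerate(grid):
              if g > e:
                  return i - 1
          return len(grid) - 1

      def binary_index(grid, e):
          """O(log n) equivalent using binary search."""
          return bisect.bisect_right(grid, e) - 1

      e = 2.5
      assert linear_index(energy_grid, e) == binary_index(energy_grid, e)
      print(binary_index(energy_grid, e))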

  6. SPACE code simulation of cold leg small break LOCA in the ATLAS integral test

    SciTech Connect

    Kim, B. J.; Kim, H. T.; Kim, J.; Kim, K. D.

    2012-07-01

    The SPACE code is a system analysis code for pressurized water reactors that uses a two-fluid, three-field model. For a few years, intensive validation has been performed to secure the prediction accuracy of its models and correlations for two-phase flow and heat transfer, and version 1.0 of the code was recently released. This study examines how well the SPACE code predicts the thermal-hydraulic phenomena of an integral effect test. The target experiment is a cold leg small break LOCA in the ATLAS facility, which has the same two-loop features as APR1400. Predicted parameters were compared with experimental observations. (authors)
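
    For reference, the phasic mass balance of a generic two-fluid model has the standard form below (not necessarily SPACE's exact equation set):

      \frac{\partial}{\partial t}(\alpha_k\rho_k)
        + \nabla\cdot(\alpha_k\rho_k\,\mathbf{u}_k) = \Gamma_k,

    where \alpha_k, \rho_k, \mathbf{u}_k, and \Gamma_k are the volume fraction, density, velocity, and interphase mass transfer of phase k; momentum and energy balances take analogous forms, and the third field carries dispersed droplets separately from the continuous liquid.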

  7. Validating the BISON fuel performance code to integral LWR experiments

    DOE PAGES

    Williamson, R. L.; Gamble, K. A.; Perez, D. M.; ...

    2016-03-24

    BISON is a modern finite element-based nuclear fuel performance code that has been under development at the Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Our results demonstrate that 1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable, with deviations between predictions and experimental data within ±10% for early life through high burnup fuel and only slightly out of these bounds for power ramp experiments; 2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties; and 3) comparison of rod diameter results indicates a tendency to overpredict clad diameter reduction early in life, when clad creepdown dominates, and to more significantly overpredict the diameter increase late in life, when fuel expansion controls the mechanical response. The initial rod diameter comparisons were unsatisfactory and have led to consideration of additional separate effects experiments to better understand and predict clad and fuel mechanical behavior. Results from this study are being used to

  8. Validating the BISON fuel performance code to integral LWR experiments

    SciTech Connect

    Williamson, R. L.; Gamble, K. A.; Perez, D. M.; Novascone, S. R.; Pastore, G.; Gardner, R. J.; Hales, J. D.; Liu, W.; Mai, A.

    2016-03-24

    BISON is a modern finite element-based nuclear fuel performance code that has been under development at the Idaho National Laboratory (INL) since 2009. The code is applicable to both steady and transient fuel behavior and has been used to analyze a variety of fuel forms in 1D spherical, 2D axisymmetric, or 3D geometries. Code validation is underway and is the subject of this study. A brief overview of BISON's computational framework, governing equations, and general material and behavioral models is provided. BISON code and solution verification procedures are described, followed by a summary of the experimental data used to date for validation of Light Water Reactor (LWR) fuel. Validation comparisons focus on fuel centerline temperature, fission gas release, and rod diameter both before and following fuel-clad mechanical contact. Comparisons for 35 LWR rods are consolidated to provide an overall view of how the code is predicting physical behavior, with a few select validation cases discussed in greater detail. Our results demonstrate that 1) fuel centerline temperature comparisons through all phases of fuel life are very reasonable, with deviations between predictions and experimental data within ±10% for early life through high burnup fuel and only slightly out of these bounds for power ramp experiments; 2) accuracy in predicting fission gas release appears to be consistent with state-of-the-art modeling and with the involved uncertainties; and 3) comparison of rod diameter results indicates a tendency to overpredict clad diameter reduction early in life, when clad creepdown dominates, and to more significantly overpredict the diameter increase late in life, when fuel expansion controls the mechanical response. The initial rod diameter comparisons were unsatisfactory and have led to consideration of additional separate effects experiments to better understand and predict clad and fuel mechanical behavior. Results from this study are being used to define
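
    The ±10% centerline-temperature comparison described above reduces to a simple relative-deviation check; a sketch with invented numbers:

      # Relative deviation of predicted vs. measured values against a
      # +/-10% acceptance band (data values are illustrative, not BISON's).
      predicted = [1210.0, 1485.0, 990.0]
      measured = [1150.0, 1420.0, 1065.0]

      for p, m in zip(predicted, measured):
          dev = (p - m) / m
          print(f"deviation = {dev:+.1%}, within 10%: {abs(dev) <= 0.10}")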

  9. Talking about Code: Integrating Pedagogical Code Reviews into Early Computing Courses

    ERIC Educational Resources Information Center

    Hundhausen, Christopher D.; Agrawal, Anukrati; Agarwal, Pawan

    2013-01-01

    Given the increasing importance of soft skills in the computing profession, there is good reason to provide students with more opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have developed an active learning approach for computing education called the "Pedagogical Code Review"…

  11. Deciphering the Code for Retroviral Integration Target Site Selection

    PubMed Central

    Santoni, Federico Andrea; Hartley, Oliver; Luban, Jeremy

    2010-01-01

    Upon cell invasion, retroviruses generate a DNA copy of their RNA genome and integrate retroviral cDNA within host chromosomal DNA. Integration occurs throughout the host cell genome, but target site selection is not random. Each subgroup of retrovirus is distinguished from the others by attraction to particular features on chromosomes. Despite extensive efforts to identify host factors that interact with retrovirion components or chromosome features predictive of integration, little is known about how integration sites are selected. We attempted to identify markers predictive of retroviral integration by exploiting Precision-Recall methods for extracting information from highly skewed datasets to derive robust and discriminating measures of association. ChIPSeq datasets for more than 60 factors were compared with 14 retroviral integration datasets. When compared with MLV, PERV or XMRV integration sites, strong association was observed with STAT1, acetylation of H3 and H4 at several positions, and methylation of H2AZ, H3K4, and K9. By combining peaks from ChIPSeq datasets, a supermarker was identified that localized within 2 kb of 75% of MLV proviruses and detected differences in integration preferences among different cell types. The supermarker predicted the likelihood of integration within specific chromosomal regions in a cell-type specific manner, yielding probabilities for integration into proto-oncogene LMO2 identical to experimentally determined values. The supermarker thus identifies chromosomal features highly favored for retroviral integration, provides clues to the mechanism by which retrovirus integration sites are selected, and offers a tool for predicting cell-type specific proto-oncogene activation by retroviruses. PMID:21124862
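
    For orientation, here is a minimal sketch, not the authors' pipeline, of the proximity test underlying such an analysis: given sorted genomic positions of marker peaks and integration sites, compute the fraction of sites falling within 2 kb of a pooled ("supermarker") peak set. All names and data below are hypothetical stand-ins.

      import numpy as np

      def fraction_within(sites, peaks, max_dist=2_000):
          """Fraction of sites within max_dist bp of any peak; sites and peaks
          are sorted 1-D position arrays on one chromosome (a real analysis
          would work per chromosome, e.g. with interval trees)."""
          idx = np.searchsorted(peaks, sites)
          left = peaks[np.clip(idx - 1, 0, len(peaks) - 1)]
          right = peaks[np.clip(idx, 0, len(peaks) - 1)]
          nearest = np.minimum(np.abs(sites - left), np.abs(sites - right))
          return float(np.mean(nearest <= max_dist))

      rng = np.random.default_rng(0)
      h3ac_peaks = np.sort(rng.integers(0, 1_000_000, 300))   # one ChIP-Seq track
      stat1_peaks = np.sort(rng.integers(0, 1_000_000, 300))  # another track
      supermarker = np.sort(np.concatenate([h3ac_peaks, stat1_peaks]))
      mlv_sites = np.sort(rng.integers(0, 1_000_000, 100))    # integration sites
      print(f"sites within 2 kb: {fraction_within(mlv_sites, supermarker):.2f}")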

  12. Imaging with the coded aperture gamma-ray spectrometer SPI aboard INTEGRAL

    NASA Astrophysics Data System (ADS)

    Wunderer, Cornelia B.; Strong, Andrew W.; Attie, David; von Ballmoos, Peter; Connell, Paul; Cordier, Bertrand; Diehl, Roland; Hammer, J. Wolfgang; Jean, Pierre; von Kienlin, Andreas; Knoedlseder, Juergen; Lichti, Giselher G.; Mandrou, Pierre; Paul, Jaques; Paul, Philippe; Reglero, Victor; Roques, Jean-Pierre; Sanchez, Filomeno; Schanne, Stephane; Schoenfelder, Volker; Shrader, Chris; Skinner, Gerald K.; Sturner, Steven J.; Teegarden, Bonnard J.; Vedrenne, Gilbert; Weidenspointner, Georg

    2003-03-01

    ESA's INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) will be launched in October 2002. Its two main instruments are the imager IBIS and the spectrometer SPI. Both employ coded apertures to obtain directional information on the incoming radiation. SPI's detection plane consists of 19 hexagonal Ge detectors, its coded aperture has 63 tungsten-alloy elements of 30 mm thickness.

  13. Development of an integrated thermal-hydraulics capability incorporating RELAP5 and PANTHER neutronics code

    SciTech Connect

    Page, R.; Jones, J.R.

    1997-07-01

    Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation where the integration of plant dynamic, core follow and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code TALINK (Transient Analysis code LINKage program) used to provide a flexible interface to link the RELAP5 thermal hydraulics code with the PANTHER neutron kinetics and the SIBDYM whole plant dynamic modelling codes used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole plant thermal-hydraulics and neutron kinetics model. In addition the paper discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package and the details of the calculation of a postulated Sizewell 'B' loss of offsite power fault transient.

  14. Experimental assessment of computer codes used for safety analysis of integral reactors

    SciTech Connect

    Falkov, A.A.; Kuul, V.S.; Samoilov, O.B.

    1995-09-01

    Peculiarities of integral reactor thermohydraulics in accidents are associated with the presence of noncondensable gas in the built-in pressurizer, the absence of a pumped ECCS, the use of a guard vessel for LOCA localisation, and passive RHRS through in-reactor HXs. These features defined the main trends in the experimental investigations and in the verification efforts for the computer codes applied. The paper briefly reviews the experimental investigations of the thermohydraulics of the AST-500 and VPBER600-type integral reactors. The characteristics of the UROVEN/MB-3 code for LOCA analysis in integral reactors and the results of its verification are given. The assessment of the applicability of RELAP5/mod3 for accident analysis in integral reactors is presented.

  15. EMdeCODE: a novel algorithm capable of reading words of epigenetic code to predict enhancers and retroviral integration sites and to identify H3R2me1 as a distinctive mark of coding versus non-coding genes

    PubMed Central

    Santoni, Federico Andrea

    2013-01-01

    Existence of some extra-genetic (epigenetic) codes has been postulated since the discovery of the primary genetic code. Evident effects of histone post-translational modifications or DNA methylation on the efficiency and regulation of DNA processes support this postulation. EMdeCODE is an original algorithm that approximates the genomic distribution of given DNA features (e.g. promoter, enhancer, viral integration) by identifying relevant ChIPSeq profiles of post-translational histone marks or DNA binding proteins and combining them in a supermark. The EMdeCODE kernel is essentially a two-step procedure: (i) an expectation-maximization process calculates the mixture of epigenetic factors that maximizes the Sensitivity (recall) of the association with the feature under study; (ii) the approximated density is then recursively trimmed with respect to a control dataset to increase the precision by reducing the number of false positives. EMdeCODE densities significantly improve the prediction of enhancer loci and retroviral integration sites with respect to previous methods. Importantly, it can also be used to extract distinctive factors between two arbitrary conditions. Indeed EMdeCODE identifies unexpected epigenetic profiles specific for coding versus non-coding RNA, pointing towards a new role for H3R2me1 in coding regions. PMID:23234700

  16. Integrated Codes Model for Erosion-Deposition in Long Discharges

    SciTech Connect

    Hogan, John T

    2006-08-01

    There is increasing interest in understanding the mechanisms causing the deuterium retention rates which are observed in the longest high power tokamak discharges, and their possible relation to near-term choices which must be made for plasma-facing components in next generation devices [1]. Both co-deposition and bulk diffusion models are regarded as potentially relevant. This contribution describes a global model for the co-deposition axis of this dilemma, which includes as many of the relevant processes that may contribute to it as is computationally feasible, following the 'maximal ordering / minimal simplification' strategy described in Kruskal's "Asymptotology" [2]. The global model is interpretative, meaning that some key information describing the bulk plasma is provided by experimental measurement, and the models for the impurity processes relevant to retention, given this measured background, are simulated and compared with other data. In particular, the model describes the carbon balance in near steady-state systems, to be able to understand the relation between retention in present devices and the level which might be expected in fusion reactors, or precursor experiments such as ITER. The key modules of the global system describe impurity generation, their transport in and through the SOL, and core impurity transport. The codes IMPFLU, BBQ, and ITC/MIST, in order of the appearance of the processes they describe, are used to calculate the balance: IMPFLU is an adaptation of the TOKAFLU module of CAST3M [3], developed by CEA, which is a 3-D, time-dependent finite elements code which determines the thermal and mechanical properties of plasma-facing components. BBQ [4, 5] is a Monte Carlo guiding center code which describes trace impurity transport in a 3-D defined-plasma background, to calculate observables (line emission) for comparison with spectroscopy. ITC [6] and MIST [7] are radial core multi-species impurity transport codes. The modules are linked

  17. Whose Code of Conduct Matters Most? Examining the Link between Academic Integrity and Student Development

    ERIC Educational Resources Information Center

    Biswas, Ann E.

    2013-01-01

    Although most colleges strive to nurture a culture of integrity, incidents of dishonest behavior are on the rise. This article examines the role student development plays in students' perceptions of academic dishonesty and in their willingness to adhere to a code of conduct that may be in sharp contrast to traditional integrity policies.

  18. A System for Coding Integration and Differentiation Messages (SID): Operationalizing Inclusion.

    ERIC Educational Resources Information Center

    Krueger, Dorothy Lenk

    A "System for coding Integration and Differentiation messages" in group communication (SID) was developed, based on the theoretical and empirical work of W. C. Schutz, W. Bennis and H. Shepard, and A. Koestler. In SID, integrating messages are defined as those dealing with material internal to the group and having positive affect, or…

  19. Data integration of structured and unstructured sources for assigning clinical codes to patient stays

    PubMed Central

    Luyckx, Kim; Luyten, Léon; Daelemans, Walter; Van den Bulcke, Tim

    2016-01-01

    Objective: Enormous amounts of healthcare data are becoming increasingly accessible through the large-scale adoption of electronic health records. In this work, structured and unstructured (textual) data are combined to assign clinical diagnostic and procedural codes (specifically ICD-9-CM) to patient stays. We investigate whether integrating these heterogeneous data types improves prediction strength compared to using the data types in isolation. Methods: Two separate data integration approaches were evaluated. Early data integration combines features of several sources within a single model, and late data integration learns a separate model per data source and combines these predictions with a meta-learner. This is evaluated on data sources and clinical codes from a broad set of medical specialties. Results: When compared with the best individual prediction source, late data integration leads to improvements in predictive power (eg, overall F-measure increased from 30.6% to 38.3% for International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes), while early data integration is less consistent. The predictive strength strongly differs between medical specialties, both for ICD-9-CM diagnostic and procedural codes. Discussion: Structured data provides complementary information to unstructured data (and vice versa) for predicting ICD-9-CM codes. This can be captured most effectively by the proposed late data integration approach. Conclusions: We demonstrated that models using multiple electronic health record data sources systematically outperform models using data sources in isolation in the task of predicting ICD-9-CM codes over a broad range of medical specialties. PMID:26316458

  20. Data integration of structured and unstructured sources for assigning clinical codes to patient stays.

    PubMed

    Scheurwegs, Elyne; Luyckx, Kim; Luyten, Léon; Daelemans, Walter; Van den Bulcke, Tim

    2016-04-01

    Enormous amounts of healthcare data are becoming increasingly accessible through the large-scale adoption of electronic health records. In this work, structured and unstructured (textual) data are combined to assign clinical diagnostic and procedural codes (specifically ICD-9-CM) to patient stays. We investigate whether integrating these heterogeneous data types improves prediction strength compared to using the data types in isolation. Two separate data integration approaches were evaluated. Early data integration combines features of several sources within a single model, and late data integration learns a separate model per data source and combines these predictions with a meta-learner. This is evaluated on data sources and clinical codes from a broad set of medical specialties. When compared with the best individual prediction source, late data integration leads to improvements in predictive power (eg, overall F-measure increased from 30.6% to 38.3% for International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) diagnostic codes), while early data integration is less consistent. The predictive strength strongly differs between medical specialties, both for ICD-9-CM diagnostic and procedural codes. Structured data provides complementary information to unstructured data (and vice versa) for predicting ICD-9-CM codes. This can be captured most effectively by the proposed late data integration approach. We demonstrated that models using multiple electronic health record data sources systematically outperform models using data sources in isolation in the task of predicting ICD-9-CM codes over a broad range of medical specialties.
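
    For orientation, a minimal sketch of the late data integration idea described above, assuming a scikit-learn setting with synthetic stand-ins for the structured and textual feature sets (all names and data are hypothetical, not from the paper):

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_predict

      # Synthetic stand-ins for one ICD-9-CM code: a structured source
      # (e.g. labs/medications) and an unstructured source (text features).
      rng = np.random.default_rng(42)
      y = rng.integers(0, 2, 500)
      X_structured = rng.normal(size=(500, 20)) + 0.3 * y[:, None]
      X_text = rng.normal(size=(500, 100)) + 0.2 * y[:, None]

      # Late integration: one model per source; out-of-fold probabilities
      # become the inputs of a meta-learner.
      meta_features = np.column_stack([
          cross_val_predict(LogisticRegression(max_iter=1000), X, y,
                            cv=5, method="predict_proba")[:, 1]
          for X in (X_structured, X_text)
      ])
      meta_learner = LogisticRegression().fit(meta_features, y)

      # Early integration, for contrast: concatenate all features in one model.
      early_model = LogisticRegression(max_iter=1000).fit(
          np.hstack([X_structured, X_text]), y)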

  1. Transport analysis in toroidal helical plasmas using the integrated code: TASK3D

    NASA Astrophysics Data System (ADS)

    Wakasa, A.; Fukuyama, A.; Murakami, S.; Beidler, C. D.; Maassberg, H.; Yokoyama, M.; Sato, M.

    2009-11-01

    The integrated simulation code for helical plasmas, TASK3D, is being developed on the basis of TASK, an integrated modeling code for tokamak plasmas. In helical systems, neoclassical transport is one of the important issues in addition to anomalous transport, because of the strong temperature dependence of the heat conductivity and its important role in determining the radial electric field. We have already constructed a neoclassical transport database for LHD, DGN/LHD. The mono-energetic diffusion coefficients are evaluated by the DCOM code based on the Monte Carlo method, and the mono-energetic diffusion coefficient database is constructed using a neural network technique. We also apply the GSRAKE code, which solves the ripple-averaged drift kinetic equation, to obtain transport coefficients in the highly collisionless regime. We have newly incorporated the DGN/LHD module into TASK3D. We will present several results of transport simulation in typical LHD plasmas.

  2. Final Report. An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group

    SciTech Connect

    Rosenthal, Andrew

    2013-12-30

    The DOE grant, “An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group,” to New Mexico State University created the Solar America Board for Codes and Standards (Solar ABCs). From 2007 to 2013, with funding from this grant, Solar ABCs identified current issues, established a dialogue among key stakeholders, and catalyzed appropriate activities to support the development of codes and standards that facilitated the installation of high-quality, safe photovoltaic systems. Solar ABCs brought the following resources to the PV stakeholder community: formal coordination in the planning or revision of interrelated codes and standards, removing “stove pipes” in which only roofing experts work on roofing codes, PV experts on PV codes, fire enforcement experts on fire codes, etc.; a conduit through which all interested stakeholders were able to see the steps being taken in the development or modification of codes and standards and to participate directly in the processes; a central clearing house for new documents, standards, proposed standards, analytical studies, and recommendations of best practices available to the PV community; a forum of experts that invited and welcomed all interested parties into the process of performing studies, evaluating results, and building consensus on standards and code-related topics affecting all aspects of the market; and a biennial gap analysis to formally survey the PV community to identify unmet needs that inhibit the market and necessary technical developments.

  3. Design and realization of the miniature long-life integrative coded sun sensor

    NASA Astrophysics Data System (ADS)

    Mo, Yanan; Cui, Jian; Zhao, Yuan; Chen, Ran; Liu, Xin

    2013-10-01

    This paper describes the research activity at the Beijing Institute of Control Engineering on the miniature long-life integrative coded sun sensor. The light system of the miniature coded sun sensor is composed of a semi-column silex glass, a silex cube with a coded pattern on its bottom, and an integrative silicon battery with 14 cells. Sunlight passing through the slit of the light system forms a light spot on the coded plate, and the sensor determines the orientation of the sun from the position of this spot. Because the diameter of the coded plate is limited, an accuracy of only 0.5° can be realized with the 8-bit coarse code over a FOV of 124°. To achieve a high accuracy of 0.05°, a subdivision technique must be adopted. The main scheme of the miniature long-life integrative coded sun sensor is to integrate the light system and the signal processing circuits in one mechanical housing, using an FPGA to calculate the angle, generate the control signals of the multiplexer and A/D converter, and realize the UART function; using a flexible board to connect the analog and digital boards; using the secondary power of the satellite; and using an RS422 interface to communicate with the central computer. The performance of the miniature long-life integrative coded sun sensor is as follows: FOV 124° × 124°, accuracy 0.05° (3σ), resolution 14″, power consumption 0.5 W, update rate 40 Hz, mass 475 g, designed lifetime 15 years. It has been adopted in the new platform of the Remote Sensing Satellite of CAST. The first flight will be in 2015.
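
    A quick back-of-envelope check of the quoted figures (not taken from the paper itself):

      # An 8-bit code across a 124-degree field of view gives a coarse step of
      # about 0.48 degrees, consistent with the stated 0.5-degree coarse
      # accuracy; reaching 0.05 degrees therefore needs roughly 10x
      # subdivision of the spot position between code steps.
      fov_deg = 124.0
      coarse_step = fov_deg / 2**8
      print(f"coarse step: {coarse_step:.2f} deg")              # ~0.48 deg
      print(f"subdivision factor: {coarse_step / 0.05:.0f}x")   # ~10x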

  4. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    SciTech Connect

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.; Dewers, Thomas A.; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Wang, Yifeng; Schultz, Peter Andrew

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  5. BAYESIAN CALIBRATION OF SAFETY CODES USING DATA FROM SEPARATE- AND INTEGRAL EFFECTS TESTS

    SciTech Connect

    Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert

    2015-04-01

    Large-scale system codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. In order to be able to use the results of these simulation codes with confidence, it is important to learn how the uncertainty on the values of these parameters affects the output of the codes. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the safety code, and thereby improves its support for decision-making. Modern analysis capabilities afford very significant improvements on classical ways of doing calibration, and the work reported here implements some of those improvements. The key innovation has come from development of safety code surrogate model (code emulator) construction and prediction algorithms. A surrogate is needed for calibration of plant-scale simulation codes because the multivariate nature of the problem (i.e., the need to adjust multiple uncertain parameters at once to fit multiple pieces of new information) calls for multiple evaluations of performance, which, for a computation-intensive model, makes calibration very computation-intensive. Use of a fast surrogate makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. Moreover, most traditional surrogates do not provide uncertainty information along with their predictions, but the Gaussian Process (GP) based code surrogates used here do. This improves the soundness of the code calibration process. Results are demonstrated on a simplified scenario with data from Separate and Integral Effect Tests.
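
    A minimal sketch of the approach described above, assuming a single uncertain parameter, a Gaussian-process surrogate built from a handful of code runs, and random-walk Metropolis MCMC; the "expensive_code" stand-in and all numbers are hypothetical, not from the study:

      import numpy as np

      rng = np.random.default_rng(0)

      def expensive_code(theta):          # stand-in for the plant-scale code
          return np.sin(theta) + 0.1 * theta

      # Gaussian-process surrogate built from a handful of code runs.
      X = np.linspace(0.0, 5.0, 8)
      Y = expensive_code(X)
      def kernel(a, b, ell=1.0):
          return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)
      K = kernel(X, X) + 1e-6 * np.eye(len(X))
      alpha = np.linalg.solve(K, Y)

      def surrogate(theta):               # GP mean and variance at theta
          k = kernel(np.atleast_1d(theta), X)
          mean = (k @ alpha)[0]
          var = 1.0 - (k @ np.linalg.solve(K, k.T))[0, 0]
          return mean, max(var, 0.0)

      # Random-walk Metropolis sampling of p(theta | y_obs); the surrogate's
      # own variance is folded into the likelihood, so emulator uncertainty
      # honestly widens the posterior.
      y_obs, sigma_obs = 1.0, 0.05
      def log_post(theta):
          if not 0.0 <= theta <= 5.0:     # uniform prior on [0, 5]
              return -np.inf
          m, v = surrogate(theta)
          return -0.5 * (y_obs - m) ** 2 / (sigma_obs**2 + v)

      chain, theta, lp = [], 2.5, log_post(2.5)
      for _ in range(5000):
          prop = theta + 0.3 * rng.standard_normal()
          lp_prop = log_post(prop)
          if np.log(rng.uniform()) < lp_prop - lp:
              theta, lp = prop, lp_prop
          chain.append(theta)
      print(f"posterior mean: {np.mean(chain[1000:]):.2f}")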

  6. An Integrated Program Structure and System of Account Codes for PPBS in Local School Districts.

    ERIC Educational Resources Information Center

    Miller, Donald R.

    This monograph presents a comprehensive but tentative matrix program structure and system of account codes that have been integrated to facilitate the implementation of PPB systems in local school districts. It is based on the results of an extensive analysis of K-12 public school district programs and management practices. In its entirety, the…

  7. A long-term, integrated impact assessment of alternative building energy code scenarios in China

    SciTech Connect

    Yu, Sha; Eom, Jiyong; Evans, Meredydd; Clarke, Leon E.

    2014-04-01

    China is the second largest building energy user in the world, ranking first and third in residential and commercial energy consumption. Beginning in the early 1980s, the Chinese government has developed a variety of building energy codes to improve building energy efficiency and reduce total energy demand. This paper studies the impact of building energy codes on energy use and CO2 emissions by using a detailed building energy model that represents four distinct climate zones, each with three building types, nested in the long-term integrated assessment framework GCAM. An advanced building stock module, coupled with the building energy model, is developed to reflect the characteristics of future building stock and its interaction with the development of building energy codes in China. This paper also evaluates the impacts of building codes on building energy demand in the presence of economy-wide carbon policy. We find that building energy codes would reduce Chinese building energy use by 13%-22%, depending on the building code scenario, with a similar effect preserved even under the carbon policy. The impact of building energy codes shows regional and sectoral variation due to regionally differentiated responses of heating and cooling services to shell efficiency improvement.

  8. Users Guide to SAMINT: A Code for Nuclear Data Adjustment with SAMMY Based on Integral Experiments

    SciTech Connect

    Sobes, Vladimir; Leal, Luiz C.; Arbanas, Goran

    2014-10-01

    The purpose of this project is to couple differential and integral data evaluation in a continuous-energy framework. More specifically, the goal is to use the Generalized Linear Least Squares methodology employed in TSURFER to update the parameters of a resolved resonance region evaluation directly. Recognizing that the GLLS methodology in TSURFER is identical to the mathematical description of the simple Bayesian updating carried out in SAMMY, the computer code SAMINT was created to help use the mathematical machinery of SAMMY to update resolved resonance parameters based on integral data. Minimal modifications of SAMMY are required when used with SAMINT to make resonance parameter updates based on integral experimental data.
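
    For orientation, a minimal numpy sketch of the GLLS update described above; all matrices are illustrative stand-ins, not values from SAMMY, SAMINT, or TSURFER:

      import numpy as np

      p0 = np.array([1.00, 2.50])          # prior (resonance) parameters
      M = np.diag([0.05**2, 0.10**2])      # prior parameter covariance
      S = np.array([[0.8, 0.3],            # sensitivities of the integral
                    [0.1, 0.9]])           # responses to the parameters
      V = np.diag([0.02**2, 0.03**2])      # integral-experiment covariance
      d = np.array([0.04, -0.06])          # measured minus calculated responses

      # GLLS update: p1 = p0 + M S^T (S M S^T + V)^-1 d, with the matching
      # posterior covariance M1 = M - M S^T (S M S^T + V)^-1 S M.
      W = np.linalg.inv(S @ M @ S.T + V)
      p1 = p0 + M @ S.T @ W @ d
      M1 = M - M @ S.T @ W @ S @ M
      print("updated parameters:", p1)
      print("posterior standard deviations:", np.sqrt(np.diag(M1)))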

  9. Semantic Integration and Age of Acquisition Effects in Code-Blend Comprehension

    PubMed Central

    Emmorey, Karen

    2016-01-01

    Semantic and lexical decision tasks were used to investigate the mechanisms underlying code-blend facilitation: the finding that hearing bimodal bilinguals comprehend signs in American Sign Language (ASL) and spoken English words more quickly when they are presented together simultaneously than when each is presented alone. More robust facilitation effects were observed for semantic decision than for lexical decision, suggesting that lexical integration of signs and words within a code-blend occurs primarily at the semantic level, rather than at the level of form. Early bilinguals exhibited greater facilitation effects than late bilinguals for English (the dominant language) in the semantic decision task, possibly because early bilinguals are better able to process early visual cues from ASL signs and use these to constrain English word recognition. Comprehension facilitation via semantic integration of words and signs is consistent with co-speech gesture research demonstrating facilitative effects of gesture integration on language comprehension. PMID:26657077

  10. An Integration of the Restructured Melcor for the Midas Computer Code

    SciTech Connect

    Sunhee Park; Dong Ha Kim; Ko-Ryu Kim; Song-Won Cho

    2006-07-01

    The developmental need for a localized severe accident analysis code is on the rise. KAERI is developing a severe accident code called MIDAS, which is based on MELCOR. In order to develop the localized code (MIDAS), which simulates severe accidents in a nuclear power plant, the existing data structure is reconstructed for all the packages in MELCOR, which uses pointer variables for data transfer between the packages. During this process, new features in FORTRAN90 such as dynamic allocation are used for an improved data saving and transfer method. Hence the readability, maintainability and portability of the MIDAS code have been enhanced. After the package-wise restructuring, the newly converted packages are integrated together. Depending on the data usage in a package, two types of packages can be defined: some use their own data within the package (let's call them independent packages) and the others share their data with other packages (dependent packages). For the independent packages, the integration process simply links the already converted packages together. That is, the package-wise restructuring does not require further conversion of variables for the integration process. For the dependent packages, extra conversion is necessary to link them together. As the package-wise restructuring converts only the corresponding package's variables, variables defined in other packages are not touched and remain as they are. These variables must be converted into the new types at the same time as the main variables in the corresponding package. Then these dependent packages are ready for integration. In order to check whether the integration process is working well, the results from the integrated version are verified against the package-wise restructured results. Steady state runs and station blackout sequences are tested and the major variables are found to be the same. In order to verify the results, the integrated

  11. Integration of a supersonic unsteady aerodynamic code into the NASA FASTEX system

    NASA Technical Reports Server (NTRS)

    Appa, Kari; Smith, Michael J. C.

    1987-01-01

    A supersonic unsteady aerodynamic loads prediction method based on the constant pressure method was integrated into the NASA FASTEX system. The updated FASTEX code can be employed for aeroelastic analyses in subsonic and supersonic flow regimes. A brief description of the supersonic constant pressure panel method, as applied to lifting surfaces and body configurations, is followed by a documentation of updates required to incorporate this method in the FASTEX code. Test cases showing correlations of predicted pressure distributions, flutter solutions, and stability derivatives with available data are reported.

  12. TOPICAL REVIEW: The CRONOS suite of codes for integrated tokamak modelling

    NASA Astrophysics Data System (ADS)

    Artaud, J. F.; Basiuk, V.; Imbeaux, F.; Schneider, M.; Garcia, J.; Giruzzi, G.; Huynh, P.; Aniel, T.; Albajar, F.; Ané, J. M.; Bécoulet, A.; Bourdelle, C.; Casati, A.; Colas, L.; Decker, J.; Dumont, R.; Eriksson, L. G.; Garbet, X.; Guirlet, R.; Hertout, P.; Hoang, G. T.; Houlberg, W.; Huysmans, G.; Joffrin, E.; Kim, S. H.; Köchl, F.; Lister, J.; Litaudon, X.; Maget, P.; Masset, R.; Pégourié, B.; Peysson, Y.; Thomas, P.; Tsitrone, E.; Turco, F.

    2010-04-01

    CRONOS is a suite of numerical codes for the predictive/interpretative simulation of a full tokamak discharge. It integrates, in a modular structure, a 1D transport solver with general 2D magnetic equilibria, several heat, particle and impurities transport models, as well as heat, particle and momentum sources. This paper gives a first comprehensive description of the CRONOS suite: overall structure of the code, main available models, details on the simulation workflow and numerical implementation. Some examples of applications to the analysis of experimental discharges and the predictions of ITER scenarios are also given.

  13. Honor Codes and Other Contextual Influences on Academic Integrity: A Replication and Extension to Modified Honor Code Settings.

    ERIC Educational Resources Information Center

    McCabe, Donald L.; Trevino, Linda Klebe; Butterfield, Kenneth D.

    2002-01-01

    Investigated the influence of modified honor codes, an alternative to traditional codes that is gaining popularity on larger campuses. Also tested the model of student academic dishonesty previously suggested by McCabe and Trevino. Found that modified honor codes are associated with lower levels of student dishonesty and that the McCabe Trevino…

  14. Multiphase integral reacting flow computer code (ICOMFLO): User's guide

    SciTech Connect

    Chang, S.L.; Lottes, S.A.; Petrick, M.

    1997-11-01

    A copyrighted computational fluid dynamics computer code, ICOMFLO, has been developed for the simulation of multiphase reacting flows. The code solves conservation equations for gaseous species and droplets (or solid particles) of various sizes. General conservation laws, expressed by elliptic type partial differential equations, are used in conjunction with rate equations governing the mass, momentum, enthalpy, species, turbulent kinetic energy, and turbulent dissipation. Associated phenomenological submodels of the code include integral combustion, two-parameter turbulence, particle evaporation, and interfacial submodels. A newly developed integral combustion submodel replacing an Arrhenius-type differential reaction submodel has been implemented to improve numerical convergence and enhance numerical stability. A two-parameter turbulence submodel is modified for both gas and solid phases. An evaporation submodel treats not only droplet evaporation but also size dispersion. Interfacial submodels use correlations to model interfacial momentum and energy transfer. The ICOMFLO code solves the governing equations in three steps. First, a staggered grid system is constructed in the flow domain. The staggered grid system defines gas velocity components on the surfaces of a control volume, while the other flow properties are defined at the volume center. A blocked cell technique is used to handle complex geometry. Then, the partial differential equations are integrated over each control volume and transformed into discrete difference equations. Finally, the difference equations are solved iteratively by using a modified SIMPLER algorithm. The results of the solution include gas flow properties (pressure, temperature, density, species concentration, velocity, and turbulence parameters) and particle flow properties (number density, temperature, velocity, and void fraction). The code has been used in many engineering applications, such as coal-fired combustors, air

  15. The Integral PWR SIR Transients: Comparisons Between CATHARE and RELAP Codes

    SciTech Connect

    Pignatel, Jean-Francois

    2002-07-01

    Within the framework of the research program on innovative light water reactors, the SERI (Service of Studies on Innovative Reactors) of the French Atomic Energy Commission (CEA) presents a predictive study on the modeling of a low-power integral Pressurized Water Reactor, using the CATHARE thermalhydraulic code. The concept selected for this study is that of the SIR reactor project, developed by the AEA-T and ABB consortium. This very interesting concept is no doubt the most complete to date, and the one on which the most information can be obtained in the literature. Many safety calculations made with the RELAP code are also available and represent a highly interesting basis for comparison, in order to better assess the results obtained with CATHARE. A comparison of the behavior of the two codes is thus presented in this article. This study therefore shows that CATHARE finely models this type of new PWR concept. The transients studied cover a wide range, from natural circulation to loss of primary coolant accidents. The ATWS and a power transient have also been calculated. The comparison made between the CATHARE and RELAP results shows a very good agreement between the two codes, and leads to a very positive conclusion on the pertinence of simulating an integral PWR. Moreover, even though this study is a thorough investigation on the subject, it confirms the potentially safe nature of the SIR reactor. (author)

  16. Integrated Fuel-Coolant Interaction (IFCI 6.0) code. User's manual

    SciTech Connect

    Davis, F.J.; Young, M.F.

    1994-04-01

    The Integrated Fuel-Coolant Interaction (IFCI) computer code is being developed at Sandia National Laboratories to investigate the fuel-coolant interaction (FCI) problem at large scale using a two-dimensional, four-field hydrodynamic framework and physically based models. IFCI will be capable of treating all major FCI processes in an integrated manner. This document is a product of the effort to generate a stand-alone version of IFCI, IFCI 6.0. The User's Manual describes in detail the hydrodynamic method and physical models used in IFCI 6.0. Appendix A is an input manual, provided for the creation of working decks.

  17. Integrated Fuel-Coolant Interaction (IFCI 7.0) Code User's Manual

    SciTech Connect

    Young, Michael F.

    1999-05-01

    The integrated fuel-coolant interaction (IFCI) computer code is being developed at Sandia National Laboratories to investigate the fuel-coolant interaction (FCI) problem at large scale using a two-dimensional, three-field hydrodynamic framework and physically based models. IFCI will be capable of treating all major FCI processes in an integrated manner. This document is a description of IFCI 7.0. The user's manual describes the hydrodynamic method and physical models used in IFCI 7.0. Appendix A is an input manual provided for the creation of working decks.

  18. Explicit time-reversible orbit integration in Particle In Cell codes with static homogeneous magnetic field

    NASA Astrophysics Data System (ADS)

    Patacchini, L.; Hutchinson, I. H.

    2009-04-01

    A new explicit time-reversible orbit integrator for the equations of motion in a static homogeneous magnetic field, called the Cyclotronic integrator, is presented. Like Spreiter and Walter's Taylor expansion algorithm, for sufficiently weak electric field gradients this second-order method does not require a fine resolution of the Larmor motion; it has however the essential advantage of being symplectic, hence time-reversible. The Cyclotronic integrator is only subject to a linear stability constraint (ΩΔt < π, Ω being the Larmor angular frequency), and is therefore particularly suitable for electrostatic Particle In Cell codes with uniform magnetic field, where Ω is larger than any other characteristic frequency yet a resolution of the particles' gyromotion is required. Application examples and a detailed comparison with the well-known (time-reversible) Boris algorithm are presented; it is in particular shown that implementation of the Cyclotronic integrator in the kinetic codes SCEPTIC and Democritus can reduce the cost of orbit integration by up to a factor of ten.
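
    A minimal sketch of a cyclotronic-style drift-kick-drift step under the stated assumptions (static homogeneous B along z, electrostatic field, illustrative units); this follows the splitting idea named in the abstract, not the SCEPTIC or Democritus implementations:

      import numpy as np

      q_m = 1.0                              # charge-to-mass ratio
      Omega = q_m * 1.0                      # Larmor angular frequency, |B| = 1

      def gyro_drift(x, v, dt):
          """Exact free gyration about the z axis for time dt: the position
          follows the analytic helix, the velocity rotates rigidly."""
          s, c = np.sin(Omega * dt), np.cos(Omega * dt)
          vx, vy, vz = v
          x = x + np.array([(vx * s + vy * (1 - c)) / Omega,
                            (-vx * (1 - c) + vy * s) / Omega,
                            vz * dt])
          v = np.array([c * vx + s * vy, -s * vx + c * vy, vz])
          return x, v

      def cyclotronic_step(x, v, dt, E):
          x, v = gyro_drift(x, v, 0.5 * dt)  # half drift on the exact helix
          v = v + q_m * E(x) * dt            # impulsive electric kick
          return gyro_drift(x, v, 0.5 * dt)  # second half drift

      # Usage: a weak uniform E field; steps need not resolve the gyration
      # finely, only satisfy Omega*dt < pi (here Omega*dt = 2.0).
      E = lambda x: np.array([1e-3, 0.0, 0.0])
      x, v = np.zeros(3), np.array([1.0, 0.0, 0.1])
      for _ in range(1000):
          x, v = cyclotronic_step(x, v, 2.0, E)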

  19. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    SciTech Connect

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in the existing THC codes although there is no single code able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  20. Applicability of GALE-86 Codes to Integral Pressurized Water Reactor designs

    SciTech Connect

    Geelhood, Kenneth J.; Rishel, Jeremy P.

    2012-06-01

    This report describes work that Pacific Northwest National Laboratory is doing to assist the U.S. Nuclear Regulatory Commission (NRC) Office of New Reactors (NRO) staff in their reviews of applications for nuclear power plants using new reactor core designs. These designs include small integral PWRs (IRIS, mPower, and NuScale reactor designs), HTGRs, (pebble-bed and prismatic-block modular reactor designs) and SFRs (4S and PRISM reactor designs). Under this specific task, PNNL will assist the NRC staff in reviewing the current versions of the GALE codes and identify features and limitations that would need to be modified to accommodate the technical review of iPWR and mPower® license applications and recommend specific changes to the code, NUREG-0017, and associated NRC guidance. This contract is necessary to support the licensing of iPWRs with a near-term focus on the B&W mPower® reactor design. While the focus of this review is on the mPower® reactor design, the review of the code and the scope of recommended changes consider a revision of the GALE codes that would make them universally applicable for other types of integral PWR designs. The results of a detailed comparison between PWR and iPWR designs are reported here. Also included is an investigation of the GALE code and its basis and a determination as to the applicability of each of the bases to an iPWR design. The issues investigated come from a list provided by NRC staff, the results of comparing the PWR and iPWR designs, the parameters identified as having a large impact on the code outputs from a recent sensitivity study and the main bases identified in NUREG-0017. This report will provide a summary of the gaps in the GALE codes as they relate to iPWR designs and for each gap will propose what work could be performed to fill that gap and create a version of GALE that is applicable to integral PWR designs.

  1. Integrated TIGER Series of Coupled Electron/Photon Monte Carlo Transport Codes System.

    SciTech Connect

    VALDEZ, GREG D.

    2012-11-30

    Version: 00. Distribution is restricted to US Government Agencies and Their Contractors Only. The Integrated Tiger Series (ITS) is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. The goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes, and (2) conversion to Fortran 95. The general user friendliness of the software has been enhanced through memory allocation to reduce the need for users to modify and recompile the code.

  2. Leap frog integrator modifications in highly collisional particle-in-cell codes

    NASA Astrophysics Data System (ADS)

    Hanzlikova, N.; Turner, M. M.

    2014-07-01

    The leap frog integration method is a standard, simple, fast, and accurate way to implement the velocity and position integration in particle-in-cell codes. Due to the direct solution of the kinetics of particles in phase space, central to the particle-in-cell procedure, important information can be obtained on particle velocity distributions, and consequently on transport and heating processes. This approach is commonly associated with physical situations where collisional effects are weak, but it can also be profitably applied in some highly collisional cases, such as occur in semiconductor devices and gaseous discharges at atmospheric pressure. In this paper, we show that the implementation of the leap frog integration method in these circumstances can violate some of the assumptions central to the accuracy of this scheme. Indeed, without adaptation, the method gives incorrect results. We show here how the method must be modified to deal correctly with highly collisional cases.
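
    For orientation, the unmodified (collisionless) leap frog baseline that the abstract starts from, in a minimal sketch; a harmonic-oscillator force stands in for the PIC field solve, and the collisional modifications the paper proposes are not shown:

      import numpy as np

      # Standard leap frog: velocities live at half-integer steps, positions
      # at integer steps; this staggering is what makes the scheme
      # time-reversible and second-order accurate.
      def leapfrog(x, v_half, dt, accel, n_steps):
          """x at time t, v_half at t - dt/2; accel(x) gives the acceleration."""
          for _ in range(n_steps):
              v_half = v_half + accel(x) * dt   # kick: v(t-dt/2) -> v(t+dt/2)
              x = x + v_half * dt               # drift: x(t) -> x(t+dt)
          return x, v_half

      # Usage: a harmonic oscillator; energy stays bounded over long runs.
      x, v = leapfrog(np.array([1.0]), np.array([0.0]), 0.05,
                      lambda x: -x, 10_000)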

  3. CCG: an integrative resource of cancer protein-coding genes and long noncoding RNAs.

    PubMed

    Liu, Mengrong; Yang, Yu-Cheng T; Xu, Gang; Tan, Chang; Lu, Zhi John

    2016-12-01

    The identification of cancer genes remains a main aim of cancer research. With the advances of high-throughput sequencing technologies, thousands of novel cancer genes were identified through recurrent mutation analyses and differential expression analyses between normal tissues and tumors in large populations. Many databases have been developed to document the cancer genes. However, no public database providing both cancer protein-coding genes and cancer lncRNAs is available presently. Here, we present the Catalogue of Cancer Genes (CCG) database (http://ccg.xingene.net), a catalogue of cancer genes. It includes both well-supported and candidate cancer protein-coding genes and cancer lncRNAs collected from literature search and public databases. In addition, uniform genomic aberration information (such as somatic mutation and copy number variation) and drug-gene interactions were assigned to cancer genes in the database. CCG represents an effort at integrative assembly of well-supported and candidate cancer protein-coding and long noncoding RNA genes, and takes advantage of high-throughput sequencing results on large populations. With the help of CCG, users can easily access a comprehensive list of cancer genes as well as the genomic aberrations related to these genes. The availability of integrative information will facilitate the understanding of cancer mechanisms. In addition, drug-gene information in CCG provides a useful guide to the development of new anti-cancer drugs and selection of rational combination therapies.

  4. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    SciTech Connect

    Schultz, Peter Andrew

    2011-12-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and the constitutive models they inform or generate. This report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  5. Simulation of Supersonic Jet Noise with the Adaptation of Overflow CFD Code and Kirchhoff Surface Integral

    NASA Technical Reports Server (NTRS)

    Kandula, Max; Caimi, Raoul; Steinrock, T. (Technical Monitor)

    2001-01-01

    An acoustic prediction capability for supersonic axisymmetric jets was developed on the basis of OVERFLOW Navier-Stokes CFD (Computational Fluid Dynamics) code of NASA Langley Research Center. Reynolds-averaged turbulent stresses in the flow field are modeled with the aid of Spalart-Allmaras one-equation turbulence model. Appropriate acoustic and outflow boundary conditions were implemented to compute time-dependent acoustic pressure in the nonlinear source-field. Based on the specification of acoustic pressure, its temporal and normal derivatives on the Kirchhoff surface, the near-field and the far-field sound pressure levels are computed via Kirchhoff surface integral, with the Kirchhoff surface chosen to enclose the nonlinear sound source region described by the CFD code. The methods are validated by a comparison of the predictions of sound pressure levels with the available data for an axisymmetric turbulent supersonic (Mach 2) perfectly expanded jet.

  6. Simulation of Jet Noise with OVERFLOW CFD Code and Kirchhoff Surface Integral

    NASA Technical Reports Server (NTRS)

    Kandula, M.; Caimi, R.; Voska, N. (Technical Monitor)

    2002-01-01

    An acoustic prediction capability for supersonic axisymmetric jets was developed on the basis of OVERFLOW Navier-Stokes CFD (Computational Fluid Dynamics) code of NASA Langley Research Center. Reynolds-averaged turbulent stresses in the flow field are modeled with the aid of Spalart-Allmaras one-equation turbulence model. Appropriate acoustic and outflow boundary conditions were implemented to compute time-dependent acoustic pressure in the nonlinear source-field. Based on the specification of acoustic pressure, its temporal and normal derivatives on the Kirchhoff surface, the near-field and the far-field sound pressure levels are computed via Kirchhoff surface integral, with the Kirchhoff surface chosen to enclose the nonlinear sound source region described by the CFD code. The methods are validated by a comparison of the predictions of sound pressure levels with the available data for an axisymmetric turbulent supersonic (Mach 2) perfectly expanded jet.
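
    For reference, the classical Kirchhoff surface integral referred to in the two abstracts above can be written in its standard retarded-time form (stated generically here, not quoted from the papers):

      p(\mathbf{x}, t) = \frac{1}{4\pi} \oint_S \left[ \frac{p}{r^2}\frac{\partial r}{\partial n} - \frac{1}{r}\frac{\partial p}{\partial n} + \frac{1}{c\,r}\frac{\partial r}{\partial n}\frac{\partial p}{\partial \tau} \right]_{\tau = t - r/c} \mathrm{d}S

    where r is the distance from a surface element to the observer, c is the sound speed, and the bracket is evaluated at the retarded time τ = t − r/c; the integrand involves exactly the quantities named above, the acoustic pressure and its temporal and normal derivatives on the surface S enclosing the nonlinear source region.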

  7. Comparison of different methods used in integral codes to model coagulation of aerosols

    NASA Astrophysics Data System (ADS)

    Beketov, A. I.; Sorokin, A. A.; Alipchenkov, V. M.; Mosunova, N. A.

    2013-09-01

    The methods for calculating coagulation of particles in the carrying phase that are used in the integral codes SOCRAT, ASTEC, and MELCOR, as well as the Hounslow and Jacobson methods used to model aerosol processes in the chemical industry and in atmospheric investigations are compared on test problems and against experimental results in terms of their effectiveness and accuracy. It is shown that all methods are characterized by a significant error in modeling the distribution function for micrometer particles if calculations are performed using rather "coarse" spectra of particle sizes, namely, when the ratio of the volumes of particles from neighboring fractions is equal to or greater than two. With reference to the problems considered, the Hounslow method and the method applied in the aerosol module used in the ASTEC code are the most efficient ones for carrying out calculations.
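
    For orientation, the methods compared here are different sectional discretizations of the discrete Smoluchowski coagulation equation, which in its standard form reads

      \frac{\mathrm{d}n_k}{\mathrm{d}t} = \frac{1}{2} \sum_{i+j=k} K_{ij}\, n_i n_j \; - \; n_k \sum_{i \ge 1} K_{ik}\, n_i

    where n_k is the number density of particles in size fraction k and K_ij is the coagulation kernel; the accuracy issue noted above concerns how these sums are approximated when neighboring size fractions differ in volume by a factor of two or more.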

  8. IPACS (Integrated Probabilistic Assessment of Composite Structures): Code development and applications

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Shiao, Michael C.

    1993-01-01

    A methodology and attendant computer code have been developed and are described to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, stress concentration factors, displacements, stress/strain etc., which are the consequences of the inherent uncertainties (scatter) in the primitive (independent random) variables (constituent, ply, laminate and structural) that describe the composite structures. The computer code, IPACS (Integrated Probabilistic Assessment of Composite Structures), can handle both composite mechanics and composite structures. Application to probabilistic composite mechanics is illustrated by its uses to evaluate the uncertainties in the major Poisson's ratio and in laminate stiffness and strength. IPACS application to probabilistic structural analysis is illustrated by its use to evaluate the uncertainties in the buckling of a composite plate, in the stress concentration factor in a composite panel and in the vertical displacement and ply stress in a composite aircraft wing segment.
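
    For orientation, a minimal sketch of the probabilistic simulation idea, not the IPACS code itself: scatter in the primitive variables is propagated through a response model by Monte Carlo sampling. The rule-of-mixtures ply modulus below is a deliberately simple stand-in for the composite mechanics models in IPACS, and all distributions are hypothetical.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 20_000
      E_fiber = rng.normal(230e9, 10e9, n)    # fiber modulus scatter (Pa)
      E_matrix = rng.normal(3.5e9, 0.3e9, n)  # matrix modulus scatter (Pa)
      v_f = rng.normal(0.60, 0.02, n)         # fiber volume fraction scatter

      # Rule-of-mixtures ply modulus as a simple response function.
      E_ply = v_f * E_fiber + (1 - v_f) * E_matrix
      print(f"mean ply modulus: {E_ply.mean() / 1e9:.1f} GPa, "
            f"coefficient of variation: {E_ply.std() / E_ply.mean():.3f}")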

  9. Extension of the Integrated Tiger Series (ITS) of electron-photon Monte Carlo codes to 100 GeV

    SciTech Connect

    Miller, S.G.

    1988-08-01

    Version 2.1 of the Integrated Tiger Series (ITS) of electron-photon Monte Carlo codes was modified to extend their ability to model interactions up to 100 GeV. Benchmarks against experimental results conducted at 10 and 15 GeV confirm the accuracy of the extended codes. 12 refs., 2 figs., 2 tabs.

  10. Integrated Tiger Series of electron/photon Monte Carlo transport codes: a user's guide for use on IBM mainframes

    SciTech Connect

    Kirk, B.L.

    1985-12-01

    The ITS (Integrated Tiger Series) Monte Carlo code package developed at Sandia National Laboratories and distributed as CCC-467/ITS by the Radiation Shielding Information Center (RSIC) at Oak Ridge National Laboratory (ORNL) consists of eight codes - the standard codes, TIGER, CYLTRAN, ACCEPT; the P-codes, TIGERP, CYLTRANP, ACCEPTP; and the M-codes ACCEPTM, CYLTRANM. The codes have been adapted to run on the IBM 3081, VAX 11/780, CDC-7600, and Cray 1 with the use of the update emulator UPEML. This manual should serve as a guide to a user running the codes on IBM computers having 370 architecture. The cases listed were tested on the IBM 3033, under the MVS operating system using the VS Fortran Level 1.3.1 compiler.

  11. A first-order integral method developed for the VARIANT code

    SciTech Connect

    Smith, M. A.; Lewis, E. E.; Palmiotti, G.; Yang, W. S.

    2006-07-01

    A first-order nodal integral method using spherical harmonic interface conditions is formulated and implemented in VARIANT [1-3], a variational nodal transport code developed at Argonne National Laboratory. The spatial domain is split into hybrid finite elements, called nodes, where orthogonal polynomial spatial trial functions are used within each node and spatial Lagrange multipliers are used along the node boundaries. The internal angular approximation is weighted with a complete odd-order spherical harmonics set and numerically integrated using a standard angular quadrature. Along the nodal boundaries, even-order Rumyantsev interface conditions are combined with the spatial Lagrange multipliers to couple the nodes together. The new method is implemented in Cartesian geometry and used to solve a fixed-source two-dimensional benchmark problem. (authors)

  12. Fitting Prompt Fission Neutron Spectra Using Kalman Filter Integrated with Empire Code

    NASA Astrophysics Data System (ADS)

    Nobre, G. P. A.; Herman, M.; Hoblit, S.; Palumbo, A.; Capote, R.; Trkov, A.

    2014-04-01

    Prompt fission neutron spectra (PFNS) have proven to have a significant effect on the criticality of selected benchmarks, in some cases as important as cross sections. A precise determination of the uncertainties in PFNS is therefore desired. Existing PFNS evaluations in nuclear data libraries have so far relied almost exclusively on the Los Alamos model. However, deviations of evaluated data from available experiments have been noticed at both low and high neutron emission energies. New experimental measurements of PFNS have recently been published, thus demanding new evaluations. The present work describes the effort of integrating the Kalman and EMPIRE codes so as to allow parameter fitting of PFNS models. First results are shown for the major actinides for two different PFNS models (Kornilov and Los Alamos). This represents the first step towards re-evaluation of both cross-section and fission spectra data considering both microscopic and integral experimental data for the major actinides.
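
    The core of such a Kalman-filter parameter fit is a linearized Bayesian (generalized-least-squares) update; a minimal sketch assuming precomputed model sensitivities (the actual EMPIRE-Kalman coupling is considerably more elaborate):

        import numpy as np

        def kalman_update(p0, P0, S, y_model, y_exp, V):
            # p0, P0 : prior model parameters and their covariance
            # S      : sensitivity matrix d(y_model)/d(p), shape (n_data, n_par)
            # y_model: model prediction at p0; y_exp, V: data and covariance
            G = P0 @ S.T @ np.linalg.inv(S @ P0 @ S.T + V)   # Kalman gain
            p1 = p0 + G @ (y_exp - y_model)                  # updated parameters
            P1 = P0 - G @ S @ P0                             # reduced covariance
            return p1, P1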

  13. An integrated, structure- and energy-based view of the genetic code

    PubMed Central

    Grosjean, Henri; Westhof, Eric

    2016-01-01

    The principles of mRNA decoding are conserved among all extant life forms. We present an integrative view of all the interaction networks between mRNA, tRNA and rRNA: the intrinsic stability of the codon–anticodon duplex, the conformation of the anticodon hairpin, the presence of modified nucleotides, the occurrence of non-Watson–Crick pairs in the codon–anticodon helix, and the interactions with bases of rRNA at the A-site decoding site. We derive a more information-rich, alternative representation of the genetic code that is circular, with an unsymmetrical distribution of codons leading to a clear segregation between GC-rich 4-codon boxes and AU-rich 2:2-codon and 3:1-codon boxes. All tRNA sequence variations can be visualized, within an internal structural and energy framework, for each organism and each anticodon of the sense codons. The multiplicity and complexity of nucleotide modifications at positions 34 and 37 of the anticodon loop segregate meaningfully, and correlate well with the necessity to stabilize AU-rich codon–anticodon pairs and to avoid miscoding in split codon boxes. The evolution and expansion of the genetic code are viewed as being originally based on GC content, with progressive introduction of A/U together with tRNA modifications. The representation we present should help the engineering of the genetic code to include non-natural amino acids. PMID:27448410

  14. An integral equation based computer code for high-gain free-electron lasers

    SciTech Connect

    Dejus, R.J.; Shevchenko, O.A.; Vinokurov, N.A.

    1998-09-01

    A computer code for gain optimization of high-gain free-electron lasers (FELs) is described. The electron motion is computed along precalculated period-averaged trajectories, and the finite-emittance electron beam is represented by a set of thin partial beams. The radiation field amplitudes are calculated at these thin beams only. The system of linear integral equations for these field amplitudes and the Fourier harmonics of the current of each thin beam is solved numerically. The code is aimed at design optimization of high-gain short-wavelength FELs with nonideal magnetic systems (breaks between undulators with quadrupoles and magnetic bunchers; field and steering errors). Both self-amplified spontaneous emission (SASE) and external input signal options can be treated. A typical run for a UV FEL, several gain lengths long, takes only one minute on a Pentium II personal computer (333 MHz), which makes it possible to run the code in optimization loops. Results for the Advanced Photon Source FEL project are presented.
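
    The numerical core described above is the solution of a system of linear integral equations; a minimal sketch of the standard Nystrom (quadrature-collocation) approach for a Fredholm equation of the second kind, with an illustrative kernel rather than the FEL field equations themselves:

        import numpy as np

        def nystrom_solve(K, f, a, b, lam, n=200):
            # solve u(x) = f(x) + lam * int_a^b K(x, t) u(t) dt by collocating
            # on a trapezoidal grid and solving the resulting linear system
            x = np.linspace(a, b, n)
            w = np.full(n, (b - a) / (n - 1))
            w[0] *= 0.5
            w[-1] *= 0.5
            A = np.eye(n) - lam * K(x[:, None], x[None, :]) * w[None, :]
            return x, np.linalg.solve(A, f(x))

        # example: u(x) = x + 0.5 * int_0^1 x*t*u(t) dt has exact solution 1.2*x
        x, u = nystrom_solve(lambda x, t: x * t, lambda x: x, 0.0, 1.0, lam=0.5)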

  15. Embedded Systems Hardware Integration and Code Development for Maraia Capsule and E-MIST

    NASA Technical Reports Server (NTRS)

    Carretero, Emmanuel S.

    2015-01-01

    The cost of sending large spacecraft to orbit makes them undesirable for carrying out smaller scientific missions. Small spacecraft are more economical and can be tailored for missions where specific tasks need to be carried out; the Maraia capsule is such a spacecraft. Maraia will allow samples of experiments conducted on the International Space Station to be returned to Earth. The use of balloons to conduct experiments at the edge of space is a practical approach to reducing the large expense of using rockets. E-MIST is a payload designed to fly on a high-altitude balloon. It can maintain science experiments in a controlled manner at the edge of space. The work covered here entails the integration of hardware onto each of the mentioned systems and the code associated with such work. In particular, the resistance temperature detectors, pressure transducers, cameras, and thrusters for Maraia are discussed. The integration of the resistance temperature detectors and motor controllers into E-MIST is described. Several issues associated with sensor accuracy, code lock-up, and in-flight resets are mentioned. The solutions and proposed solutions to these issues are explained.

  16. Integrating environmental goals into urban redevelopment schemes: lessons from the Code River, Yogyakarta, Indonesia.

    PubMed

    Setiawan, B B

    2002-01-01

    The settlement along the bank of the Code River in Yogyakarta, Indonesia provides housing for a large mass of the city's poor. Its strategic location, and the fact that most of the urban poor do not have access to land, attract people to "illegally" settle along the bank of the river. This brings negative consequences for the environment, particularly increasing domestic waste along the river and annual flooding in the rainy season. While the public controversies regarding the existence of the settlement along the Code River were still unresolved, at the end of the 1980s a group of architects, academics and community members proposed the idea of constructing a dike along the river as part of a broader settlement improvement program. From 1991 to 1998, thousands of local people mobilized their resources and constructed 6,000 metres of riverside dike along the Code River. The construction of the dike has become an important "stimulant" that generated not only settlement improvement but also better treatment of the river water. As all housing units located along the river now face it, the river is considered the "front yard". Before the dike was constructed, the inhabitants treated the river as the "backyard" and would simply throw waste into it. They now genuinely want a cleaner river, since it is an important part of their settlement. The settlement along the Code River presents the complex range of persistent problems associated with informal settlements in Indonesia; such problems concern how to provide more affordable and adequate housing for the poor while at the same time improving the water quality of the river. The project represents a good case showing that, through a mutual partnership among stakeholders, it is possible to integrate environmental goals into urban redevelopment schemes.

  17. Research Integrity and Research Ethics in Professional Codes of Ethics: Survey of Terminology Used by Professional Organizations across Research Disciplines

    PubMed Central

    Komić, Dubravka; Marušić, Stjepan Ljudevit; Marušić, Ana

    2015-01-01

    Professional codes of ethics are social contracts among members of a professional group, which aim to instigate, encourage and nurture ethical behaviour and prevent professional misconduct, including in research and publication. Despite the existence of codes of ethics, research misconduct remains a serious problem. A survey of codes of ethics from 795 professional organizations from the Illinois Institute of Technology’s Codes of Ethics Collection showed that 182 of them (23%) used research integrity and research ethics terminology in their codes, with differences across disciplines: while the terminology was common in professional organizations in social sciences (82%), mental health (71%), and sciences (61%), other organizations had no statements (construction trades, fraternal social organizations, real estate) or only a few (management, media, engineering). A subsample of 158 professional organizations we judged to be directly involved in research significantly more often had statements on research integrity/ethics terminology than the whole sample: an average of 10.4% of organizations with a statement (95% CI = 10.4–23.5%) on any of the 27 research integrity/ethics terms compared to 3.3% (95% CI = 2.1–4.6%), respectively (P<0.001). Overall, 62% of all statements addressing research integrity/ethics concepts used prescriptive language in describing the standard of practice. Professional organizations should define research integrity and research ethics issues in their ethics codes and collaborate within and across disciplines to adequately address responsible conduct of research and meet contemporary needs of their communities. PMID:26192805

  18. Research Integrity and Research Ethics in Professional Codes of Ethics: Survey of Terminology Used by Professional Organizations across Research Disciplines.

    PubMed

    Komić, Dubravka; Marušić, Stjepan Ljudevit; Marušić, Ana

    2015-01-01

    Professional codes of ethics are social contracts among members of a professional group, which aim to instigate, encourage and nurture ethical behaviour and prevent professional misconduct, including in research and publication. Despite the existence of codes of ethics, research misconduct remains a serious problem. A survey of codes of ethics from 795 professional organizations from the Illinois Institute of Technology's Codes of Ethics Collection showed that 182 of them (23%) used research integrity and research ethics terminology in their codes, with differences across disciplines: while the terminology was common in professional organizations in social sciences (82%), mental health (71%), and sciences (61%), other organizations had no statements (construction trades, fraternal social organizations, real estate) or only a few (management, media, engineering). A subsample of 158 professional organizations we judged to be directly involved in research significantly more often had statements on research integrity/ethics terminology than the whole sample: an average of 10.4% of organizations with a statement (95% CI = 10.4-23.5%) on any of the 27 research integrity/ethics terms compared to 3.3% (95% CI = 2.1-4.6%), respectively (P<0.001). Overall, 62% of all statements addressing research integrity/ethics concepts used prescriptive language in describing the standard of practice. Professional organizations should define research integrity and research ethics issues in their ethics codes and collaborate within and across disciplines to adequately address responsible conduct of research and meet contemporary needs of their communities.

  19. Slow Temporal Integration Enables Robust Neural Coding and Perception of a Cue to Sound Source Location

    PubMed Central

    Tollin, Daniel J.

    2016-01-01

    In mammals, localization of sound sources in azimuth depends on sensitivity to interaural differences in sound timing (ITD) and level (ILD). Paradoxically, while typical ILD-sensitive neurons of the auditory brainstem require millisecond synchrony of excitatory and inhibitory inputs for the encoding of ILDs, human and animal behavioral ILD sensitivity is robust to temporal stimulus degradations (e.g., interaural decorrelation due to reverberation), or, in humans, bilateral clinical device processing. Here we demonstrate that behavioral ILD sensitivity is only modestly degraded with even complete decorrelation of left- and right-ear signals, suggesting the existence of a highly integrative ILD-coding mechanism. Correspondingly, we find that a majority of auditory midbrain neurons in the central nucleus of the inferior colliculus (of chinchilla) effectively encode ILDs despite complete decorrelation of left- and right-ear signals. We show that such responses can be accounted for by relatively long windows of bilateral excitatory-inhibitory interaction, which we explicitly measure using trains of narrowband clicks. Neural and behavioral data are compared with the outputs of a simple model of ILD processing with a single free parameter, the duration of excitatory-inhibitory interaction. Behavioral, neural, and modeling data collectively suggest that ILD sensitivity depends on binaural integration of excitation and inhibition within a ≳3 ms temporal window, significantly longer than observed in lower brainstem neurons. This relatively slow integration potentiates a unique role for the ILD system in spatial hearing that may be of particular importance when informative ITD cues are unavailable. SIGNIFICANCE STATEMENT In mammalian hearing, interaural differences in the timing (ITD) and level (ILD) of impinging sounds carry critical information about source location. However, natural sounds are often decorrelated between the ears by reverberation and background noise

  20. Slow Temporal Integration Enables Robust Neural Coding and Perception of a Cue to Sound Source Location.

    PubMed

    Brown, Andrew D; Tollin, Daniel J

    2016-09-21

    In mammals, localization of sound sources in azimuth depends on sensitivity to interaural differences in sound timing (ITD) and level (ILD). Paradoxically, while typical ILD-sensitive neurons of the auditory brainstem require millisecond synchrony of excitatory and inhibitory inputs for the encoding of ILDs, human and animal behavioral ILD sensitivity is robust to temporal stimulus degradations (e.g., interaural decorrelation due to reverberation), or, in humans, bilateral clinical device processing. Here we demonstrate that behavioral ILD sensitivity is only modestly degraded with even complete decorrelation of left- and right-ear signals, suggesting the existence of a highly integrative ILD-coding mechanism. Correspondingly, we find that a majority of auditory midbrain neurons in the central nucleus of the inferior colliculus (of chinchilla) effectively encode ILDs despite complete decorrelation of left- and right-ear signals. We show that such responses can be accounted for by relatively long windows of bilateral excitatory-inhibitory interaction, which we explicitly measure using trains of narrowband clicks. Neural and behavioral data are compared with the outputs of a simple model of ILD processing with a single free parameter, the duration of excitatory-inhibitory interaction. Behavioral, neural, and modeling data collectively suggest that ILD sensitivity depends on binaural integration of excitation and inhibition within a ≳3 ms temporal window, significantly longer than observed in lower brainstem neurons. This relatively slow integration potentiates a unique role for the ILD system in spatial hearing that may be of particular importance when informative ITD cues are unavailable. In mammalian hearing, interaural differences in the timing (ITD) and level (ILD) of impinging sounds carry critical information about source location. However, natural sounds are often decorrelated between the ears by reverberation and background noise, degrading the fidelity of

  1. Segregated and integrated coding of reward and punishment in the cingulate cortex.

    PubMed

    Fujiwara, Juri; Tobler, Philippe N; Taira, Masato; Iijima, Toshio; Tsutsui, Ken-Ichiro

    2009-06-01

    Reward and punishment have opposite affective value but are both processed by the cingulate cortex. However, it is unclear whether the positive and negative affective values of monetary reward and punishment are processed by separate or common subregions of the cingulate cortex. We performed a functional magnetic resonance imaging study using a free-choice task and compared cingulate activations for different levels of monetary gain and loss. Gain-specific activation (increasing activation for increasing gain, but no activation change in relation to loss) occurred mainly in the anterior part of the anterior cingulate and in the posterior cingulate cortex. Conversely, loss-specific activation (increasing activation for increasing loss, but no activation change in relation to gain) occurred between these areas, in the middle and posterior part of the anterior cingulate. Integrated coding of gain and loss (increasing activation throughout the full range, from biggest loss to biggest gain) occurred in the dorsal part of the anterior cingulate, at the border with the medial prefrontal cortex. Finally, unspecific activation increases to both gains and losses (increasing activation to increasing gains and increasing losses, possibly reflecting attention) occurred in dorsal and middle regions of the cingulate cortex. Together, these results suggest separate and common coding of monetary reward and punishment in distinct subregions of the cingulate cortex. Further meta-analysis suggested that the presently found reward- and punishment-specific areas overlapped with those processing positive and negative emotions, respectively.

  2. Integrated modeling of H-mode tokamak discharges with ASTRA and B2SOLPS numerical codes

    NASA Astrophysics Data System (ADS)

    Senichenkov, I. Yu; Kaveeva, E. G.; Rozhansky, V. A.; Voskoboynikov, S. P.; Molchanov, P. A.; Coster, D. P.; Pereverzev, G. V.; the ASDEX Upgrade Team; the Globus-M Team

    2014-05-01

    The numerical codes ASTRA and B2SOLPS5.2 are coupled to perform integrated modeling of particle and energy transport and to obtain continuous self-consistent profiles of the main plasma parameters from the magnetic axis up to the target plates. The unique distinguishing feature of the new coupling scheme is the presence of a region of overlap between the 1D and 2D computational domains, where the 1D solution coincides with the 2D one at the equatorial midplane. In the 2D transport equation system, all relevant drift flows and currents are taken into account, which allows us to calculate the poloidal variation of the density, temperatures and electrostatic potential, and to obtain neoclassical radial fluxes in a self-consistent manner. Such an approach allows us to model tokamaks for which neoclassical effects give a significant contribution to the ion heat transport, in particular spherical tokamaks.

  3. Simulation study of HL-2A-like plasma using integrated predictive modeling code

    SciTech Connect

    Poolyarat, N.; Onjun, T.; Promping, J.

    2009-11-15

    Self-consistent simulations of HL-2A-like plasma are carried out using the 1.5D BALDUR integrated predictive modeling code. In these simulations, the core transport is predicted using a combination of the Multi-mode (MMM95) anomalous core transport model and the NCLASS neoclassical transport model. The evolution of plasma current, temperature and density is carried out; consequently, the plasma current, temperature and density profiles, as well as other plasma parameters, are obtained as predictions in each simulation. It is found that the temperature and density profiles in these simulations are peaked near the plasma center. In addition, the sawtooth period is studied using the Porcelli model, and it is found that before, during, and after electron cyclotron resonance heating (ECRH) operation the sawtooth period is approximately the same. It is also observed that the mixing radius of sawtooth crashes is reduced during ECRH operation.

  4. Integration of the olfactory code across dendritic claws of single mushroom body neurons

    PubMed Central

    Gruntman, Eyal; Turner, Glenn C.

    2013-01-01

    In the olfactory system, sensory inputs are arranged in different glomerular channels, which respond in combinatorial ensembles to the various chemical features of an odor. Here we investigate where and how this combinatorial code is read out deeper in the brain. We exploit the unique morphology of neurons in the mushroom body (MB), which receive input on large dendritic claws. Imaging odor responses of these dendritic claws shows that input channels with distinct odor tuning converge on individual MB neurons. We determined how these inputs interact to drive the cell to spike threshold using intracellular recordings to examine MB responses to optogenetically controlled input. Our results provide an elegant explanation for the characteristic selectivity of MB neurons: these cells receive different types of input, and require those inputs to be coactive in order to spike. These results establish the MB as an important site of integration in the fly olfactory system. PMID:24141312

  5. Integration Of SIMS Into A General Purpose IBA Data Analysis Code

    SciTech Connect

    Barradas, N. P.; Alves, E.; Alves, L. C.; Likonen, J.; Hakola, A.; Coad, P.; Widdowson, A.

    2011-06-01

    IBA techniques such as RBS, ERDA, NRA, or PIXE are highly complementary, and are often combined to maximize the extracted information. In particular, they have different sensitivities to various elements and probe different depth scales. The same is true for secondary ion mass spectrometry (SIMS), which can have much better detection limits for many species. Quantification of SIMS data normally requires careful calibration of the exact system being studied, and often the results are only semi-quantitative. Nevertheless, when SIMS is used together with other IBA techniques, it is highly desirable to integrate the data analysis. We developed a routine to analyse SIMS data and implemented it in NDF, a standard IBA data analysis code that already supported RBS, ERDA, resonant and non-resonant NRA, and PIXE. Details of this new routine are presented in this work.

  6. A spectral approach integrating functional genomic annotations for coding and noncoding variants.

    PubMed

    Ionita-Laza, Iuliana; McCallum, Kenneth; Xu, Bin; Buxbaum, Joseph D

    2016-02-01

    Over the past few years, substantial effort has been put into the functional annotation of variation in human genome sequences. Such annotations can have a critical role in identifying putatively causal variants for a disease or trait among the abundant natural variation that occurs at a locus of interest. The main challenges in using these various annotations include their large numbers and their diversity. Here we develop an unsupervised approach to integrate these different annotations into one measure of functional importance (Eigen) that, unlike most existing methods, is not based on any labeled training data. We show that the resulting meta-score has better discriminatory ability using disease-associated and putatively benign variants from published studies (in both coding and noncoding regions) than the recently proposed CADD score. Across varied scenarios, the Eigen score performs generally better than any single individual annotation, representing a powerful single functional score that can be incorporated in fine-mapping studies.
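
    A minimal sketch of an unsupervised spectral combination of annotations in the spirit described above: per-variant annotations are standardized and weighted by the leading eigenvector of their correlation matrix (the published Eigen method uses a more refined blockwise estimator, so this is illustrative only):

        import numpy as np

        def spectral_meta_score(A):
            # A: annotation matrix, rows = variants, columns = annotations
            Z = (A - A.mean(axis=0)) / A.std(axis=0)   # standardize annotations
            C = np.corrcoef(Z, rowvar=False)           # annotation correlations
            _, vecs = np.linalg.eigh(C)
            w = np.abs(vecs[:, -1])                    # leading eigenvector
            return Z @ (w / w.sum())                   # one combined score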

  7. The Motivational Interviewing Treatment Integrity Code (MITI 4): Rationale, preliminary reliability and validity

    PubMed Central

    Rowell, L.N.; Manuel, Jennifer K.; Ernst, Denise; Houck, Jon M.

    2017-01-01

    The Motivational Interviewing Treatment Integrity Code has been revised to address new evidence-based elements of motivational interviewing (MI). This new version (MITI 4) includes new global ratings to assess clinicians' attention to client language, increased rigor in assessing autonomy support and client choice, and items to evaluate the use of persuasion when giving information and advice. Method: Four undergraduate, non-professional raters were trained in the MITI and used it to review 50 audiotapes of clinicians conducting MI in actual treatment sessions. Both kappa and intraclass correlation indices were calculated for all coders, for the best rater pair, and for a 20% randomly selected sample from the best rater pair. Results: Reliability estimates across raters, with the exceptions of Emphasize Autonomy and % Complex Reflections, were in the good to excellent range. Reliability estimates decrease when smaller samples are used and when fewer raters contribute. Conclusion: The advantages and drawbacks of this revision are discussed, including implications for research and clinical applications. The MITI 4.0 represents a reliable method for assessing the integrity of MI, including both the technical and relational components of the method. PMID:26874558
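
    For readers unfamiliar with the chance-corrected agreement index reported here, a minimal computation of Cohen's kappa for two raters (toy data only; the study also reports intraclass correlations):

        import numpy as np

        def cohens_kappa(r1, r2, k):
            # r1, r2: category assignments (0..k-1) from two raters
            r1, r2 = np.asarray(r1), np.asarray(r2)
            po = np.mean(r1 == r2)                      # observed agreement
            pe = sum(np.mean(r1 == c) * np.mean(r2 == c)
                     for c in range(k))                 # chance agreement
            return (po - pe) / (1.0 - pe)

        print(cohens_kappa([1, 0, 2, 1, 1], [1, 0, 1, 1, 2], k=3))  # 0.2857...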

  8. Thought Insertion as a Self-Disturbance: An Integration of Predictive Coding and Phenomenological Approaches

    PubMed Central

    Sterzer, Philipp; Mishara, Aaron L.; Voss, Martin; Heinz, Andreas

    2016-01-01

    Current theories in the framework of hierarchical predictive coding propose that positive symptoms of schizophrenia, such as delusions and hallucinations, arise from an alteration in Bayesian inference, the term inference referring to a process by which learned predictions are used to infer probable causes of sensory data. However, for one particularly striking and frequent symptom of schizophrenia, thought insertion, no plausible account has been proposed in terms of the predictive-coding framework. Here we propose that thought insertion is due to an altered experience of thoughts as coming from “nowhere”, as is already indicated by the early 20th century phenomenological accounts by the early Heidelberg School of psychiatry. These accounts identified thought insertion as one of the self-disturbances (from German: “Ichstörungen”) of schizophrenia and used mescaline as a model-psychosis in healthy individuals to explore the possible mechanisms. The early Heidelberg School (Gruhle, Mayer-Gross, Beringer) first named and defined the self-disturbances, and proposed that thought insertion involves a disruption of the inner connectedness of thoughts and experiences, and a “becoming sensory” of those thoughts experienced as inserted. This account offers a novel way to integrate the phenomenology of thought insertion with the predictive coding framework. We argue that the altered experience of thoughts may be caused by a reduced precision of context-dependent predictions, relative to sensory precision. According to the principles of Bayesian inference, this reduced precision leads to increased prediction-error signals evoked by the neural activity that encodes thoughts. Thus, in analogy with the prediction-error related aberrant salience of external events that has been proposed previously, “internal” events such as thoughts (including volitions, emotions and memories) can also be associated with increased prediction-error signaling and are thus imbued
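
    A toy illustration of the precision argument above: in Gaussian predictive coding, the weight given to a sensory prediction error grows as the precision of prior predictions falls relative to sensory precision (numbers purely illustrative):

        def posterior_mean(mu_prior, pi_prior, x_sensory, pi_sensory):
            # Bayes-optimal fusion of a prediction and sensory evidence;
            # the prediction error is weighted by relative sensory precision
            w = pi_sensory / (pi_prior + pi_sensory)
            return mu_prior + w * (x_sensory - mu_prior)

        print(posterior_mean(0.0, 4.0, 1.0, 1.0))   # precise prior: 0.2
        print(posterior_mean(0.0, 0.25, 1.0, 1.0))  # weak prior:    0.8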

  9. iRegNet3D: three-dimensional integrated regulatory network for the genomic analysis of coding and non-coding disease mutations.

    PubMed

    Liang, Siqi; Tippens, Nathaniel D; Zhou, Yaoda; Mort, Matthew; Stenson, Peter D; Cooper, David N; Yu, Haiyuan

    2017-01-18

    The mechanistic details of most disease-causing mutations remain poorly explored within the context of regulatory networks. We present a high-resolution three-dimensional integrated regulatory network (iRegNet3D) in the form of a web tool, where we resolve the interfaces of all known transcription factor (TF)-TF, TF-DNA and chromatin-chromatin interactions for the analysis of both coding and non-coding disease-associated mutations to obtain mechanistic insights into their functional impact. Using iRegNet3D, we find that disease-associated mutations may perturb the regulatory network through diverse mechanisms including chromatin looping. iRegNet3D promises to be an indispensable tool in large-scale sequencing and disease association studies.

  10. Integrated Numerical Experiments (INEX) and the Free-Electron Laser Physical Process Code (FELPPC)

    SciTech Connect

    Thode, L.E.; Chan, K.C.D.; Schmitt, M.J.; McKee, J.; Ostic, J.; Elliott, C.J.; McVey, B.D.

    1990-01-01

    The strong coupling of subsystem elements, such as the accelerator, wiggler, and optics, greatly complicates the understanding and design of a free electron laser (FEL), even at the conceptual level. Given the requirements for high-performance FELs, the strong coupling between the laser subsystems must be included to obtain a realistic picture of the potential operational capability. To address the strong coupling character of the FEL, the concept of an Integrated Numerical Experiment (INEX) was proposed. Unique features of the INEX approach are consistency and numerical equivalence of experimental diagnostics. The equivalent numerical diagnostics mitigate the major problem of misinterpretation that often occurs when theoretical and experimental data are compared. The INEX approach has been applied to a large number of accelerator and FEL experiments. Overall, the agreement between INEX and the experiments is very good. Despite the success of INEX, the approach is difficult to apply to trade-off and initial design studies because of the significant manpower and computational requirements. On the other hand, INEX provides a base from which realistic accelerator, wiggler, and optics models can be developed. The Free Electron Laser Physical Process Code (FELPPC) includes models developed from INEX, provides coupling between the subsystem models, and incorporates application models relevant to a specific trade-off or design study.

  11. Recent Achievements in Characterizing the Histone Code and Approaches to Integrating Epigenomics and Systems Biology.

    PubMed

    Janssen, K A; Sidoli, S; Garcia, B A

    2017-01-01

    Functional epigenetic regulation occurs by dynamic modification of chromatin, including genetic material (i.e., DNA methylation), histone proteins, and other nuclear proteins. Due to the highly complex nature of the histone code, mass spectrometry (MS) has become the leading technique for identification of single and combinatorial histone modifications. MS has now overtaken antibody-based strategies due to its automation, high resolution, and accurate quantitation. Moreover, multiple approaches to analysis have been developed for global quantitation of posttranslational modifications (PTMs), including large-scale characterization of modification coexistence (middle-down and top-down proteomics), which is not currently possible with any other biochemical strategy. Recently, our group and others have simplified and increased the effectiveness of analyzing histone PTMs by improving multiple MS methods and data analysis tools. This review provides an overview of the major achievements in the analysis of histone PTMs using MS, with a focus on the most recent improvements. We speculate that the workflow for histone analysis at its state of the art is highly reliable in terms of identification and quantitation accuracy, and it has the potential to become a routine method for systems biology thanks to the possibility of integrating histone MS results with genomics and proteomics datasets.

  12. Integrated Numerical Experiments (INEX) and the Free-Electron Laser Physical Process Code (FELPPC)

    NASA Astrophysics Data System (ADS)

    Thode, L. E.; Chan, K. C. D.; Schmitt, M. J.; McKee, J.; Ostic, J.; Elliott, C. J.; McVey, B. D.

    The strong coupling of subsystem elements, such as the accelerator, wiggler, and optics, greatly complicates the understanding and design of a free electron laser (FEL), even at the conceptual level. Given the requirements for high-performance FELs, the strong coupling between the laser subsystems must be included to obtain a realistic picture of the potential operational capability. To address the strong coupling character of the FEL, the concept of an Integrated Numerical Experiment (INEX) was proposed. Unique features of the INEX approach are consistency and numerical equivalence of experimental diagnostics. The equivalent numerical diagnostics mitigate the major problem of misinterpretation that often occurs when theoretical and experimental data are compared. The INEX approach has been applied to a large number of accelerator and FEL experiments. Overall, the agreement between INEX and the experiments is very good. Despite the success of INEX, the approach is difficult to apply to trade-off and initial design studies because of the significant manpower and computational requirements. On the other hand, INEX provides a base from which realistic accelerator, wiggler, and optics models can be developed. The Free Electron Laser Physical Process Code (FELPPC) includes models developed from INEX, provides coupling between the subsystem models, and incorporates application models relevant to a specific trade-off or design study.

  13. Integrated coding-aware intra-ONU scheduling for passive optical networks with inter-ONU traffic

    NASA Astrophysics Data System (ADS)

    Li, Yan; Dai, Shifang; Wu, Weiwei

    2016-12-01

    Recently, with soaring traffic among optical network units (ONUs), network coding (NC) has become an appealing technique for improving the performance of passive optical networks (PONs) carrying such inter-ONU traffic. However, in existing NC-based PONs, NC can only be implemented by buffering inter-ONU traffic at the optical line terminal (OLT) to wait for the coding condition to be established; such passive, uncertain waiting severely limits the effectiveness of the NC technique. In this paper, we study integrated coding-aware intra-ONU scheduling, in which the scheduling of inter-ONU traffic within each ONU is undertaken by the OLT to actively facilitate the formation of coded inter-ONU traffic based on the global inter-ONU traffic distribution, so that the performance of PONs with inter-ONU traffic can be significantly improved. We first design two report message patterns and an inter-ONU traffic transmission framework as the basis for integrated coding-aware intra-ONU scheduling. Three specific scheduling strategies are then proposed to adapt to diverse global inter-ONU traffic distributions. The effectiveness of the work is finally evaluated by both theoretical analysis and simulations.
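
    The coding condition referred to above is typically the availability of two packets travelling in opposite directions that can be XOR-ed into a single broadcast; a minimal sketch of that primitive (packet contents hypothetical):

        def xor_code(p1: bytes, p2: bytes) -> bytes:
            # XOR two equal-length packets into one coded packet; each ONU
            # recovers the packet addressed to it by XOR-ing with its own
            return bytes(a ^ b for a, b in zip(p1, p2))

        a, b = b"ONU1->ONU2", b"ONU2->ONU1"
        coded = xor_code(a, b)
        assert xor_code(coded, a) == b and xor_code(coded, b) == a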

  14. Pinstripe: a suite of programs for integrating transcriptomic and proteomic datasets identifies novel proteins and improves differentiation of protein-coding and non-coding genes.

    PubMed

    Gascoigne, Dennis K; Cheetham, Seth W; Cattenoz, Pierre B; Clark, Michael B; Amaral, Paulo P; Taft, Ryan J; Wilhelm, Dagmar; Dinger, Marcel E; Mattick, John S

    2012-12-01

    Comparing transcriptomic data with proteomic data to identify protein-coding sequences is a long-standing challenge in molecular biology, one that is exacerbated by the increasing size of high-throughput datasets. To address this challenge, and thereby to improve the quality of genome annotation and understanding of genome biology, we have developed an integrated suite of programs, called Pinstripe. We demonstrate its application, utility and discovery power using transcriptomic and proteomic data from publicly available datasets. To demonstrate the efficacy of Pinstripe for large-scale analysis, we applied Pinstripe's reverse peptide mapping pipeline to a transcript library including de novo assembled transcriptomes from the human Illumina Body Atlas (IBA2) and GENCODE v10 gene annotations, and the EBI Proteomics Identifications Database (PRIDE) peptide database. This analysis identified 736 canonical open reading frames (ORFs) supported by three or more PRIDE peptide fragments that are positioned outside any known coding DNA sequence (CDS). Because of the unfiltered nature of the PRIDE database and the high probability of false discovery, we further refined this list using independent evidence for translation, including the presence of a Kozak sequence or functional domains, synonymous/non-synonymous substitution ratios and ORF length. Using this integrative approach, we observed evidence of translation from a previously unknown let7e primary transcript, the archetypical lncRNA H19, and a homolog of RD3. Reciprocally, by exclusion of transcripts with mapped peptides or significant ORFs (>80 codons), we identify 32 187 loci with RNAs longer than 2000 nt that are unlikely to encode proteins. Pinstripe (pinstripe.matticklab.com) is freely available as source code or a Mono binary. Pinstripe is written in C# and runs under the Mono framework on Linux or Mac OS X, and under both Mono and .NET on Windows. Contact: m.dinger@garvan.org.au or j.mattick@garvan.org.au.
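
    A minimal sketch of the kind of ORF filter described above (forward strand only, standard start/stop codons, the >80-codon threshold from the abstract; Pinstripe's actual pipeline is far more involved):

        def long_orfs(seq, min_codons=80):
            # return (start, end) of ATG..stop ORFs with at least min_codons
            stops = {"TAA", "TAG", "TGA"}
            orfs = []
            for frame in range(3):
                start = None
                for i in range(frame, len(seq) - 2, 3):
                    codon = seq[i:i + 3]
                    if start is None and codon == "ATG":
                        start = i
                    elif start is not None and codon in stops:
                        if (i - start) // 3 >= min_codons:
                            orfs.append((start, i + 3))
                        start = None
            return orfs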

  15. An Integrated RELAP5-3D and Multiphase CFD Code System Utilizing a Semi Implicit Coupling Technique

    SciTech Connect

    D.L. Aumiller; E.T. Tomlinson; W.L. Weaver

    2001-06-21

    An integrated code system consisting of RELAP5-3D and a multiphase CFD program has been created through the use of a generic semi-implicit coupling algorithm. Unlike previous CFD coupling work, this coupling scheme is numerically stable provided the material Courant limit is not violated in RELAP5-3D or at the coupling locations. The basis for the coupling scheme and details regarding the unique features associated with the application of this technique to a four-field CFD program are presented. Finally, the results of a verification problem are presented. The coupled code system is shown to yield accurate and numerically stable results.
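
    The stability condition quoted above is the material Courant limit; a minimal helper for choosing a coupled-system time step under that constraint (cell sizes and velocities hypothetical):

        def material_courant_dt(dx, u, safety=0.9):
            # largest dt with dt <= dx/|u| in every cell, times a safety factor
            return safety * min(dxi / max(abs(ui), 1e-12)
                                for dxi, ui in zip(dx, u))

        print(material_courant_dt([0.1, 0.05, 0.1], [2.0, 4.0, 1.0]))  # 0.01125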

  16. Self-consistent modeling of DEMOs with 1.5D BALDUR integrated predictive modeling code

    NASA Astrophysics Data System (ADS)

    Wisitsorasak, A.; Somjinda, B.; Promping, J.; Onjun, T.

    2017-02-01

    Self-consistent simulations of four DEMO designs proposed by teams from China, Europe, India, and Korea are carried out using the BALDUR integrated predictive modeling code, in which theory-based models are used for both core transport and boundary conditions. In these simulations, a combination of the NCLASS neoclassical transport and multimode (MMM95) anomalous transport models is used to compute core transport. The boundary is taken to be at the top of the pedestal, where the pedestal values are described using a pedestal temperature model based on a combination of magnetic and flow shear stabilization, pedestal width scaling and an infinite-n ballooning pressure-gradient model, and a pedestal density model based on a line-average density. Even though an optimistic scenario is considered, the simulation results suggest that, with the exclusion of ELMs, the fusion gain Q obtained for these reactors falls short of the original designs, reaching 52% of the design value for the Chinese design, 63% for the European design, 22% for the Korean design, and 26% for the Indian design. The predicted bootstrap current fractions are also found to be lower than in the original designs, at 0.49 (China), 0.66 (Europe), and 0.58 (India) of the design values. Furthermore, regarding sensitivity, it is found that increasing the auxiliary heating power and the electron line-average density above their design values enhances fusion performance. In addition, inclusion of sawtooth oscillation effects has a positive impact on plasma and fusion performance in the European, Indian and Korean DEMOs, but degrades performance in the Chinese DEMO.

  17. Deterministic Multiaxial Creep and Creep Rupture Enhancements for CARES/Creep Integrated Design Code

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.

    1998-01-01

    High temperature and long duration applications of monolithic ceramics can place their failure mode in the creep rupture regime. A previous model advanced by the authors described a methodology by which the creep rupture life of a loaded component can be predicted. That model was based on the life-fraction damage accumulation rule in association with the modified Monkman-Grant creep rupture criterion. However, that model did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage state parameter is basically an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep rupture formulations of the CDM approach are presented in this paper. Parameter estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant-stress states and anisothermal variable-stress conditions. This creep life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), which is a postprocessor program for commercially available finite element analysis (FEA) packages. Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, are used to demonstrate the viability of this methodology.
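
    A minimal sketch of the Kachanov-Rabotnov form described above, integrating damage and creep strain at constant stress until rupture (exponents and coefficients are placeholders in arbitrary units, not fitted ceramic data; a real implementation would use an adaptive time step):

        def creep_rupture(sigma, A=1e-12, n=5.0, B=1e-12, chi=5.0, phi=5.0,
                          dt=0.01):
            # Kachanov-Rabotnov CDM at constant stress:
            #   strain rate: deps/dt = A * sigma**n   / (1 - w)**n
            #   damage rate: dw/dt   = B * sigma**chi / (1 - w)**phi
            # rupture when the damage variable w approaches 1
            w, eps, t = 0.0, 0.0, 0.0
            while w < 0.99:
                eps += dt * A * sigma**n / (1.0 - w)**n
                w += dt * B * sigma**chi / (1.0 - w)**phi
                t += dt
            return t, eps

        print(creep_rupture(100.0))  # rupture time and accumulated strain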

  18. XML-Based Generator of C++ Code for Integration With GUIs

    NASA Technical Reports Server (NTRS)

    Hua, Hook; Oyafuso, Fabiano; Klimeck, Gerhard

    2003-01-01

    An open source computer program has been developed to satisfy a need for simplified organization of structured input data for scientific simulation programs. Typically, such input data are parsed in from a flat American Standard Code for Information Interchange (ASCII) text file into computational data structures. Also typically, when a graphical user interface (GUI) is used, there is a need to completely duplicate the input information while providing it to a user in a more structured form. Heretofore, the duplication of the input information has entailed duplication of software efforts and increased susceptibility to software errors because of the concomitant need to maintain two independent input-handling mechanisms. The present program implements a method in which the input data for a simulation program are completely specified in an Extensible Markup Language (XML)-based text file. The key benefit of XML is that it stores input data in a structured manner; more importantly, XML allows not just storing data but also describing what each data item is. The XML file thus contains information useful for rendering the data by other applications. The program then generates data structures in the C++ language that are to be used in the simulation program. In this method, all input data are specified in one place only, and it is easy to integrate the data structures into both the simulation program and the GUI. XML-to-C is useful in two ways: (1) as an executable, it generates the corresponding C++ classes, and (2) as a library, it automatically fills the objects with the input data values.
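
    A minimal sketch of the idea, assuming a hypothetical input schema: parse an XML parameter specification and emit a matching C++ struct (the real XML-to-C tool's schema and generated classes differ):

        import xml.etree.ElementTree as ET

        XML = """<input>
          <param name="temperature" type="double" value="300.0"/>
          <param name="nsteps"      type="int"    value="1000"/>
        </input>"""

        def generate_cpp(xml_text, classname="InputData"):
            # emit a C++ struct whose members mirror the XML input spec
            root = ET.fromstring(xml_text)
            lines = [f"struct {classname} {{"]
            for p in root.findall("param"):
                lines.append(f'    {p.get("type")} {p.get("name")}'
                             f' = {p.get("value")};')
            lines.append("};")
            return "\n".join(lines)

        print(generate_cpp(XML))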

  19. Integration of QR codes into an anesthesia information management system for resident case log management.

    PubMed

    Avidan, Alexander; Weissman, Charles; Levin, Phillip D

    2015-04-01

    Quick response (QR) codes containing anesthesia syllabus data were introduced into an anesthesia information management system. The code was generated automatically at the conclusion of each case and was available for resident case logging using a smartphone or tablet. The goal of this study was to evaluate the use and usability/user-friendliness of such a system. Resident case logging practices were assessed prior to introducing the QR codes. QR code use and satisfaction among residents were reassessed at three and six months. Before QR code introduction, only 12/23 (52.2%) residents maintained a case log. Most of the remaining residents (9/23, 39.1%) expected to receive a case list from the anesthesia information management system database at the end of their residency. At three months and six months, 17/26 (65.4%) and 15/25 (60.0%) residents, respectively, were using the QR codes. Satisfaction was rated as very good or good. QR codes for residents' case logging with smartphones or tablets were successfully introduced in an anesthesia information management system and used by most residents. QR codes can be successfully implemented into medical practice to support data transfer.
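
    Generating such a per-case QR payload is straightforward; a minimal sketch using the common Python qrcode package (pip install qrcode[pil]) with entirely hypothetical case fields, since the study's AIMS integration is not public:

        import qrcode

        payload = "case=12345;date=2015-04-01;asa=3"  # hypothetical fields
        qrcode.make(payload).save("case_12345.png")   # scan to log the case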

  20. Managing Widely Disparate Code Bases Through Automation of Continuous Integration and Deployment

    NASA Astrophysics Data System (ADS)

    McLaughlin, B. D.; Joshi, T.

    2013-12-01

    NASA EOSDIS tools, services, and service endpoints are widely dispersed across different sub-agencies and sub-organizations. Each of these entities has a different set of skills and widely varying codebases. Some produce sophisticated, well-tested, stable and deployable code, while others are struggling to meet stringent requirements with limited resources. This disparity makes the process of partnering with and deploying code onto the Earthdata platform (https://earthdata.nasa.gov) difficult, even at times impossible. The Earthdata Code Collaborative (ECC) is a project repository and code hosting facility that addresses this problem directly through a three-tiered approach: 1. Provide a standardized set of testing and automation tools for all hosted projects. 2. Regularly report on bugs and features as well as testing coverage and success through Web-based tools. 3. Directly pipeline projects from the ECC into the Earthdata production environment. This session will explain the architecture behind the ECC, including the custom software and 3rd party tools used. It will also detail the process by which decisions were and are being made to arrive at a fully-automated suite of tools and tests that allow any code base to quickly improve its quality and become a candidate for Earthdata inclusion. The session is oriented towards developers, managers, and team members involved in the process of developing, testing, deploying, and ensuring the quality of a code base, whether that code base be tens of millions of lines of code or simply hundreds.

  1. Comparison of experimental pulse-height distributions in germanium detectors with integrated-tiger-series-code predictions

    SciTech Connect

    Beutler, D.E.; Halbleib, J.A. ); Knott, D.P. )

    1989-12-01

    This paper reports pulse-height distributions in two different types of Ge detectors measured for a variety of medium-energy x-ray bremsstrahlung spectra. These measurements have been compared to predictions using the integrated tiger series (ITS) Monte Carlo electron/photon transport code. In general, the authors find excellent agreement between experiments and predictions using no free parameters. These results demonstrate that the ITS codes can predict the combined bremsstrahlung production and energy deposition with good precision (within measurement uncertainties). The one region of disagreement observed occurs for low-energy (<50 keV) photons using low-energy bremsstrahlung spectra. In this case the ITS codes appear to underestimate the produced and/or absorbed radiation by almost an order of magnitude.

  2. Impact of copy number variations burden on coding genome in humans using integrated high resolution arrays.

    PubMed

    Veerappa, Avinash M; Lingaiah, Kusuma; Vishweswaraiah, Sangeetha; Murthy, Megha N; Suresh, Raviraj V; Manjegowda, Dinesh S; Ramachandra, Nallur B

    2014-12-16

    Copy number variations (CNVs) alter the transcriptional and translational levels of genes by disrupting the coding structure, and this burden of CNVs seems to be a significant contributor to phenotypic variations. It was therefore necessary to assess the complexities of the CNV burden on the coding genome. A total of 1715 individuals from 12 populations were used for CNV analysis in the present investigation. Analysis was performed using the Affymetrix Genome-Wide Human SNP Array 6.0 chip and CytoScan High-Density arrays. CNVs were observed far more frequently in the coding region than in the non-coding region, and were enriched in regions containing functional genes (83-96%) compared with regions containing pseudogenes (4-17%). CNVs across the genome of an individual showed multiple hits across many genes whose proteins interact physically and function in the same pathway. We identified varying numbers of proteins and degrees of interactions within protein complexes of single individual genomes. This study represents the first draft of a population-specific CNV gene map as well as a cross-population map. The complex relationship of CNVs with genes and their physically interacting partners unravels many of the complexities involved in phenotype expression. This study identifies four mechanisms contributing to the complexities caused by the presence of multiple CNVs across many genes in the coding part of the genome.

  3. Context-dependent signal integration by the GLI code: the oncogenic load, pathways, modifiers and implications for cancer therapy.

    PubMed

    Aberger, Fritz; Ruiz I Altaba, Ariel

    2014-09-01

    Canonical Hedgehog (HH) signaling leads to the regulation of the GLI code: the sum of all positive and negative functions of all GLI proteins. In humans, the three GLI factors encode context-dependent activities, with GLI1 being mostly an activator and GLI3 often a repressor. Modulation of GLI activity occurs at multiple levels, including by co-factors and by direct modification of GLI structure. Surprisingly, the GLI proteins, and thus the GLI code, are also regulated by multiple inputs beyond HH signaling. In normal development and homeostasis these include a multitude of signaling pathways that regulate proto-oncogenes, which boost positive GLI function, as well as tumor suppressors, which restrict positive GLI activity. In cancer, the acquisition of oncogenic mutations and the loss of tumor suppressors - the oncogenic load - regulates the GLI code toward progressively more activating states. The fine and reversible balance of GLI-activating (GLI(A)) and GLI-repressing (GLI(R)) states is lost in cancer. Here, the acquisition of GLI(A) levels above a given threshold is predicted to lead to advanced malignant stages. In this review we highlight the concepts of the GLI code, the oncogenic load, the context-dependency of GLI action, and different modes of signaling integration, such as that of HH and EGF. Targeting the GLI code directly or indirectly promises therapeutic benefits beyond the direct blockade of individual pathways.

  4. ITS Version 6 : the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.

    SciTech Connect

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2008-04-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes, and (2) conversion to Fortran 90. The general user friendliness of the software has been enhanced through memory allocation to reduce the need for users to modify and recompile the code.

  5. Use of an Integrated Discrete Fracture Network Code for Stochastic Stability Analyses of Fractured Rock Masses

    NASA Astrophysics Data System (ADS)

    Merrien-Soukatchoff, V.; Korini, T.; Thoraval, A.

    2012-03-01

    The paper presents the Discrete Fracture Network code RESOBLOK, which couples geometrical block system construction and a quick iterative stability analysis in the same package. The deterministic or stochastic geometry of a fractured rock mass can be represented and interactively displayed in 3D using two different fracture generators: one mainly used for hydraulic purposes and another designed to allow block stability evaluation. RESOBLOK has downstream modules that can quickly compute stability (based on limit equilibrium or energy-based analysis), display geometric information and create links to other discrete software. The advantage of the code is that it couples stochastic geometrical representation and a quick iterative stability analysis to allow risk-analysis with or without reinforcement and, for the worst cases, more accurate analysis using stress-strain analysis computer codes. These different aspects are detailed for embankment and underground works.

  6. Creation of fully vectorized FORTRAN code for integrating the movement of dust grains in interplanetary environments

    NASA Technical Reports Server (NTRS)

    Colquitt, Walter

    1989-01-01

    The main objective is to improve the performance of a specific FORTRAN computer code from the Planetary Sciences Division of NASA/Johnson Space Center when used on a modern vectorizing supercomputer. The code is used to calculate orbits of dust grains that separate from comets and asteroids. This code accounts for the influences of the Sun and 8 planets (neglecting Pluto), the solar wind, and solar light pressure, including Poynting-Robertson drag. Calculations allow one to study the motion of these particles as they are influenced by the Earth or one of the other planets. Some of these particles become trapped just beyond the Earth for long periods of time. These integer-period resonances vary from 3 orbits of the Earth per 2 orbits of the particle to ratios as high as 14 to 13.
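
    A minimal sketch of the force model named above, integrating a dust grain under solar gravity reduced by radiation pressure (beta) plus Poynting-Robertson drag with a standard RK4 step; planetary perturbations and the solar wind are omitted, and beta is a hypothetical grain parameter:

        import numpy as np

        GM = 4.0 * np.pi**2   # heliocentric units: AU, yr, solar masses
        C = 63241.0           # speed of light [AU/yr]

        def accel(r, v, beta):
            d = np.linalg.norm(r)
            rhat = r / d
            grav = -GM * (1.0 - beta) / d**2 * rhat       # reduced gravity
            pr = -(beta * GM / (C * d**2)) * (np.dot(v, rhat) * rhat + v)
            return grav + pr                              # PR drag included

        def rk4_step(r, v, dt, beta):
            k1r, k1v = v, accel(r, v, beta)
            k2r, k2v = v + 0.5*dt*k1v, accel(r + 0.5*dt*k1r, v + 0.5*dt*k1v, beta)
            k3r, k3v = v + 0.5*dt*k2v, accel(r + 0.5*dt*k2r, v + 0.5*dt*k2v, beta)
            k4r, k4v = v + dt*k3v, accel(r + dt*k3r, v + dt*k3v, beta)
            return (r + dt*(k1r + 2*k2r + 2*k3r + k4r)/6.0,
                    v + dt*(k1v + 2*k2v + 2*k3v + k4v)/6.0)

        r, v = np.array([1.0, 0.0]), np.array([0.0, 2.0*np.pi])  # circular start
        for _ in range(3650):
            r, v = rk4_step(r, v, 1e-3, beta=0.05)  # slow inspiral over ~3.65 yr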

  7. Generation of 238U Covariance Matrices by Using the Integral Data Assimilation Technique of the CONRAD Code

    NASA Astrophysics Data System (ADS)

    Privas, E.; Archier, P.; Bernard, D.; De Saint Jean, C.; Destouche, C.; Leconte, P.; Noguère, G.; Peneliau, Y.; Capote, R.

    2016-02-01

    A new IAEA Coordinated Research Project (CRP) aims to test, validate and improve the IRDF library. Among the isotopes of interest, the modelling of the 238U capture and fission cross sections represents a challenging task. A new description of the 238U neutron-induced reactions in the fast energy range is in progress in the framework of an IAEA evaluation consortium. The Nuclear Data group of Cadarache participates in this effort, utilizing the 238U spectral index measurements and Post Irradiated Experiments (PIE) carried out in the fast reactors MASURCA (CEA Cadarache) and PHENIX (CEA Marcoule). Such a collection of experimental results provides reliable integral information on the (n,γ) and (n,f) cross sections. This paper presents the Integral Data Assimilation (IDA) technique of the CONRAD code used to propagate the uncertainties of the integral data onto the 238U cross sections of interest for dosimetry applications.
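
    At its core, such integral data assimilation is a generalized-least-squares update of the prior cross-section covariance; a minimal sketch with assumed sensitivities (CONRAD's actual treatment, including marginalization of systematic uncertainties, is richer):

        import numpy as np

        def assimilate(M, S, V, resid):
            # M: prior covariance of the multigroup cross sections
            # S: sensitivities of the integral quantities (spectral indices)
            # V: experimental covariance; resid: measured minus calculated
            K = M @ S.T @ np.linalg.inv(S @ M @ S.T + V)
            d_sigma = K @ resid            # adjustment of the cross sections
            M_post = M - K @ S @ M         # posterior covariance matrix
            sd = np.sqrt(np.diag(M_post))
            corr = M_post / np.outer(sd, sd)   # posterior correlations
            return d_sigma, M_post, corr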

  8. Design of time-pulse coded optoelectronic neuronal elements for nonlinear transformation and integration

    NASA Astrophysics Data System (ADS)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Lazareva, Maria V.

    2008-03-01

    The paper demonstrates the relevance of neurophysiologically motivated neuron arrays with flexibly programmable functions and operations, with the possibility to select the required accuracy and type of nonlinear transformation and learning. We consider the neuron design and simulation results of multichannel spatio-temporal algebraic accumulation and integration of optical signals. Advantages for nonlinear transformation and summation-integration are shown. The offered circuits are simple and can have intellectual properties such as learning and adaptation. The integrator-neuron is based on CMOS current mirrors and comparators. The performance is as follows: power consumption, 100...500 μW; signal period, 0.1...1 ms; input optical signal power, 0.2...20 μW; time delays, less than 1 μs; number of optical signals, 2...10; integration time, 10...100 signal periods; integration accuracy (error), about 1%. Various modifications of the neuron-integrators with improved performance and for different applications are considered in the paper.

  9. Semantic Integration and Age of Acquisition Effects in Code-Blend Comprehension

    ERIC Educational Resources Information Center

    Giezen, Marcel R.; Emmorey, Karen

    2016-01-01

    Semantic and lexical decision tasks were used to investigate the mechanisms underlying code-blend facilitation: the finding that hearing bimodal bilinguals comprehend signs in American Sign Language (ASL) and spoken English words more quickly when they are presented together simultaneously than when each is presented alone. More robust…

  11. Integrated genome analysis suggests that most conserved non-coding sequences are regulatory factor binding sites

    PubMed Central

    Hemberg, Martin; Gray, Jesse M.; Cloonan, Nicole; Kuersten, Scott; Grimmond, Sean; Greenberg, Michael E.; Kreiman, Gabriel

    2012-01-01

    More than 98% of a typical vertebrate genome does not code for proteins. Although non-coding regions are sprinkled with short (<200 bp) islands of evolutionarily conserved sequences, the function of most of these unannotated conserved islands remains unknown. One possibility is that unannotated conserved islands could encode non-coding RNAs (ncRNAs); alternatively, unannotated conserved islands could serve as promoter-distal regulatory factor binding sites (RFBSs) like enhancers. Here we assess these possibilities by comparing unannotated conserved islands in the human and mouse genomes to transcribed regions and to RFBSs, relying on a detailed case study of one human and one mouse cell type. We define transcribed regions by applying a novel transcript-calling algorithm to RNA-Seq data obtained from total cellular RNA, and we define RFBSs using ChIP-Seq and DNase-hypersensitivity assays. We find that unannotated conserved islands are four times more likely to coincide with RFBSs than with unannotated ncRNAs. Thousands of conserved RFBSs can be categorized as insulators based on the presence of CTCF or as enhancers based on the presence of p300/CBP and H3K4me1. While many unannotated conserved RFBSs are transcriptionally active to some extent, the transcripts produced tend to be unspliced, non-polyadenylated and expressed at levels 10- to 100-fold lower than annotated coding or ncRNAs. Extending these findings across multiple cell types and tissues, we propose that most conserved non-coding genomic DNA in vertebrate genomes corresponds to promoter-distal regulatory elements. PMID:22684627

  12. Integrative Analysis of Normal Long Intergenic Non-Coding RNAs in Prostate Cancer

    PubMed Central

    Bawa, Pushpinder; Zackaria, Sajna; Verma, Mohit; Gupta, Saurabh; Srivatsan, R; Chaudhary, Bibha; Srinivasan, Subhashini

    2015-01-01

    Recently, large numbers of normal human tissues have been profiled for non-coding RNAs, and more than fourteen thousand long intergenic non-coding RNAs (lincRNAs) have been found expressed in normal human tissues. The functional roles of these normal lincRNAs (nlincRNAs) in the regulation of protein-coding genes in normal and disease biology are yet to be established. Here, we have profiled two RNA-seq datasets including cancer and matched non-neoplastic tissues from 12 individuals of diverse demographics for both coding genes and nlincRNAs. We find 130 nlincRNAs significantly regulated in cancer, with 127 regulated in the same direction in the two datasets. Interestingly, according to the Illumina Body Map, significant numbers of these nlincRNAs display baseline null expression in normal prostate tissue but are specific to other tissues such as thyroid, kidney, liver and testis. A number of the regulated nlincRNAs share loci with coding genes, which are either co-regulated or oppositely regulated in all cancer samples studied here. For example, in all cancer samples i) the nlincRNA TCONS_00029157 and a neighboring tumor suppressor factor, SIK1, are both down-regulated; ii) several thyroid-specific nlincRNAs in the neighborhood of the thyroid-specific gene TPO are up-regulated; and iii) TCONS_00010581, an isoform of HEIH, is down-regulated while the neighboring EZH2 gene is up-regulated in cancer. Several nlincRNAs from a prostate cancer associated chromosomal locus, 8q24, are up-regulated in cancer along with other known prostate cancer associated genes including PCAT-1, PVT1, and PCAT-92. We observe a significant bias towards up-regulation of nlincRNAs, with as many as 118 out of 127 up-regulated in cancer, even though regulation of coding genes is skewed towards down-regulation. Considering that all reported cancer associated lincRNAs (clincRNAs) are biased towards up-regulation, we conclude that this bias may be functionally relevant. PMID:25933431

  13. Validating a Monotonically-Integrated Large Eddy Simulation Code for Subsonic Jet Acoustics

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Bridges, James

    2017-01-01

    The results of subsonic jet validation cases for the Naval Research Lab's Jet Engine Noise REduction (JENRE) code are reported. Two set points from the Tanna matrix, set point 3 (Ma = 0.5, unheated) and set point 7 (Ma = 0.9, unheated), are attempted on three different meshes. After a brief discussion of the JENRE code and the meshes constructed for this work, the turbulent statistics for the axial velocity are presented and compared to experimental data, with favorable results. Preliminary simulations for set point 23 (Ma = 0.5, Tj/T∞ = 1.764) on one of the meshes are also described. Finally, the proposed configuration for the far-field noise prediction with JENRE's Ffowcs Williams-Hawkings solver is detailed.

  14. Correlation and synchrony transfer in integrate-and-fire neurons: basic properties and consequences for coding.

    PubMed

    Shea-Brown, Eric; Josić, Kresimir; de la Rocha, Jaime; Doiron, Brent

    2008-03-14

    We study how pairs of neurons transfer correlated input currents into correlated spikes. Over rapid time scales, correlation transfer increases with both spike time variability and rate; the dependence on variability disappears at large time scales. This persists for a nonlinear membrane model and for heterogeneous cell pairs, but strong nonmonotonicities follow from refractory effects. We present consequences for population coding and for the encoding of time-varying stimuli.
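
    A minimal simulation of the setup described, assuming leaky integrate-and-fire dynamics driven by partially shared Gaussian noise: two model neurons receive correlated input currents, and the correlation of their spike counts in 100-ms windows is measured. All parameter values below are illustrative, not taken from the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      dt, T = 1e-4, 20.0                   # time step and duration [s]
      tau, vth, vr = 0.02, 1.0, 0.0        # membrane time constant, threshold, reset
      mu, sigma, c = 1.1, 0.5, 0.3         # drive, noise amplitude, input correlation

      n = int(T / dt)
      common = rng.standard_normal(n)      # noise shared by both neurons
      counts = []
      for _ in range(2):
          private = rng.standard_normal(n)
          xi = np.sqrt(c) * common + np.sqrt(1 - c) * private
          v, spikes = 0.0, np.zeros(n, bool)
          for k in range(n):
              v += dt / tau * (mu - v) + sigma * np.sqrt(dt / tau) * xi[k]
              if v >= vth:
                  v, spikes[k] = vr, True
          # spike counts in 100-ms windows
          counts.append(spikes.reshape(-1, int(0.1 / dt)).sum(axis=1))

      rho = np.corrcoef(counts[0], counts[1])[0, 1]
      print("output spike-count correlation:", rho)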

  15. NCAD, a database integrating the intrinsic conformational preferences of non-coded amino acids

    PubMed Central

    Revilla-López, Guillem; Torras, Juan; Curcó, David; Casanovas, Jordi; Calaza, M. Isabel; Zanuy, David; Jiménez, Ana I.; Cativiela, Carlos; Nussinov, Ruth; Grodzinski, Piotr; Alemán, Carlos

    2010-01-01

    Peptides and proteins find an ever-increasing number of applications in the biomedical and materials engineering fields. The use of non-proteinogenic amino acids endowed with diverse physicochemical and structural features opens the possibility to design proteins and peptides with novel properties and functions. Moreover, non-proteinogenic residues are particularly useful to control the three-dimensional arrangement of peptidic chains, which is a crucial issue for most applications. However, information regarding such amino acids –also called non-coded, non-canonical or non-standard– is usually scattered among publications specialized in quite diverse fields as well as in patents. Making all these data useful to the scientific community requires new tools and a framework for their assembly and coherent organization. We have successfully compiled, organized and built a database (NCAD, Non-Coded Amino acids Database) containing information about the intrinsic conformational preferences of non-proteinogenic residues determined by quantum mechanical calculations, as well as bibliographic information about their synthesis, physical and spectroscopic characterization, conformational propensities established experimentally, and applications. The architecture of the database is presented in this work together with the first family of non-coded residues included, namely, α-tetrasubstituted α-amino acids. Furthermore, the NCAD usefulness is demonstrated through a test-case application example. PMID:20455555

  16. Orbitofrontal Cortex Signals Expected Outcomes with Predictive Codes When Stable Contingencies Promote the Integration of Reward History.

    PubMed

    Riceberg, Justin S; Shapiro, Matthew L

    2017-02-22

    Memory can inform goal-directed behavior by linking current opportunities to past outcomes. The orbitofrontal cortex (OFC) may guide value-based responses by integrating the history of stimulus-reward associations into expected outcomes, representations of predicted hedonic value and quality. Alternatively, the OFC may rapidly compute flexible "online" reward predictions by associating stimuli with the latest outcome. OFC neurons develop predictive codes when rats learn to associate arbitrary stimuli with outcomes, but the extent to which predictive coding depends on most recent events and the integrated history of rewards is unclear. To investigate how reward history modulates OFC activity, we recorded OFC ensembles as rats performed spatial discriminations that differed only in the number of rewarded trials between goal reversals. The firing rate of single OFC neurons distinguished identical behaviors guided by different goals. When >20 rewarded trials separated goal switches, OFC ensembles developed stable and anticorrelated population vectors that predicted overall choice accuracy and the goal selected in single trials. When <10 rewarded trials separated goal switches, OFC population vectors decorrelated rapidly after each switch, but did not develop anticorrelated firing patterns or predict choice accuracy. The results show that, whereas OFC signals respond rapidly to contingency changes, they predict choices only when reward history is relatively stable, suggesting that consecutive rewarded episodes are needed for OFC computations that integrate reward history into expected outcomes. SIGNIFICANCE STATEMENT Adapting to changing contingencies and making decisions engages the orbitofrontal cortex (OFC). Previous work shows that OFC function can either improve or impair learning depending on reward stability, suggesting that OFC guides behavior optimally when contingencies apply consistently. The mechanisms that link reward history to OFC computations remain to be determined.

  17. Assessment of Spacecraft Systems Integration Using the Electric Propulsion Interactions Code (EPIC)

    NASA Technical Reports Server (NTRS)

    Mikellides, Ioannis G.; Kuharski, Robert A.; Mandell, Myron J.; Gardner, Barbara M.; Kauffman, William J. (Technical Monitor)

    2002-01-01

    SAIC is currently developing the Electric Propulsion Interactions Code 'EPIC', an interactive computer tool that allows the construction of a 3-D spacecraft model, and the assessment of interactions between its subsystems and the plume from an electric thruster. EPIC unites different computer tools to address the complexity associated with the interaction processes. This paper describes the overall architecture and capability of EPIC including the physics and algorithms that comprise its various components. Results from selected modeling efforts of different spacecraft-thruster systems are also presented.

  18. Integrating industry nuclear codes and standards into United States Department of Energy facilities

    SciTech Connect

    Jacox, J.

    1995-02-01

    Recently the United States Department of Energy (DOE) has mandated that facilities under its jurisdiction use various industry codes and standards developed for civilian power reactors that operate under U.S. Nuclear Regulatory Commission license. While this is a major step forward in putting all U.S. nuclear facilities under common technical standards, there are always problems associated with implementing such advances. This paper discusses some of the advantages and problems experienced to date. These include the universal challenge of educating new users of any technical document, the repetition of errors made by NRC-licensed facilities over the years, and some problems unique to DOE facilities.

  19. Are interaural time and level differences represented by independent or integrated codes in the human auditory cortex?

    PubMed

    Edmonds, Barrie A; Krumbholz, Katrin

    2014-02-01

    Sound localization is important for orienting and focusing attention and for segregating sounds from different sources in the environment. In humans, horizontal sound localization mainly relies on interaural differences in sound arrival time and sound level. Despite their perceptual importance, the neural processing of interaural time and level differences (ITDs and ILDs) remains poorly understood. Animal studies suggest that, in the brainstem, ITDs and ILDs are processed independently by different specialized circuits. The aim of the current study was to investigate whether, at higher processing levels, they remain independent or are integrated into a common code of sound laterality. For that, we measured late auditory cortical potentials in response to changes in sound lateralization elicited by perceptually matched changes in ITD and/or ILD. The responses to the ITD and ILD changes exhibited significant morphological differences. At the same time, however, they originated from overlapping areas of the cortex and showed clear evidence for functional coupling. These results suggest that the auditory cortex contains an integrated code of sound laterality, but also retains independent information about ITD and ILD cues. This cue-related information might be used to assess how consistent the cues are, and thus, how likely they would have arisen from the same source.

  20. LORD: a phenotype-genotype semantically integrated biomedical data tool to support rare disease diagnosis coding in health information systems

    PubMed Central

    Choquet, Remy; Maaroufi, Meriem; Fonjallaz, Yannick; de Carrara, Albane; Vandenbussche, Pierre-Yves; Dhombres, Ferdinand; Landais, Paul

    2015-01-01

    Characterizing a rare disease diagnosis for a given patient is often done through experts' networks. It is a complex task that can evolve over time depending on the natural history of the disease and the evolution of scientific knowledge. Most rare diseases have genetic causes, and recent improvements in sequencing techniques contribute to the discovery of many new diseases every year. Diagnosis coding in the rare disease field requires data from multiple knowledge bases to be aggregated in order to offer the clinician a global information space, from possible diagnoses to clinical signs (phenotypes) and known genetic mutations (genotype). Nowadays, the major barrier to the coding activity is the lack of consolidation of such information, which is scattered across different thesauri such as Orphanet, OMIM or HPO. The Linking Open data for Rare Diseases (LORD) web portal we developed stands as the first attempt to fill this gap by offering an integrated view of 8,400 rare diseases linked to more than 14,500 signs and 3,270 genes. The application provides a browsing feature to navigate through the relationships between diseases, signs and genes, and some Application Programming Interfaces to help its integration in health information systems in routine use. PMID:26958175

  2. Integration of Expressed Sequence Tag Data Flanking Predicted RNA Secondary Structures Facilitates Novel Non-Coding RNA Discovery

    PubMed Central

    Krzyzanowski, Paul M.; Price, Feodor D.; Muro, Enrique M.; Rudnicki, Michael A.; Andrade-Navarro, Miguel A.

    2011-01-01

    Many computational methods have been used to predict novel non-coding RNAs (ncRNAs), but none, to our knowledge, have explicitly investigated the impact of integrating existing cDNA-based Expressed Sequence Tag (EST) data that flank structural RNA predictions. To determine whether flanking EST data can assist in microRNA (miRNA) prediction, we identified genomic sites encoding putative miRNAs by combining functional RNA predictions with flanking EST data in a model consistent with miRNAs undergoing cleavage during maturation. In both human and mouse genomes, we observed that the inclusion of flanking ESTs adjacent to and not overlapping predicted miRNAs significantly improved the performance of various methods of miRNA prediction, including direct high-throughput sequencing of small RNA libraries. We analyzed the expression of hundreds of miRNAs predicted to be expressed during myogenic differentiation using a customized microarray and identified several known and predicted myogenic miRNA hairpins. Our results indicate that integrating ESTs flanking structural RNA predictions improves the quality of cleaved miRNA predictions and suggest that this strategy can be used to predict other non-coding RNAs undergoing cleavage during maturation. PMID:21698286
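
    The core genomic test, as described, is to keep ESTs that lie adjacent to, but do not overlap, a predicted hairpin. A simple interval check along those lines might look like the following Python sketch; the coordinates and the gap threshold are hypothetical.

      def flanked(hairpin, ests, max_gap=50):
          """Return ESTs that lie within max_gap bp of a predicted hairpin
          without overlapping it, consistent with cleavage during maturation."""
          h_start, h_end = hairpin
          hits = []
          for e_start, e_end in ests:
              overlaps = e_start <= h_end and e_end >= h_start
              upstream = 0 < h_start - e_end <= max_gap
              downstream = 0 < e_start - h_end <= max_gap
              if not overlaps and (upstream or downstream):
                  hits.append((e_start, e_end))
          return hits

      # hairpin at 1000-1085: keeps the two adjacent ESTs, drops the overlapping one
      print(flanked((1000, 1085), [(900, 980), (1040, 1100), (1120, 1300)]))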

  3. Integration of Multifidelity Multidisciplinary Computer Codes for Design and Analysis of Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Geiselhart, Karl A.; Ozoroski, Lori P.; Fenbert, James W.; Shields, Elwood W.; Li, Wu

    2011-01-01

    This paper documents the development of a conceptual-level integrated process for design and analysis of efficient and environmentally acceptable supersonic aircraft. To overcome the technical challenges in achieving this goal, a conceptual design capability is needed that provides users with the ability to examine the integrated solution across all disciplines and facilitates the application of multidisciplinary design, analysis, and optimization on a scale greater than previously achieved. The described capability is both an interactive design environment and a high-powered optimization system, with a unique blend of low-, mixed- and high-fidelity engineering tools combined in the software integration framework ModelCenter. The various modules are described and capabilities of the system are demonstrated. The current limitations and proposed future enhancements are also discussed.

  4. On the integration of equations of motion for particle-in-cell codes

    SciTech Connect

    Fuchs, V.

    2006-05-01

    An area-preserving implementation of the 2nd-order Runge-Kutta integration method for equations of motion is presented. For forces independent of velocity the scheme possesses the same numerical simplicity and stability as the leapfrog method, and it is not implicit for forces which do depend on velocity. It can therefore be easily applied where the leapfrog method in general cannot. We discuss the stability of the new scheme and test its performance in calculations of particle motion in three cases of interest. First, in the ubiquitous and numerically demanding example of nonlinear interaction of particles with a propagating plane wave; second, in the case of particle motion in a static magnetic field; and third, in a nonlinear dissipative case leading to a limit cycle. We compare computed orbits with exact orbits and with results from the leapfrog and other low-order integration schemes. Of special interest is the role of intrinsic stochasticity introduced by time differencing, which can destroy orbits of an otherwise exactly integrable system and therefore constitutes a restriction on the applicability of an integration scheme in such a context [A. Friedman, S.P. Auerbach, J. Comput. Phys. 93 (1991) 171]. In particular, we show that for a plane wave the new scheme proposed herein can be reduced to a symmetric standard map. This leads to the nonlinear stability condition δt·ω_B ≤ 1, where δt is the time step and ω_B the particle bounce frequency.
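
    The role of the time step in stability conditions of this kind can be illustrated with plain leapfrog (velocity-Verlet) on the test oscillator x'' = -ω²x, which stays bounded for ω·δt ≤ 2 and blows up beyond it. The Python sketch below shows that threshold empirically; it demonstrates the generic phenomenon, not the paper's area-preserving RK2 scheme.

      import numpy as np

      def leapfrog(omega, dt, steps, x0=1.0, v0=0.0):
          """Kick-drift-kick leapfrog for the test oscillator x'' = -omega^2 x."""
          x, v = x0, v0
          for _ in range(steps):
              v -= 0.5 * dt * omega**2 * x
              x += dt * v
              v -= 0.5 * dt * omega**2 * x
          return x, v

      for dt_omega in (0.5, 1.0, 1.9, 2.1):     # stable below 2, unstable above
          x, v = leapfrog(1.0, dt_omega, 1000)
          energy = 0.5 * v**2 + 0.5 * x**2      # conserved quantity (exactly 0.5)
          print(f"omega*dt = {dt_omega}: energy after 1000 steps = {energy:.3e}")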

  5. Modeling development of natural multi-sensory integration using neural self-organisation and probabilistic population codes

    NASA Astrophysics Data System (ADS)

    Bauer, Johannes; Dávila-Chacón, Jorge; Wermter, Stefan

    2015-10-01

    Humans and other animals have been shown to perform near-optimally in multi-sensory integration tasks. Probabilistic population codes (PPCs) have been proposed as a mechanism by which optimal integration can be accomplished. Previous approaches have focussed on how neural networks might produce PPCs from sensory input or perform calculations using them, like combining multiple PPCs. Less attention has been given to the question of how the necessary organisation of neurons can arise and how the required knowledge about the input statistics can be learned. In this paper, we propose a model of learning multi-sensory integration based on an unsupervised learning algorithm in which an artificial neural network learns the noise characteristics of each of its sources of input. Our algorithm borrows from the self-organising map the ability to learn latent-variable models of the input and extends it to learning to produce a PPC approximating a probability density function over the latent variable behind its (noisy) input. The neurons in our network are only required to perform simple calculations and we make few assumptions about input noise properties and tuning functions. We report on a neurorobotic experiment in which we apply our algorithm to multi-sensory integration in a humanoid robot to demonstrate its effectiveness and compare it to human multi-sensory integration on the behavioural level. We also show in simulations that our algorithm performs near-optimally under certain plausible conditions, and that it reproduces important aspects of natural multi-sensory integration on the neural level.
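
    The paper's algorithm extends the self-organising map to produce probabilistic population codes; the Python sketch below shows only the self-organising-map backbone it borrows from, with units learning to tile a one-dimensional latent variable from noisy observations. All parameters are illustrative.

      import numpy as np

      rng = np.random.default_rng(0)
      n_units, n_steps = 20, 5000
      w = rng.uniform(-1, 1, n_units)            # 1-D map over a scalar latent variable

      for t in range(n_steps):
          x_true = rng.uniform(-1, 1)            # latent variable (e.g. sound azimuth)
          x_obs = x_true + rng.normal(0, 0.1)    # noisy sensory observation
          bmu = np.argmin(np.abs(w - x_obs))     # best-matching unit
          lr = 0.5 * (1 - t / n_steps)           # decaying learning rate
          for i in range(n_units):               # neighbourhood-weighted update
              h = np.exp(-((i - bmu) ** 2) / (2 * 2.0 ** 2))
              w[i] += lr * h * (x_obs - w[i])

      print(np.sort(w))                          # units end up tiling the latent space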

  6. Hybrid information privacy system: integration of chaotic neural network and RSA coding

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Willey, Jeff; Lee, Ting N.; Szu, Harold H.

    2005-03-01

    Electronic mail is adopted worldwide, and most messages are easily hacked. In this paper, we propose a free, fast and convenient hybrid privacy system to protect email communication. The privacy system is implemented by combining the RSA public-key algorithm with a specific chaotic-neural-network encryption process. The receiver can decrypt a received email as long as it can reproduce the specified chaotic neural network series, the so-called spatial-temporal keys. The chaotic typing and initial seed value of the chaotic neural network series, encrypted by the RSA algorithm, reproduce the spatial-temporal keys. The encrypted chaotic typing and initial seed value are hidden in a watermark mixed nonlinearly with the message media and wrapped with convolutional error-correction codes for wireless 3rd-generation cellular phones. The message media can be an arbitrary image. Pattern noise has to be considered during transmission because it could affect or change the spatial-temporal keys. Since any change or modification of the chaotic typing or initial seed value of the chaotic neural network series is not acceptable, the RSA codec system must be robust and fault-tolerant over the wireless channel. The robust and fault-tolerant properties of chaotic neural networks (CNN) were proved by a field theory of associative memory by Szu in 1997. The 1-D chaos-generating nodes from the logistic map having arbitrarily negative slope a = p/q, generating the N-shaped sigmoid, were first given by Szu in 1992. In this paper, we simulate the robust and fault-tolerance properties of CNN under additive noise and pattern noise. We also implement a private version of RSA coding and the chaos encryption process on messages.
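
    A toy Python sketch of the two ingredients combined here, under strong simplifications: a textbook RSA key pair protects the chaotic seed, and a logistic-map keystream regenerated from that seed encrypts the message. This is illustration only (tiny primes, no watermarking, no chaotic neural network), not the paper's system.

      # --- textbook RSA with toy primes (illustration only, not secure) ---
      p, q, e = 61, 53, 17
      n, phi = p * q, (p - 1) * (q - 1)
      d = pow(e, -1, phi)                      # modular inverse (Python 3.8+)

      seed = 1234                              # chaotic initial seed to protect
      cipher_seed = pow(seed, e, n)            # sender encrypts the seed with RSA
      assert pow(cipher_seed, d, n) == seed    # receiver recovers it

      # --- logistic-map keystream regenerated from the shared seed ---
      def keystream(seed, nbytes, a=3.99):
          x = (seed % 1000) / 1000.0 + 1e-4    # map the seed into (0, 1)
          out = bytearray()
          for _ in range(nbytes):
              x = a * x * (1.0 - x)            # chaotic logistic map
              out.append(int(x * 256) % 256)
          return bytes(out)

      msg = b"meet at dawn"
      ks = keystream(seed, len(msg))
      enc = bytes(m ^ k for m, k in zip(msg, ks))
      dec = bytes(c ^ k for c, k in zip(enc, keystream(seed, len(enc))))
      print(dec)                               # b'meet at dawn'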

  7. INTEGRATING CLINICAL LABORATORY MEASURES AND ICD-9 CODE DIAGNOSES IN PHENOME-WIDE ASSOCIATION STUDIES

    PubMed Central

    Verma, Anurag; Leader, Joseph B.; Verma, Shefali S.; Frase, Alex; Wallace, John; Dudek, Scott; Lavage, Daniel R.; Van Hout, Cristopher V.; Dewey, Frederick E.; Penn, John; Lopez, Alex; Overton, John D.; Carey, David J.; Ledbetter, David H.; Kirchner, H. Lester; Ritchie, Marylyn D.; Pendergrass, Sarah A.

    2016-01-01

    Electronic health records (EHR) provide a comprehensive resource for discovery, allowing unprecedented exploration of the impact of genetic architecture on health and disease. The data of EHRs also allow for exploration of the complex interactions between health measures across health and disease. The discoveries arising from EHR based research provide important information for the identification of genetic variation for clinical decision-making. Due to the breadth of information collected within the EHR, a challenge for discovery using EHR based data is the development of high-throughput tools that expose important areas of further research, from genetic variants to phenotypes. Phenome-Wide Association studies (PheWAS) provide a way to explore the association between genetic variants and comprehensive phenotypic measurements, generating new hypotheses and also exposing the complex relationships between genetic architecture and outcomes, including pleiotropy. EHR based PheWAS have mainly evaluated associations with case/control status from International Classification of Disease, Ninth Edition (ICD-9) codes. While these studies have highlighted discovery through PheWAS, the rich resource of clinical lab measures collected within the EHR can be better utilized for high-throughput PheWAS analyses and discovery. To better use these resources and enrich PheWAS association results we have developed a sound methodology for extracting a wide range of clinical lab measures from EHR data. We have extracted a first set of 21 clinical lab measures from the de-identified EHR of participants of the Geisinger MyCode™ biorepository, and calculated the median of these lab measures for 12,039 subjects. Next we evaluated the association between these 21 clinical lab median values and 635,525 genetic variants, performing a genome-wide association study (GWAS) for each of 21 clinical lab measures. We then calculated the association between SNPs from these GWAS passing our Bonferroni
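
    The association scan described (one GWAS per lab measure) reduces, per SNP, to a regression of the median lab value on genotype dosage with a Bonferroni-corrected significance threshold. A Python sketch on simulated data, with all sizes and effect values hypothetical:

      import numpy as np
      from scipy.stats import linregress

      rng = np.random.default_rng(0)
      n_subjects, n_snps = 2000, 5000
      alpha = 0.05 / n_snps                    # Bonferroni threshold

      genotypes = rng.integers(0, 3, size=(n_snps, n_subjects))   # 0/1/2 dosages
      lab = rng.normal(100.0, 15.0, n_subjects)                   # median lab value
      lab = lab + 2.5 * genotypes[42]                             # one planted signal

      hits = []
      for j in range(n_snps):                  # one regression per SNP
          res = linregress(genotypes[j], lab)
          if res.pvalue < alpha:
              hits.append((j, res.slope, res.pvalue))
      print(hits)                              # should recover SNP 42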

  8. Integral and Separate Effects Tests for Thermal Hydraulics Code Validation for Liquid-Salt Cooled Nuclear Reactors

    SciTech Connect

    Peterson, Per

    2012-10-30

    The objective of the 3-year project was to collect integral effects test (IET) data to validate the RELAP5-3D code and other thermal hydraulics codes for use in predicting the transient thermal hydraulics response of liquid salt cooled reactor systems, including integral transient response for forced and natural circulation operation. The reference system for the project is a modular, 900-MWth Pebble Bed Advanced High Temperature Reactor (PB-AHTR), a specific type of Fluoride salt-cooled High temperature Reactor (FHR). Two experimental facilities were developed for thermal-hydraulic integral effects tests (IETs) and separate effects tests (SETs). The facilities use simulant fluids for the liquid fluoride salts, with very little distortion to the heat transfer and fluid dynamics behavior. The CIET Test Bay facility was designed, built, and operated. IET data for steady state and transient natural circulation was collected. SET data for convective heat transfer in pebble beds and straight channel geometries was collected. The facility continues to be operational and will be used for future experiments, and for component development. The CIET 2 facility is larger in scope, and its construction and operation has a longer timeline than the duration of this grant. The design for the CIET 2 facility has drawn heavily on the experience and data collected on the CIET Test Bay, and it was completed in parallel with operation of the CIET Test Bay. CIET 2 will demonstrate start-up and shut-down transients and control logic, in addition to LOFC and LOHS transients, and buoyant shut down rod operation during transients. Design of the CIET 2 Facility is complete, and engineering drawings have been submitted to an external vendor for outsourced quality controlled construction. CIET 2 construction and operation continue under another NEUP grant. IET data from both CIET facilities is to be used for validation of system codes used for FHR modeling, such as RELAP5-3D. A set of

  9. An integrated PCR colony hybridization approach to screen cDNA libraries for full-length coding sequences.

    PubMed

    Pollier, Jacob; González-Guzmán, Miguel; Ardiles-Diaz, Wilson; Geelen, Danny; Goossens, Alain

    2011-01-01

    cDNA-Amplified Fragment Length Polymorphism (cDNA-AFLP) is a commonly used technique for genome-wide expression analysis that does not require prior sequence knowledge. Typically, quantitative expression data and sequence information are obtained for a large number of differentially expressed gene tags. However, most of the gene tags do not correspond to full-length (FL) coding sequences, which is a prerequisite for subsequent functional analysis. A medium-throughput screening strategy, based on integration of polymerase chain reaction (PCR) and colony hybridization, was developed that allows in parallel screening of a cDNA library for FL clones corresponding to incomplete cDNAs. The method was applied to screen for the FL open reading frames of a selection of 163 cDNA-AFLP tags from three different medicinal plants, leading to the identification of 109 (67%) FL clones. Furthermore, the protocol allows for the use of multiple probes in a single hybridization event, thus significantly increasing the throughput when screening for rare transcripts. The presented strategy offers an efficient method for the conversion of incomplete expressed sequence tags (ESTs), such as cDNA-AFLP tags, to FL-coding sequences.

  10. A linear integral-equation-based computer code for self-amplified spontaneous emission calculations of free-electron lasers.

    SciTech Connect

    Dejus, R. J.; Shevchenko, O. A.; Vinokurov, A.

    1999-09-16

    The linear integral-equation-based computer code RON (Roger Oleg Nikolai), which was recently developed at Argonne National Laboratory, was used to calculate the self-amplified spontaneous emission (SASE) performance of the free-electron laser (FEL) being built at Argonne. Signal growth calculations under different conditions were used to estimate tolerances of actual design parameters and to estimate optimal length of the break sections between undulator segments. Explicit calculation of the radiation field was added recently and a typical angular distribution in the break section is shown. The measured magnetic fields of five undulators were used to calculate the gain for the Argonne FEL. The result indicates that the real undulators for the Argonne FEL (the effect of magnetic field errors alone) will not significantly degrade the FEL performance. The capability to calculate the small-signal gain for an FEL-oscillator is also demonstrated.

  11. Integration of the DRAGON5/DONJON5 codes in the SALOME platform for performing multi-physics calculations in nuclear engineering

    NASA Astrophysics Data System (ADS)

    Hébert, Alain

    2014-06-01

    We present the computer science techniques involved in the integration of the codes DRAGON5 and DONJON5 in the SALOME platform. This integration brings new capabilities in designing multi-physics computational schemes, with the possibility to couple our reactor physics codes with thermal-hydraulics or thermo-mechanics codes from other organizations. A demonstration is presented where two code components are coupled using the YACS module of SALOME, based on the CORBA protocol. The first component is a full-core 3D steady-state neutronic calculation in a PWR performed using DONJON5. The second component implements a set of 1D thermal-hydraulics calculations, each performed over a single assembly.
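
    Stripped of the YACS/CORBA plumbing, the coupling pattern is a fixed-point iteration in which the neutronics component supplies power and the thermal-hydraulics component returns fuel temperature until both converge. The Python sketch below uses toy single-value models with hypothetical feedback coefficients; it shows the pattern only, not the actual codes.

      # minimal fixed-point coupling loop between a "neutronics" and a
      # "thermal-hydraulics" component exchanging power and fuel temperature
      def neutronics(T_fuel):
          """Toy power response with Doppler feedback (hypothetical coefficients)."""
          return 3000.0 * (1.0 - 2.0e-5 * (T_fuel - 900.0))   # MW

      def thermal_hydraulics(power):
          """Toy fuel temperature as a linear function of core power."""
          return 560.0 + 0.12 * power                          # K

      power, T_fuel = 3000.0, 900.0
      for it in range(50):
          new_power = neutronics(T_fuel)
          T_fuel = thermal_hydraulics(new_power)
          if abs(new_power - power) < 1e-6:
              print(f"converged after {it + 1} iterations: "
                    f"power = {new_power:.2f} MW, T_fuel = {T_fuel:.1f} K")
              break
          power = new_power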

  12. Solar/hydro integration study. Technical progress report, February-July 1980. [STORMRK code

    SciTech Connect

    Not Available

    1980-01-01

    The Water and Power Resources Service in cooperation with the Department of Energy (DOE) is investigating the technical and economic feasibility of integrating solar central receiver powerplants with the Federal hydroelectric power system in the southwest United States. The principal hydro facility in this region is Hoover Dam. It is located on the Colorado River with Lake Mead on the upstream side and Lake Mohave on the downstream side. The central receiver was selected for this application because DOE has identified it as the most economically feasible design for large power systems, i.e., 100-MWe systems or larger. Typical meteorological year (TMY) data were obtained for Las Vegas from the Solar Energy Research Institute. Plots of available solar energy at Yuma and Mormon Mesa are presented for several operational threshold levels. The data show that a solar plant's operational time can be reduced by 20% and still utilize more than 97% of the available solar energy. The Mormon Mesa site has slightly more solar energy available than the Yuma site. A meteorological surface observation network (MESONET) weather station is being prepared for installation at the Yuma site. The MESONET station which normally measures temperature, relative humidity, barometric pressure, wind speed, and wind direction will be retrofitted to measure direct beam and global radiation. The radiation data will be used in dynamic simulations of solar power systems. (WHK)

  13. Uncertainty evaluation of nuclear reaction model parameters using integral and microscopic measurements. Covariances evaluation with CONRAD code

    NASA Astrophysics Data System (ADS)

    de Saint Jean, C.; Habert, B.; Archier, P.; Noguere, G.; Bernard, D.; Tommasi, J.; Blaise, P.

    2010-10-01

    In the [eV; MeV] energy range, modelling of neutron-induced reactions is based on nuclear reaction models with parameters. Estimation of covariances on cross sections or on nuclear reaction model parameters is a recurrent puzzle in nuclear data evaluation. Major breakthroughs have been requested by nuclear reactor physicists to assign proper uncertainties for use in applications. In this paper, the mathematical methods developed in the CONRAD code [2] are presented to explain the treatment of all types of uncertainties, including experimental ones (statistical and systematic), and their propagation to nuclear reaction model parameters or cross sections. The marginalization procedure is presented using analytical or Monte Carlo solutions. Furthermore, one major drawback identified by reactor physicists is that integral or analytical experiments (reactor mock-ups or simple integral experiments, e.g. ICSBEP, …) were not taken into account sufficiently early in the evaluation process to remove discrepancies. In this paper, we describe a mathematical framework to take this kind of information into account properly.

  14. Genomic integration of the full-length dystrophin coding sequence in Duchenne muscular dystrophy induced pluripotent stem cells.

    PubMed

    Farruggio, Alfonso P; Bhakta, Mital S; du Bois, Haley; Ma, Julia; P Calos, Michele

    2017-04-01

    Plasmid vectors that express the full-length human dystrophin coding sequence in human cells were developed. Dystrophin, the protein mutated in Duchenne muscular dystrophy, is extraordinarily large, providing challenges for cloning and plasmid production in Escherichia coli. The authors expressed dystrophin from the strong, widely expressed CAG promoter, along with co-transcribed luciferase and mCherry marker genes useful for tracking plasmid expression. Introns were added at the 3' and 5' ends of the dystrophin sequence to prevent translation in E. coli, resulting in improved plasmid yield. Stability and yield were further improved by employing a lower-copy-number plasmid origin of replication. The dystrophin plasmids also carried an attB site recognized by phage phiC31 integrase, enabling the plasmids to be integrated into the human genome at preferred locations by phiC31 integrase. The authors demonstrated single-copy integration of plasmid DNA into the genome and production of human dystrophin in the human 293 cell line, as well as in induced pluripotent stem cells derived from a patient with Duchenne muscular dystrophy. Plasmid-mediated dystrophin expression was also demonstrated in mouse muscle. The dystrophin expression plasmids described here will be useful in cell and gene therapy studies aimed at ameliorating Duchenne muscular dystrophy.

  15. Reactor Pressure Vessel Integrity Assessments with the Grizzly Aging Simulation Code

    SciTech Connect

    Spencer, Benjamin; Backman, Marie; Hoffman, William; Chakraborty, Pritam

    2015-08-01

    Grizzly is a simulation tool being developed at Idaho National Laboratory (INL) as part of the US Department of Energy’s Light Water Reactor Sustainability program to provide improved safety assessments of systems, components, and structures in nuclear power plants subjected to age-related degradation. Its goal is to provide an improved scientific basis for decisions surrounding license renewal, which would permit operation of commercial nuclear power plants beyond 60 years. Grizzly is based on INL’s MOOSE framework, which enables multiphysics simulations in a parallel computing environment. It will address a wide variety of aging issues in nuclear power plant systems, components, and structures, modelling both the aging processes and the ability of age-degraded components to perform safely. The reactor pressure vessel (RPV) was chosen as the initial application for Grizzly. Grizzly solves tightly coupled equations of heat conduction and solid mechanics to simulate the global response of the RPV to accident conditions, and uses submodels to represent regions with pre-existing flaws. Domain integrals are used to calculate stress intensity factors on those flaws. A physically based empirical model is used to evaluate material embrittlement, and is used to evaluate whether crack growth would occur. Grizzly can represent the RPV in 2D or 3D, allowing it to evaluate effects that require higher dimensionality models to capture. Work is underway to use lower length scale models of material evolution to inform engineering models of embrittlement. This paper demonstrates an application of Grizzly to RPV failure assessment, and summarizes on-going work.
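
    Grizzly couples heat conduction with solid mechanics and fracture assessment; as a sketch of just the thermal piece, the following Python toy advances 1-D explicit finite-difference conduction through a vessel-wall section during an imposed cooldown. Material values and boundary conditions are illustrative, not Grizzly inputs.

      import numpy as np

      # explicit finite-difference conduction through a vessel-wall section
      L, nx = 0.20, 41                      # wall thickness [m], grid nodes
      alpha = 1.1e-5                        # thermal diffusivity of steel [m^2/s]
      dx = L / (nx - 1)
      dt = 0.4 * dx**2 / alpha              # respects the explicit stability limit

      T = np.full(nx, 290.0)                # initial wall temperature [C]
      for step in range(int(600.0 / dt)):   # 10 minutes of a cooldown transient
          T[0] = 290.0 - 1.0 * (step * dt) / 60.0   # inner surface cools 1 C/min
          T[-1] = T[-2]                              # insulated outer surface
          T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

      print("inner/outer surface temps [C]:", T[0], T[-1])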

  16. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    SciTech Connect

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with general
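
    Quantification of a fault-tree top event from minimal cut sets, the core Level-1 PRA computation such tools perform, can be sketched as below; the basic events, probabilities, and cut sets are hypothetical, and SAPHIRE's actual algorithms are far more capable.

      # basic-event failure probabilities (hypothetical values)
      p = {"PUMP_A": 1e-3, "PUMP_B": 1e-3, "VALVE_C": 5e-4, "DG_FAIL": 2e-2}

      # minimal cut sets of a toy top event (hypothetical fault tree)
      cut_sets = [{"PUMP_A", "PUMP_B"}, {"VALVE_C"}, {"PUMP_A", "DG_FAIL"}]

      def cut_set_prob(cs):
          prob = 1.0
          for ev in cs:
              prob *= p[ev]
          return prob

      # rare-event approximation and the tighter min-cut upper bound
      rare = sum(cut_set_prob(cs) for cs in cut_sets)
      mcub = 1.0
      for cs in cut_sets:
          mcub *= 1.0 - cut_set_prob(cs)
      mcub = 1.0 - mcub
      print(f"rare-event approx: {rare:.3e}, min-cut upper bound: {mcub:.3e}")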

  17. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    SciTech Connect

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with general

  18. A development and integration of database code-system with a compilation of comparator, k0 and absolute methods for INAA using microsoft access

    NASA Astrophysics Data System (ADS)

    Hoh, Siew Sin; Rapie, Nurul Nadiah; Lim, Edwin Suh Wen; Tan, Chun Yuan; Yavar, Alireza; Sarmani, Sukiman; Majid, Amran Ab.; Khoo, Kok Siong

    2013-05-01

    Instrumental Neutron Activation Analysis (INAA) is often used to determine and calculate the elemental concentrations of a sample at The National University of Malaysia (UKM), typically in the Nuclear Science Programme, Faculty of Science and Technology. The objective of this study was to develop a database code-system based on Microsoft Access 2010 which could help INAA users choose either the comparator method, the k0 method or the absolute method for calculating the elemental concentrations of a sample. This study also integrated k0data, Com-INAA, k0Concent, k0-Westcott and Abs-INAA to execute and complete the ECC-UKM database code-system. After the integration, a study was conducted to test the effectiveness of the ECC-UKM database code-system by comparing the concentrations between the experiments and the code-systems. 'Triple bare monitor' Zr-Au and Cr-Mo-Au were used in the k0Concent, k0-Westcott and Abs-INAA code-systems as monitors to determine the thermal-to-epithermal neutron flux ratio (f). Calculations involved in determining the concentration were the net peak area (Np), measurement time (tm), irradiation time (tirr), k-factor (k), thermal-to-epithermal neutron flux ratio (f), the parameter of the epithermal neutron flux distribution (α) and the detection efficiency (ɛp). For the Com-INAA code-system, the certified reference material IAEA-375 Soil was used to calculate the concentrations of elements in a sample. Other CRMs and SRMs were also used in this database code-system. Later, a verification process to examine the effectiveness of the Abs-INAA code-system was carried out by comparing sample concentrations between the code-system and the experiment. The concentration values obtained from the ECC-UKM database code-system agreed with the experimental values with good accuracy.
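
    Of the three methods, the comparator (relative) method is the simplest to sketch: with sample and standard irradiated and counted under identical conditions, the concentration follows from the ratio of decay-corrected specific count rates. The Python toy below makes that simplifying assumption; the function, numbers and nuclide values are hypothetical, and the full k0 and absolute methods involve the additional parameters (f, α, ɛp) listed above.

      import math

      def comparator_concentration(Np_sam, m_sam, td_sam,
                                   Np_std, m_std, td_std,
                                   c_std, half_life):
          """Relative (comparator) INAA: sample and standard irradiated and counted
          under identical conditions, corrected only for decay between the end of
          irradiation and counting. A deliberate simplification of the full method."""
          lam = math.log(2.0) / half_life
          a_sam = Np_sam / (m_sam * math.exp(-lam * td_sam))   # decay-corrected rate
          a_std = Np_std / (m_std * math.exp(-lam * td_std))
          return c_std * a_sam / a_std

      # 198Au comparator (half-life ~2.695 d), decay times in days, c_std in mg/kg
      print(comparator_concentration(15400, 0.100, 1.0,
                                     80500, 0.050, 1.2,
                                     10.0, 2.695))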

  19. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…
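
    Producing the codes themselves is straightforward; for instance, with the third-party Python qrcode package (the URL below is a hypothetical example of a location-specific learning resource):

      # pip install qrcode[pil]
      import qrcode

      # encode a location-specific learning-material URL (hypothetical example)
      img = qrcode.make("https://example.org/outdoor-lesson/station-3")
      img.save("station3_qr.png")
      print("QR code written to station3_qr.png")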

  1. Automated model integration at source code level: An approach for implementing models into the NASA Land Information System

    NASA Astrophysics Data System (ADS)

    Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.

    2014-12-01

    Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment if they were not designed for it. An example is implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise, and it may take a developer months to learn LIS and the model software structure. Debugging and testing of the model implementation are also time-consuming when LIS or the model is not fully understood. This time spent is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. With this in mind, a general model interface was designed to retrieve the forcing inputs, parameters, and state variables needed by the model and to provide state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development efforts need only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models, and code templates defined for this general model interface can be re-used with any specific model. Therefore, the model implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It allows model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80-90% of the development load is reduced. In this presentation, the automated model implementation approach is described along with LIS programming
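
    The general model interface described above can be pictured as a small contract that every wrapped model satisfies: the framework pushes forcings and parameters in, advances the model, and pulls states out. A Python sketch of such a contract, with a trivial bucket model standing in for a land-surface model (all interface names are hypothetical; the real interface is a FORTRAN 90 subroutine):

      from abc import ABC, abstractmethod

      class ModelWrapper(ABC):
          """Hypothetical general model interface in the spirit described above:
          every wrapped model is driven through the same four calls."""

          @abstractmethod
          def set_forcing(self, name: str, value: float) -> None: ...

          @abstractmethod
          def set_parameter(self, name: str, value: float) -> None: ...

          @abstractmethod
          def advance(self, dt_seconds: float) -> None: ...

          @abstractmethod
          def get_state(self, name: str) -> float: ...

      class BucketModel(ModelWrapper):
          """Trivial land-surface stand-in: a single soil-moisture bucket."""
          def __init__(self):
              self.forcing, self.params = {}, {"drain_rate": 1e-6}
              self.storage = 0.2
          def set_forcing(self, name, value): self.forcing[name] = value
          def set_parameter(self, name, value): self.params[name] = value
          def advance(self, dt):
              rain = self.forcing.get("precip", 0.0)
              self.storage += dt * (rain - self.params["drain_rate"] * self.storage)
          def get_state(self, name): return self.storage

      m = BucketModel()
      m.set_forcing("precip", 1e-7)          # hypothetical units, kg m-2 s-1
      m.advance(3600.0)                      # one hour
      print(m.get_state("soil_moisture"))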

  2. Challenge problem and milestones for : Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC).

    SciTech Connect

    Freeze, Geoffrey A.; Wang, Yifeng; Howard, Robert; McNeish, Jerry A.; Schultz, Peter Andrew; Arguello, Jose Guadalupe, Jr.

    2010-09-01

    This report describes the specification of a challenge problem and associated challenge milestones for the Waste Integrated Performance and Safety Codes (IPSC) supporting the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The NEAMS challenge problems are designed to demonstrate proof of concept and progress towards IPSC goals. The goal of the Waste IPSC is to develop an integrated suite of modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. To demonstrate proof of concept and progress towards these goals and requirements, a Waste IPSC challenge problem is specified that includes coupled thermal-hydrologic-chemical-mechanical (THCM) processes that describe (1) the degradation of a borosilicate glass waste form and the corresponding mobilization of radionuclides (i.e., the processes that produce the radionuclide source term), (2) the associated near-field physical and chemical environment for waste emplacement within a salt formation, and (3) radionuclide transport in the near field (i.e., through the engineered components - waste form, waste package, and backfill - and the immediately adjacent salt). The initial details of a set of challenge milestones that collectively comprise the full challenge problem are also specified.

  3. Mercury + VisIt: Integration of a Real-Time Graphical Analysis Capability into a Monte Carlo Transport Code

    SciTech Connect

    O'Brien, M J; Procassini, R J; Joy, K I

    2009-03-09

    Validation of the problem definition and analysis of the results (tallies) produced during a Monte Carlo particle transport calculation can be complicated, time-intensive processes. The time required for a person to create an accurate, validated combinatorial geometry (CG) or mesh-based representation of a complex problem, free of common errors such as gaps and overlapping cells, can range from days to weeks. The ability to interrogate the internal structure of a complex, three-dimensional (3-D) geometry, prior to running the transport calculation, can improve the user's confidence in the validity of the problem definition. With regard to the analysis of results, the process of extracting tally data from printed tables within a file is laborious and not an intuitive approach to understanding the results. The ability to display tally information overlaid on top of the problem geometry can decrease the time required for analysis and increase the user's understanding of the results. To this end, our team has integrated VisIt, a parallel, production-quality visualization and data analysis tool, into Mercury, a massively parallel Monte Carlo particle transport code. VisIt provides an API for real-time visualization of a simulation as it is running. The user may select which plots to display from the VisIt GUI, or by sending VisIt a Python script from Mercury. The frequency at which plots are updated can be set, and the user can visualize the simulation results as the code is running.

  4. ASKI: A modular toolbox for scattering-integral-based seismic full waveform inversion and sensitivity analysis utilizing external forward codes

    NASA Astrophysics Data System (ADS)

    Schumacher, Florian; Friederich, Wolfgang

    Due to increasing computational resources, the development of new numerically demanding methods and software for imaging Earth's interior remains of high interest in Earth sciences. Here, we give a description, from a user's and programmer's perspective, of the highly modular, flexible and extendable software package ASKI (Analysis of Sensitivity and Kernel Inversion), recently developed for iterative scattering-integral-based seismic full waveform inversion. In ASKI, the three fundamental steps of solving the seismic forward problem, computing waveform sensitivity kernels and deriving a model update are solved by independent software programs that interact via file output/input only. Furthermore, the spatial discretizations of the model space used for solving the seismic forward problem and for deriving model updates, respectively, are kept completely independent. For this reason, ASKI does not contain a specific forward solver but instead provides a general interface to established community wave propagation codes. Moreover, the third fundamental step of deriving a model update can be repeated at relatively low cost, applying different kinds of model regularization or re-selecting/weighting the inverted dataset, without the need to re-solve the forward problem or re-compute the kernels. Additionally, ASKI offers the user sensitivity and resolution analysis tools based on the full sensitivity matrix and allows the composition of customized workflows in a consistent computational environment. ASKI is written in modern Fortran and Python, it is well documented and freely available under the terms of the GNU General Public License (http://www.rub.de/aski).
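
    The cheap-re-update property described above, that new regularizations can be tried without re-running the forward solver or the kernel computation, follows because the sensitivity matrix is fixed once computed. A damped least-squares Python sketch with random stand-in kernels:

      import numpy as np

      rng = np.random.default_rng(0)
      K = rng.standard_normal((200, 50))      # precomputed sensitivity kernels
      d = rng.standard_normal(200)            # data residuals (observed - synthetic)

      def model_update(K, d, lam):
          """Damped least-squares step dm = (K^T K + lam I)^-1 K^T d.
          Because K is fixed, trying a new regularization only costs this solve,
          not a new forward simulation or kernel computation."""
          n = K.shape[1]
          return np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ d)

      for lam in (0.1, 1.0, 10.0):            # re-run the inversion step cheaply
          dm = model_update(K, d, lam)
          print(f"lambda = {lam}: update norm = {np.linalg.norm(dm):.3f}")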

  5. Integral Thermal-Hydraulics Tests for the Safety Evaluation of VVER-440/213 Nuclear Reactors and Safety Code Validation

    SciTech Connect

    Szabados, Laszlo

    2004-01-15

    The Paks nuclear power plant is equipped with pressurized water reactors of the VVER-440/213 type. These plants have a number of special features, namely, a six-loop primary circuit, horizontal steam generators, loop seals in both the hot and cold legs, a safety injection tank setpoint pressure higher than the secondary pressure, etc. As a consequence of these special design solutions, the transient behavior of such a reactor system differs from that of the usual pressurized water reactor system. To study the transient behavior of these plants, the PMK-2 integral-type facility, a thermal-hydraulic model of the Paks nuclear power plant, was designed and constructed. A short description of the specific design solutions of the VVER-440/213-type plants is given, with the modeling aspects and similarity criteria applied to the design of the PMK-2 facility. Since the startup of the facility in 1985, 48 experiments have been performed, primarily in an international framework with the participation of several experts from European and overseas countries, to study one- and two-phase natural circulation, loss-of-coolant accidents and special plant transients, and to perform experiments in support of accident management measures. The results of several experiments illustrate the system effects of the special design solutions and the effectiveness of bleed-and-feed accident management measures. A brief commentary on the thermal-hydraulic system code validation is provided, and conclusions are offered.

  6. Cross-Beam Energy Transfer (CBET) Effect with Additional Ion Heating Integrated into the 2-D Hydrodynamics Code DRACO

    NASA Astrophysics Data System (ADS)

    Marozas, J. A.; Collins, T. J. B.

    2012-10-01

    The cross-beam energy transfer (CBET) effect causes pump and probe beams to exchange energy via stimulated Brillouin scattering [W. L. Kruer, The Physics of Laser-Plasma Interactions, Frontiers in Physics, Vol. 73, edited by D. Pines (Addison-Wesley, Redwood City, CA, 1988), p. 45]. The total energy gained does not, in general, equal the total energy lost; the ion-acoustic wave carries the residual energy balance, which can decay, resulting in ion heating [E. A. Williams et al., Phys. Plasmas 11, 231 (2004)]. This additional ion heating can retune the conditions for CBET, affecting the overall energy transfer as a function of time. CBET and the additional ion heating are incorporated into the 2-D hydrodynamics code DRACO [P. B. Radha et al., Phys. Plasmas 12, 056307 (2005)] as an integral part of the 3-D ray trace, where CBET is treated self-consistently with the hydrodynamic evolution. DRACO simulation results employing CBET will be discussed. This work was supported by the U.S. Department of Energy Office of Inertial Confinement Fusion under Cooperative Agreement No. DE-FC52-08NA28302.
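
    The photon bookkeeping behind the residual energy balance is simple to sketch: each scattered photon deposits h-bar times the probe frequency into the probe while removing h-bar times the pump frequency from the pump, and the frequency difference is left in the ion-acoustic wave. The numbers below are purely illustrative, not DRACO inputs:

        # Toy CBET energy bookkeeping (illustrative values, not DRACO's algorithm).
        hbar = 1.054571817e-34                 # J s
        w_pump, w_probe = 3.55e15, 3.54e15     # rad/s; pump slightly blue-shifted
        n_photons = 1.0e18                     # photons scattered pump -> probe

        dE_pump = -n_photons * hbar * w_pump   # energy lost by the pump
        dE_probe = n_photons * hbar * w_probe  # energy gained by the probe
        dE_iaw = -(dE_pump + dE_probe)         # residual -> ion-acoustic wave
        print(f"share driving ion heating: {dE_iaw / -dE_pump:.2%}")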

  7. ITS version 5.0 : the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.

    SciTech Connect

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2004-06-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking, providing experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend their capabilities to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, and (3) parallel implementations of all ITS codes. Moreover, the general user-friendliness of the software has been enhanced through increased internal error checking and improved code portability.
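
    An order-independent keyword input scheme with defaults and internal error checking can be illustrated in a few lines; the keywords below are invented for the sketch and are not actual ITS input keywords:

        # Order-independent keyword deck with defaults (keywords are made up).
        DEFAULTS = {"histories": 10000, "energy-cutoff": 1.0e-3, "geometry": "slab"}

        def parse_deck(lines):
            deck = dict(DEFAULTS)                    # defaults fill missing keys
            for raw in lines:
                line = raw.split("!")[0].strip()     # strip comments and blanks
                if not line:
                    continue
                key, value = line.split(None, 1)
                key = key.lower()
                if key not in DEFAULTS:              # internal error checking
                    raise ValueError(f"unknown keyword: {key}")
                deck[key] = type(DEFAULTS[key])(value.split()[0])
            return deck

        print(parse_deck(["HISTORIES 500000", "geometry cylinder ! axisymmetric"]))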

  8. A 2×2 imaging MIMO system based on LED Visible Light Communications employing space balanced coding and integrated PIN array reception

    NASA Astrophysics Data System (ADS)

    Li, Jiehui; Xu, Yinfan; Shi, Jianyang; Wang, Yuanquan; Ji, Xinming; Ou, Haiyan; Chi, Nan

    2016-05-01

    In this paper, we propose a 2×2 imaging Multi-Input Multi-Output (MIMO) Visible Light Communication (VLC) system employing Space Balanced Coding (SBC) based on two RGB LEDs and integrated PIN array reception. We experimentally demonstrated 1.4-Gbit/s VLC transmission at a distance of 2.5 m. The proposed imaging system not only overcomes the bandwidth limitation of the LEDs but also rejects second-order nonlinearity distortion. These results suggest that integrated receiving antennas are very promising for future VLC systems.
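
    The second-order rejection works because the two LEDs carry x and -x: even-order distortion terms are identical on both branches and drop out of the difference. A toy numpy model, assuming identical LED nonlinearities and unit channel gains (both assumptions for illustration):

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.uniform(-1, 1, 8)               # baseband samples
        led = lambda u, a=0.2: u + a * u**2     # toy LED with 2nd-order distortion

        y1, y2 = led(x), led(-x)                # the two space-balanced branches
        recovered = (y1 - y2) / 2               # balanced combination
        print(np.max(np.abs(recovered - x)))    # 0.0: 2nd-order term cancelled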

  9. Design and Simulation of Material-Integrated Distributed Sensor Processing with a Code-Based Agent Platform and Mobile Multi-Agent Systems

    PubMed Central

    Bosse, Stefan

    2015-01-01

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is a novel approach. The program code can be modified by the agent itself using code-morphing techniques and is capable of migrating between nodes in the network. The program code is a self-contained unit (a container) that embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to small agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri nets. The agent platform can also be implemented in software, offering compatibility at the operational and code levels and supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550
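
    A miniature of the ATG idea (data state and behavior bundled in one migratable unit, with each activity returning the next activity) might look as follows; the class names, trigger rule and stub node are invented for illustration:

        import random

        class Node:                                  # stub sensor node
            def read_sensor(self): return random.random()
            def neighbor(self): return self
            def migrate(self, agent, dest): print("agent code + state migrate")

        class Agent:
            def __init__(self):
                self.data = {"samples": []}          # data state travels with code
                self.activity = "sense"              # current ATG activity

            def sense(self, node):
                self.data["samples"].append(node.read_sensor())
                return "decide"                      # ATG transition

            def decide(self, node):
                if max(self.data["samples"]) > 0.9:  # simple pattern trigger
                    node.migrate(self, node.neighbor())
                return "sense"

            def step(self, node):
                self.activity = getattr(self, self.activity)(node)

        random.seed(1)
        agent, node = Agent(), Node()
        for _ in range(10):
            agent.step(node)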

  10. Design and simulation of material-integrated distributed sensor processing with a code-based agent platform and mobile multi-agent systems.

    PubMed

    Bosse, Stefan

    2015-02-16

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is a novel approach. The program code can be modified by the agent itself using code-morphing techniques and is capable of migrating between nodes in the network. The program code is a self-contained unit (a container) that embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to small agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri nets. The agent platform can also be implemented in software, offering compatibility at the operational and code levels and supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques.

  11. Self-Powered Forward Error-Correcting Biosensor Based on Integration of Paper-Based Microfluidics and Self-Assembled Quick Response Codes.

    PubMed

    Yuan, Mingquan; Liu, Keng-Ku; Singamaneni, Srikanth; Chakrabartty, Shantanu

    2016-10-01

    This paper extends our previous work on silver-enhancement-based self-assembling structures for designing reliable, self-powered biosensors with forward error correcting (FEC) capability. At the core of the proposed approach is the integration of paper-based microfluidics with quick response (QR) codes that can be optically scanned using a smartphone. The scanned information is first decoded to obtain the location of a web server, which further processes the self-assembled QR image to determine the concentration of target analytes. The integration substrate for the proposed FEC biosensor is polyethylene, and the patterning of the QR code on the substrate has been achieved using a combination of low-cost ink-jet printing and a regular ballpoint dispensing pen. A paper-based microfluidic channel has been integrated underneath the substrate for acquiring, mixing and flowing the sample to areas on the substrate where different parts of the code can self-assemble in the presence of immobilized gold nanorods. In this paper we demonstrate proof-of-concept detection using prototypes of QR-encoded FEC biosensors.
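
    The value of FEC here is that a defective self-assembled module need not corrupt the read-out. As a stand-in for the QR code's actual error correction (which is Reed-Solomon based), a Hamming(7,4) toy shows the principle of recovering from one bad module:

        # Hamming(7,4) demo: one corrupted module is corrected on decode.
        def encode(d):                                   # d: four data bits
            p1, p2, p3 = d[0]^d[1]^d[3], d[0]^d[2]^d[3], d[1]^d[2]^d[3]
            return [p1, p2, d[0], p3, d[1], d[2], d[3]]  # positions 1..7

        def decode(c):
            s = ((c[0]^c[2]^c[4]^c[6]) * 1 + (c[1]^c[2]^c[5]^c[6]) * 2
                 + (c[3]^c[4]^c[5]^c[6]) * 4)            # syndrome = error position
            if s:
                c[s - 1] ^= 1                            # flip the bad module
            return [c[2], c[4], c[5], c[6]]

        word = encode([1, 0, 1, 1])
        word[4] ^= 1                                     # one self-assembly defect
        assert decode(word) == [1, 0, 1, 1]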

  12. Self-Powered Forward Error-Correcting Biosensor Based on Integration of Paper-Based Microfluidics and Self-Assembled Quick Response Codes.

    PubMed

    Yuan, Mingquan; Liu, Keng-Ku; Singamaneni, Srikanth; Chakrabartty, Shantanu

    2016-09-06

    This paper extends our previous work on silver-enhancement-based self-assembling structures for designing reliable, self-powered biosensors with forward error correcting (FEC) capability. At the core of the proposed approach is the integration of paper-based microfluidics with quick response (QR) codes that can be optically scanned using a smartphone. The scanned information is first decoded to obtain the location of a web server, which further processes the self-assembled QR image to determine the concentration of target analytes. The integration substrate for the proposed FEC biosensor is polyethylene, and the patterning of the QR code on the substrate has been achieved using a combination of low-cost ink-jet printing and a regular ballpoint dispensing pen. A paper-based microfluidic channel has been integrated underneath the substrate for acquiring, mixing and flowing the sample to areas on the substrate where different parts of the code can self-assemble in the presence of immobilized gold nanorods. In this paper we demonstrate proof-of-concept detection using prototypes of QR-encoded FEC biosensors.

  13. Abiding by codes of ethics and codes of conduct imposed on members of learned and professional geoscience institutions - a tiresome formality or a win-win for scientific and professional integrity and protection of the public?

    NASA Astrophysics Data System (ADS)

    Allington, Ruth; Fernandez, Isabel

    2015-04-01

    In 2012, the International Union of Geological Sciences (IUGS) formed the Task Group on Global Geoscience Professionalism ("TG-GGP") to bring together the expanding network of organizations around the world whose primary purpose is self-regulation of geoscience practice. An important part of TG-GGP's mission is to foster a shared understanding of the aspects of professionalism relevant to individual scientists and applied practitioners working in one or more sectors of the wider geoscience profession (e.g. research, teaching, industry, geoscience communication and government service). These may be summarised as competence, ethical practice, and professional, technical and scientific accountability. Legal regimes for the oversight of registered or licensed professionals differ around the world, and in many jurisdictions there is no registration or licensure with the force of law. However, principles of peer-based self-regulation universally apply. This makes professional geoscience organisations ideal settings within which geoscientists can debate and agree on what society should expect of us in the range of roles we fulfil. They can provide the structures needed to determine what expectations, in the public interest, are appropriate for us collectively to impose on each other. They can also provide the structures for developing the procedures necessary to identify and discipline those who do not live up to the standards of behaviour established by consensus between peers. Codes of Ethics (sometimes referred to as Codes of Conduct), to which all members of all major professional and/or scientific geoscience organizations are bound (whether or not they are registered or hold professional qualifications awarded by those organisations), incorporate such traditional tenets as: safeguarding the health and safety of the public, scientific integrity, and fairness. Codes also increasingly include obligations concerning welfare of the environment and

  14. IN-MACA-MCC: Integrated Multiple Attractor Cellular Automata with Modified Clonal Classifier for Human Protein Coding and Promoter Prediction.

    PubMed

    Pokkuluri, Kiran Sree; Inampudi, Ramesh Babu; Nedunuri, S S S N Usha Devi

    2014-01-01

    Protein coding and promoter region predictions are very important challenges in bioinformatics (Attwood and Teresa, 2000). The identification of these regions plays a crucial role in understanding genes. Many novel computational and mathematical methods have been introduced, and existing methods are being refined, for predicting the two regions separately; still, there is scope for improvement. We propose a classifier built with MACA (multiple attractor cellular automata) and MCC (modified clonal classifier) to predict both regions with a single classifier. The proposed classifier is trained and tested with the Fickett and Tung (1992) datasets for protein coding region prediction for DNA sequences of lengths 54, 108, and 162, and with MMCRI datasets for DNA sequences of lengths 252 and 354. The classifier is trained and tested with promoter sequences from the DBTSS (Yamashita et al., 2006) dataset and non-promoters from the EID (Saxonov et al., 2000) and UTRdb (Pesole et al., 2002) datasets. The proposed model can predict both regions with an average accuracy of 90.5% for promoter and 89.6% for protein coding region predictions. The specificity and sensitivity values of promoter and protein coding region predictions are 0.89 and 0.92, respectively.
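
    For reference, the reported figures follow the standard confusion-matrix definitions; the counts below are invented so that the rates reproduce the quoted 0.92/0.89/90.5%:

        # Confusion-matrix arithmetic behind the quoted rates (counts invented).
        tp, fn, tn, fp = 92, 8, 89, 11
        sensitivity = tp / (tp + fn)                 # 0.92
        specificity = tn / (tn + fp)                 # 0.89
        accuracy = (tp + tn) / (tp + fn + tn + fp)   # 0.905
        print(sensitivity, specificity, accuracy)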

  15. Model-Based Least Squares Reconstruction of Coded Source Neutron Radiographs: Integrating the ORNL HFIR CG1D Source Model

    SciTech Connect

    Santos-Villalobos, Hector J; Gregor, Jens; Bingham, Philip R

    2014-01-01

    At present, neutron sources cannot be fabricated small and powerful enough to achieve high-resolution radiography while maintaining an adequate flux. One solution is to employ computational imaging techniques such as a magnified Coded Source Imaging (CSI) system. A coded mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes, and the flux is increased by increasing the size of the coded mask and/or the number of holes. One limitation of such a system is that the resolution of current state-of-the-art scintillator-based detectors caps at around 50 µm. To overcome this challenge, the coded mask and object are magnified by making the distance from the coded mask to the object much smaller than the distance from the object to the detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of its modeling of the CSI system components. However, those validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model into our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.
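
    A 1-D toy of the model-based least squares idea: build the system matrix from a (made-up) mask pattern, simulate noisy detector data, and invert. This sketches the general approach, not the authors' reconstruction code:

        import numpy as np

        rng = np.random.default_rng(1)
        mask = np.array([1, 0, 1, 1, 0, 1, 0, 0, 1])   # toy coded-mask pattern
        n = 32
        x = np.zeros(n); x[12:16] = 1.0                # 1-D object
        A = np.zeros((n + mask.size - 1, n))
        for i in range(n):                             # shift-and-add system model
            A[i:i + mask.size, i] = mask
        b = A @ x + 0.01 * rng.standard_normal(A.shape[0])   # noisy detector data
        x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)  # least squares inversion
        print(np.round(x_hat[10:18], 2))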

  16. A framework for identifying genotypic information from clinical records: exploiting integrated ontology structures to transfer annotations between ICD codes and Gene Ontologies.

    PubMed

    Hashemikhabir, Seyedsasan; Xia, Ran; Xiang, Yang; Janga, Sarath

    2015-09-18

    Although some methods have been proposed for automatic ontology generation, none of them address the issue of integrating large-scale heterogeneous biomedical ontologies. We propose a novel approach for integrating various types of ontologies efficiently and apply it to integrate the International Classification of Diseases, Ninth Revision, Clinical Modification (ICD9CM) and Gene Ontologies (GO). This approach is one of the early attempts to quantify the associations among clinical terms (e.g. ICD9 codes) based on their corresponding genomic relationships. We reconstructed a merged tree for a partial set of GO and ICD9 codes and measured the performance of this tree in terms of the relevance of its associations by comparing them with two well-known disease-gene datasets (i.e. MalaCards and Disease Ontology). Furthermore, we compared the genomics-based ICD9 associations to temporal relationships between the codes derived from electronic health records. Our analysis shows promising associations supported by both comparisons, suggesting high reliability.
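
    One simple way to quantify an association between clinical terms through their genomic relationships is set overlap between the GO annotations of the genes linked to each ICD9 code; the toy annotations and Jaccard score below are illustrative only, not the paper's actual measure:

        # Toy genomic association between two ICD9 codes via shared GO terms.
        go = {"250.0": {"GO:0006006", "GO:0005975", "GO:0046323"},
              "278.0": {"GO:0006006", "GO:0005975", "GO:0006629"}}

        def genomic_association(a, b):
            return len(go[a] & go[b]) / len(go[a] | go[b])   # Jaccard overlap

        print(genomic_association("250.0", "278.0"))         # 0.5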

  17. Genome-wide conserved non-coding microsatellite (CNMS) marker-based integrative genetical genomics for quantitative dissection of seed weight in chickpea

    PubMed Central

    Bajaj, Deepak; Saxena, Maneesha S.; Kujur, Alice; Das, Shouvik; Badoni, Saurabh; Tripathi, Shailesh; Upadhyaya, Hari D.; Gowda, C. L. L.; Sharma, Shivali; Singh, Sube; Tyagi, Akhilesh K.; Parida, Swarup K.

    2015-01-01

    Phylogenetic footprinting identified 666 genome-wide paralogous and orthologous CNMS (conserved non-coding microsatellite) markers from 5′-untranslated and regulatory regions (URRs) of 603 protein-coding chickpea genes. The (CT)n and (GA)n CNMS carrying CTRMCAMV35S and GAGA8BKN3 regulatory elements, respectively, are abundant in the chickpea genome. The mapped genic CNMS markers with robust amplification efficiencies (94.7%) detected higher intraspecific polymorphic potential (37.6%) among genotypes, implying their immense utility in chickpea breeding and genetic analyses. Seventeen differentially expressed CNMS marker-associated genes showing strong preferential and seed tissue/developmental stage-specific expression in contrasting genotypes were selected to narrow down the gene targets underlying seed weight quantitative trait loci (QTLs)/eQTLs (expression QTLs) through integrative genetical genomics. The integration of transcript profiling with seed weight QTL/eQTL mapping, molecular haplotyping, and association analyses identified potential molecular tags (GAGA8BKN3 and RAV1AAT regulatory elements and alleles/haplotypes) in the LOB-domain-containing protein- and KANADI protein-encoding transcription factor genes controlling the cis-regulated expression for seed weight in the chickpea. This emphasizes the potential of CNMS marker-based integrative genetical genomics for the quantitative genetic dissection of complex seed weight in chickpea. PMID:25504138

  18. Genome-wide conserved non-coding microsatellite (CNMS) marker-based integrative genetical genomics for quantitative dissection of seed weight in chickpea.

    PubMed

    Bajaj, Deepak; Saxena, Maneesha S; Kujur, Alice; Das, Shouvik; Badoni, Saurabh; Tripathi, Shailesh; Upadhyaya, Hari D; Gowda, C L L; Sharma, Shivali; Singh, Sube; Tyagi, Akhilesh K; Parida, Swarup K

    2015-03-01

    Phylogenetic footprinting identified 666 genome-wide paralogous and orthologous CNMS (conserved non-coding microsatellite) markers from 5'-untranslated and regulatory regions (URRs) of 603 protein-coding chickpea genes. The (CT)n and (GA)n CNMS carrying CTRMCAMV35S and GAGA8BKN3 regulatory elements, respectively, are abundant in the chickpea genome. The mapped genic CNMS markers with robust amplification efficiencies (94.7%) detected higher intraspecific polymorphic potential (37.6%) among genotypes, implying their immense utility in chickpea breeding and genetic analyses. Seventeen differentially expressed CNMS marker-associated genes showing strong preferential and seed tissue/developmental stage-specific expression in contrasting genotypes were selected to narrow down the gene targets underlying seed weight quantitative trait loci (QTLs)/eQTLs (expression QTLs) through integrative genetical genomics. The integration of transcript profiling with seed weight QTL/eQTL mapping, molecular haplotyping, and association analyses identified potential molecular tags (GAGA8BKN3 and RAV1AAT regulatory elements and alleles/haplotypes) in the LOB-domain-containing protein- and KANADI protein-encoding transcription factor genes controlling the cis-regulated expression for seed weight in the chickpea. This emphasizes the potential of CNMS marker-based integrative genetical genomics for the quantitative genetic dissection of complex seed weight in chickpea.

  19. iTOUGH2-IFC: An Integrated Flow Code in Support of Nagra's Probabilistic Safety Assessment:--User's Guide and Model Description

    SciTech Connect

    Finsterle, Stefan A.

    2009-01-02

    This document describes the development and use of the Integrated Flow Code (IFC), a numerical code and related model to be used for the simulation of time-dependent, two-phase flow in the near field and geosphere of a gas-generating nuclear waste repository system located in an initially fully water-saturated claystone (Opalinus Clay) in Switzerland. The development of the code and model was supported by the Swiss National Cooperative for the Disposal of Radioactive Waste (Nagra), Wettingen, Switzerland. Gas generation (mainly H{sub 2}, but also CH{sub 4} and CO{sub 2}) may affect repository performance by (1) compromising the engineered barriers through excessive pressure build-up, (2) displacing potentially contaminated pore water, (3) releasing radioactive gases (e.g., those containing {sup 14}C and {sup 3}H), (4) changing hydrogeologic properties of the engineered barrier system and the host rock, and (5) altering the groundwater flow field and thus radionuclide migration paths. The IFC aims at providing water and gas flow fields as the basis for the subsequent radionuclide transport simulations, which are performed by the radionuclide transport code (RTC). The IFC, RTC and a waste-dissolution and near-field transport model (STMAN) are part of the Integrated Radionuclide Release Code (IRRC), which integrates all safety-relevant features, events, and processes (FEPs). The IRRC is embedded into a Probabilistic Safety Assessment (PSA) computational tool that (1) evaluates alternative conceptual models, scenarios, and disruptive events, and (2) performs Monte-Carlo sampling to account for parametric uncertainties. The preliminary probabilistic safety assessment concept and the role of the IFC are visualized in Figure 1. The IFC was developed based on Nagra's PSA concept. Specifically, as many phenomena as possible are to be directly simulated using a (simplified) process model, which is at the core of the IRRC model. Uncertainty evaluation (scenario uncertainty
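
    The outer Monte-Carlo-sampling loop of such a PSA tool can be sketched with a stub flow model; the distributions and the model itself are invented for illustration:

        import random

        def flow_model(permeability, gas_rate):
            return gas_rate / permeability          # stand-in pressure metric

        random.seed(0)
        peaks = []
        for _ in range(1000):                       # parametric uncertainty
            k = 10 ** random.uniform(-21, -19)      # log-uniform permeability, m^2
            q = random.gauss(1.0, 0.2)              # scaled gas-generation rate
            peaks.append(flow_model(k, q))
        peaks.sort()
        print("95th-percentile peak:", peaks[int(0.95 * len(peaks))])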

  20. Clinical coding. Code breakers.

    PubMed

    Mathieson, Steve

    2005-02-24

    - The advent of payment by results has seen the role of the clinical coder pushed to the fore in England.
    - Examinations for a clinical coding qualification began in 1999. In 2004, approximately 200 people took the qualification.
    - Trusts are attracting people to the role by offering training from scratch or through modern apprenticeships.

  1. Summary Report for ASC L2 Milestone #4782: Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes

    SciTech Connect

    Neely, J. R.; Hornung, R.; Black, A.; Robinson, P.

    2014-09-29

    This document serves as a detailed companion to the PowerPoint slides presented as part of the ASC L2 milestone review for Integrated Codes milestone #4782, titled “Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes”, due on 9/30/2014 and presented for formal program review on 9/12/2014. The program review committee is represented by Mike Zika (A Program Project Lead for Kull), Brian Pudliner (B Program Project Lead for Ares), Scott Futral (DEG Group Lead in LC), and Mike Glass (Sierra Project Lead at Sandia). This document, along with the presentation materials and a letter of completion signed by the review committee, will act as proof of completion for this milestone.

  2. Evaluation of the heat transfer module (FAHT) of Failure Analysis Nonlinear Thermal And Structural Integrated Code (FANTASTIC)

    NASA Technical Reports Server (NTRS)

    Keyhani, Majid

    1989-01-01

    The heat transfer module of the FANTASTIC code (FAHT) is studied and evaluated to the extent possible during the ten-week duration of this project. A brief background of the previous studies is given, and the governing equations as modeled in FAHT are discussed. FAHT's capabilities and limitations based on these equations and its coding methodology are explained in detail. It is established that, with an improper choice of element size and time step, FAHT's temperature-field prediction at some nodes will fall below the initial condition. The source of this unrealistic temperature prediction is identified, and a procedure is proposed for avoiding the phenomenon. It is further shown that the proposed procedure converges to an accurate prediction upon mesh refinement. Unfortunately, due to lack of time, FAHT's ability to accurately account for pyrolysis and surface ablation has not been verified. Therefore, at the present time it can be stated with confidence that FAHT can accurately predict the temperature field for a transient, multi-dimensional, orthotropic material with directional dependence, variable properties, and nonlinear boundary conditions. Such a prediction provides an upper limit for the temperature field in an ablating, decomposing nozzle liner. The pore pressure field, however, will not be known.
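
    The undershoot mechanism is the classic loss of the discrete maximum principle in explicit conduction schemes: for a 1-D FTCS step the new nodal value is a convex combination of its neighbors only when the Fourier number r = alpha*dt/dx^2 is at most 1/2. A short generic demonstration (plain finite differences, not FAHT itself):

        import numpy as np

        def step(T, r):                  # explicit FTCS conduction step
            Tn = T.copy()
            Tn[1:-1] = T[1:-1] + r * (T[2:] - 2 * T[1:-1] + T[:-2])
            return Tn

        for r in (0.4, 0.7):             # proper vs. improper time step
            T = np.full(11, 300.0)       # uniform 300 K initial condition
            T[0] = T[-1] = 400.0         # suddenly heated boundaries
            for _ in range(20):
                T = step(T, r)
            print(r, round(T.min(), 1))  # r=0.4 stays >= 300 K; r=0.7 undershoots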

  3. Towards a Local Integration of Theories: Codes and Praxeologies in the Case of Computer-Based Instruction

    ERIC Educational Resources Information Center

    Gellert, Uwe; Barbe, Joaquim; Espinoza, Lorena

    2013-01-01

    We report on the development of a "language of description" that facilitates an integrated analysis of classroom video data in terms of the quality of the teaching-learning process and the students' access to valued forms of mathematical knowledge. Our research setting is the introduction of software for teachers for improving the mathematical…

  5. Linking deregulation of non-coding RNA to the core pathophysiology of Alzheimer's disease: an integrative review.

    PubMed

    Millan, Mark J

    2017-03-17

    The human genome encodes a vast repertoire of non-protein-coding RNAs (ncRNAs), some specific to the brain. MicroRNAs, which interfere with the translation of target mRNAs, are of particular interest since their deregulation has been implicated in neurodegenerative disorders like Alzheimer's disease (AD). However, it remains challenging to link the complex body of observations on miRNAs and AD into a coherent framework. Using extensive graphical support, this article discusses how a diverse panoply of miRNAs convergently and divergently impact (and are impacted by) core pathophysiological processes underlying AD: neuroinflammation and oxidative stress; aberrant generation of β-amyloid-42 (Aβ42); anomalies in the production, cleavage and post-translational marking of Tau; impaired clearance of Aβ42 and Tau; perturbation of axonal organisation; disruption of synaptic plasticity; endoplasmic reticulum stress and the unfolded protein response; mitochondrial dysfunction; aberrant induction of cell cycle re-entry; and apoptotic loss of neurons. Intriguingly, some classes of miRNA provoke these cellular anomalies, whereas others act in a counter-regulatory, protective mode. Moreover, changes in the levels of certain species of miRNA are a consequence of the above-mentioned anomalies. In addition to miRNAs, circular RNAs, piwiRNAs, long non-coding RNAs and other types of ncRNA are being increasingly implicated in AD. Overall, a complex mesh of deregulated and multi-tasking ncRNAs reciprocally interacts with the pathophysiological mechanisms underlying AD. Alterations in ncRNAs can be detected in CSF and the circulation as well as the brain, and are showing promise as biomarkers, with the ultimate goal of clinical exploitation as targets for novel modes of symptomatic and course-altering therapy.

  6. Integrated High-Fidelity CFD/FE FSI Code Development and Benchmark Full-Scale Validation EFD for Slamming Analysis

    DTIC Science & Technology

    2016-06-30

    difference of the vertical velocity from the INS, which also leads to a vertical acceleration in this range. An estimate of the accuracy of the... first filtered at 100 Hz. The fourth graph is the acceleration as estimated with a finite difference of the vertical velocity from the INS, and the... predict behavior at a different location. In Fig. 6 is shown the vertical velocity at bulkhead #5 obtained in two different ways: by time integrating the
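
    The cross-check these fragments describe (vertical velocity from time-integrated acceleration versus finite-differenced INS output) is easy to reproduce with synthetic signals; this is a generic sketch, not the report's processing chain:

        import numpy as np

        t = np.linspace(0.0, 2.0, 401)
        accel = 9.81 * np.sin(2 * np.pi * 1.5 * t)        # synthetic slamming accel.
        dv = 0.5 * (accel[1:] + accel[:-1]) * np.diff(t)  # trapezoid increments
        vel = np.concatenate(([0.0], np.cumsum(dv)))      # time-integrated velocity
        accel_back = np.gradient(vel, t)                  # finite difference back
        print(np.max(np.abs(accel_back - accel)))         # small consistency error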

  7. Integrative analysis reveals clinical phenotypes and oncogenic potentials of long non-coding RNAs across 15 cancer types

    PubMed Central

    Piccolo, Stephen R.; Zhang, Xiao-Qin; Li, Jun-Hao; Zhou, Hui; Yang, Jian-Hua; Qu, Liang-Hu

    2016-01-01

    Long non-coding RNAs (lncRNAs) have been shown to contribute to tumorigenesis. However, surprisingly little is known about the comprehensive clinical and genomic characterization of lncRNAs across human cancer. In this study, we conducted comprehensive analyses of the expression profiles, clinical outcomes, and somatic copy number alteration (SCNA) profiles of lncRNAs in ~7000 clinical samples from 15 different cancer types. We identified significantly differentially expressed lncRNAs between tumor and normal tissues in each cancer. Notably, we characterized 47 lncRNAs that were extensively dysregulated in at least 10 cancer types, suggesting a conserved function in cancer development. We also analyzed the associations between lncRNA expression and patient survival, and identified sets of lncRNAs that possessed significant prognostic value in specific cancer types. Our combined analysis of SCNA data and expression data uncovered 116 dysregulated lncRNAs that are strikingly genomically altered across the 15 cancer types, indicating their oncogenic potential. Our study may lay the groundwork for future functional studies of lncRNAs and help facilitate the discovery of novel clinical biomarkers. PMID:27147563

  8. Development and application of a ray-tracing code integrating with 3D equilibrium mapping in LHD ECH experiments

    NASA Astrophysics Data System (ADS)

    Tsujimura, T., Ii; Kubo, S.; Takahashi, H.; Makino, R.; Seki, R.; Yoshimura, Y.; Igami, H.; Shimozuma, T.; Ida, K.; Suzuki, C.; Emoto, M.; Yokoyama, M.; Kobayashi, T.; Moon, C.; Nagaoka, K.; Osakabe, M.; Kobayashi, S.; Ito, S.; Mizuno, Y.; Okada, K.; Ejiri, A.; Mutoh, T.

    2015-11-01

    The central electron temperature has successfully reached up to 7.5 keV in large helical device (LHD) plasmas with a high central ion temperature of 5 keV and a central electron density of 1.3 × 10^19 m^-3. This result was obtained by heating with a newly installed 154 GHz gyrotron and by optimising the injection geometry in electron cyclotron heating (ECH). The optimisation was carried out using the ray-tracing code 'LHDGauss', which was upgraded to include rapid post-processing of the three-dimensional (3D) equilibrium mapping obtained from experiments. For ray-tracing calculations, LHDGauss can automatically read the relevant data registered in the LHD database after a discharge, such as the ECH injection settings (e.g. Gaussian beam parameters, target positions, polarisation and ECH power) and Thomson scattering diagnostic data, along with the 3D equilibrium mapping data. The equilibrium maps of the electron density and temperature profiles are then extrapolated into the region outside the last closed flux surface. The mode purity, or the ratio between the ordinary mode and the extraordinary mode, is obtained by solving the 1D full-wave equation along the direction of the rays from the antenna to the absorption target point. Using the virtual magnetic flux surfaces, the effects of the modelled density profiles and the magnetic shear in the peripheral region for a given polarisation are taken into account. Power deposition profiles calculated for each Thomson scattering measurement timing are registered in the LHD database. Adjusting the injection settings to obtain the desired deposition profile, with feedback provided on a shot-by-shot basis, resulted in an effective experimental procedure.
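
    Along a single ray, the deposited power follows P(s) = P0 * exp(-integral of alpha ds); a toy absorption profile shows how a deposition profile is obtained (made-up numbers, not LHDGauss output):

        import numpy as np

        s = np.linspace(0.0, 1.0, 200)                   # path length, m
        alpha = 40.0 * np.exp(-((s - 0.6) / 0.05) ** 2)  # toy absorption, 1/m
        P = np.exp(-np.cumsum(alpha) * (s[1] - s[0]))    # remaining beam power
        deposition = -np.gradient(P, s)                  # local deposited power
        print("peak deposition near s =", round(s[np.argmax(deposition)], 2))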

  9. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    SciTech Connect

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert; McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  10. A statistical framework to predict functional non-coding regions in the human genome through integrated analysis of annotation data.

    PubMed

    Lu, Qiongshi; Hu, Yiming; Sun, Jiehuan; Cheng, Yuwei; Cheung, Kei-Hoi; Zhao, Hongyu

    2015-05-27

    Identifying functional regions in the human genome is a major goal in human genetics. Great efforts have been made to functionally annotate the human genome, either through computational predictions, such as genomic conservation, or through high-throughput experiments, such as the ENCODE project. These efforts have resulted in a rich collection of functional annotation data of diverse types that need to be jointly analyzed for integrated interpretation and annotation. Here we present GenoCanyon, a whole-genome annotation method that performs unsupervised statistical learning using 22 computational and experimental annotations, thereby inferring the functional potential of each position in the human genome. With GenoCanyon, we are able to predict many of the known functional regions. This ability to predict functional regions, together with its generalizable statistical framework, makes GenoCanyon a unique and powerful tool for whole-genome annotation. The GenoCanyon web server is available at http://genocanyon.med.yale.edu.
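
    As an illustration of unsupervised scoring over many annotations (a stand-in, not GenoCanyon's actual statistical model), a two-component Gaussian mixture fit to synthetic 22-dimensional annotation vectors yields a per-position posterior of belonging to the "functional" component:

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        background = rng.normal(0.0, 1.0, (900, 22))   # 22 annotations per position
        functional = rng.normal(1.5, 1.0, (100, 22))   # synthetic functional class
        X = np.vstack([background, functional])

        gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
        posterior = gmm.predict_proba(X)               # per-position probabilities
        print(posterior[:3].round(2))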

  11. Rewriting the Genetic Code.

    PubMed

    Mukai, Takahito; Lajoie, Marc J; Englert, Markus; Söll, Dieter

    2017-09-08

    The genetic code, the language used by cells to translate their genomes into proteins that perform many cellular functions, is highly conserved throughout natural life. Rewriting the genetic code could lead to new biological functions such as expanding protein chemistries with noncanonical amino acids (ncAAs) and genetically isolating synthetic organisms from natural organisms and viruses. It has long been possible to transiently produce proteins bearing ncAAs, but stabilizing an expanded genetic code for sustained function in vivo requires an integrated approach: creating recoded genomes and introducing new translation machinery that function together without compromising viability or clashing with endogenous pathways. In this review, we discuss design considerations and technologies for expanding the genetic code. The knowledge obtained by rewriting the genetic code will deepen our understanding of how genomes are designed and how the canonical genetic code evolved.
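
    One concrete recoding step of this kind, reassigning the amber stop codon by rewriting in-frame TAG codons to the synonymous TAA so that TAG can be repurposed for an ncAA, is easy to sketch (toy reading frame):

        # Rewrite every in-frame amber stop (TAG) to the synonymous ochre (TAA).
        def recode_amber(orf: str) -> str:
            codons = [orf[i:i + 3] for i in range(0, len(orf), 3)]
            return "".join("TAA" if c == "TAG" else c for c in codons)

        print(recode_amber("ATGGCTTAG"))   # ATGGCTTAA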

  12. Noiseless Coding Of Magnetometer Signals

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Lee, Jun-Ji

    1989-01-01

    Report discusses application of noiseless data-compression coding to digitized readings of spaceborne magnetometers for transmission back to Earth. The objective of such coding is to increase efficiency by decreasing the rate of transmission without sacrificing the integrity of the data. Adaptive coding compresses data by factors ranging from 2 to 6.
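
    The flavor of noiseless coding of slowly varying magnetometer samples: predict each sample from the previous one and Rice/Golomb-code the mapped residual. A minimal sketch with a fixed parameter k (adaptive coders of this family vary k with the data; this sketch does not):

        # Predictive Rice coding sketch (fixed k; adaptive coders vary k).
        def rice_encode(samples, k=2):
            bits, prev = [], 0
            for x in samples:
                r = x - prev                   # prediction residual
                prev = x
                u = (r << 1) ^ (r >> 63)       # zigzag map: signed -> unsigned
                q, rem = u >> k, u & ((1 << k) - 1)
                bits.append("1" * q + "0" + format(rem, f"0{k}b"))
            return "".join(bits)

        print(rice_encode([100, 101, 99, 99, 102]))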

  13. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction, and 6) machine-specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them. The demonstration will exhibit our latest research in building this environment: 1. parallelizing tools and compiler evaluation; 2. code cleanup and serial optimization using automated scripts; 3. development of a code generator for performance prediction; 4. automated partitioning; 5. automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved in porting and tuning a legacy code application for a new architecture.

  15. Space and Terrestrial Power System Integration Optimization Code BRMAPS for Gas Turbine Space Power Plants With Nuclear Reactor Heat Sources

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    2007-01-01

    In view of the difficult times the US and global economies are experiencing today, funds for the development of advanced fission-reactor nuclear power systems for space propulsion and planetary surface applications are currently not available. However, according to the Energy Policy Act of 2005, the U.S. needs to invest in developing fission reactor technology for ground-based terrestrial power plants. Such plants would make a significant contribution toward drastic reduction of worldwide greenhouse gas emissions and the associated global warming. To accomplish this goal, the Next Generation Nuclear Plant Project (NGNP) has been established by DOE under the Generation IV Nuclear Systems Initiative. Idaho National Laboratory (INL) was designated as the lead in the development of VHTR (Very High Temperature Reactor) and HTGR (High Temperature Gas Reactor) technology to be integrated with MMW (multi-megawatt) helium gas turbine driven electric power AC generators. However, the advantages of transmitting power in high-voltage DC form over large distances are also explored in the seminar lecture series. As an attractive alternate heat source, the Liquid Fluoride Reactor (LFR), pioneered at ORNL (Oak Ridge National Laboratory) in the mid-1960s, would offer much higher energy yields than current nuclear plants by using an inherently safe energy conversion scheme based on the thorium --> U233 fuel cycle and a fission process with a negative temperature coefficient of reactivity. The power plants are to be sized to meet electric power demand during peak periods and also to provide thermal energy for hydrogen (H2) production during off-peak periods. This approach will both supply electric power using environmentally clean nuclear heat, which does not generate greenhouse gases, and provide a clean fuel, H2, for the future, when, due to increased global demand and the decline in discovery of new deposits, our supply of liquid fossil fuels will have been used up. This is

  16. High Efficiency Integrated Space Conditioning, Water Heating and Air Distribution System for HUD-Code Manufactured Housing

    SciTech Connect

    Henry DeLima; Joe Akin; Joseph Pietsch

    2008-09-14

    Recognizing the need for new space conditioning and water heating systems for manufactured housing, DeLima Associates assembled a team to develop a space conditioning system that would enhance comfort conditions while also reducing energy usage at the systems level. The product, Comboflair®, was defined as a result of a needs analysis of project sponsors and industry stakeholders. An integrated system would be developed that would combine a packaged air-conditioning system with a small-duct, high-velocity air distribution system. In its basic configuration, the source for space heating would be a gas water heater. The complete system would be installed at the manufactured-home factory and would require no site installation work at the homesite, as is now required with conventional split-system air conditioners. Several prototypes were fabricated and tested before a field test unit was completed in October 2005. The Comboflair® system, complete with ductwork, was installed in a 1,984-square-foot double-wide manufactured home built by Palm Harbor Homes in Austin, TX. After the home was transported and installed at a Palm Harbor dealer lot in Austin, TX, a data acquisition system was installed for remote data collection. Over 60 parameters were continuously monitored, and measurements were transmitted to a remote site every 15 minutes for performance analysis. The Comboflair® system was field tested from February 2006 until April 2007. The cooling system performed in accordance with the design specifications. The heating system initially could not provide the needed capacity at peak heating conditions until the water heater was replaced with a higher-capacity standard water heater. All system comfort goals were then met. As a result of the field testing, we have identified improvements to be made to specific components for incorporation into production models. The Comboflair® system will be manufactured by Unico, Inc. at their new production facility in St. Louis

  17. Ethical coding.

    PubMed

    Resnik, Barry I

    2009-01-01

    It is ethical, legal, and proper for a dermatologist to maximize income through proper coding of patient encounters and procedures. The overzealous physician can misinterpret reimbursement requirements or receive bad advice from other physicians and cross the line from aggressive coding to coding fraud. Several of the more common problem areas are discussed.

  18. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Pollara, Fabrizio; Hamkins, Jon; Dolinar, Sam; Andrews, Ken; Divsalar, Dariush

    2006-01-01

    This viewgraph presentation reviews uplink coding. The purpose and goals of the briefing are to (1) show a plan for using uplink coding and describe its benefits; (2) define possible solutions and their applicability to different types of uplink, including emergency uplink; (3) concur on the conclusions so we can embark on a plan to use the proposed uplink system; (4) identify the need for the development of appropriate technology and its infusion into the DSN; and (5) gain advocacy to implement uplink coding in flight projects. Action Item EMB04-1-14 -- Show a plan for using uplink coding, including showing where it is useful or not (include discussion of emergency uplink coding).

  19. Sharing code.

    PubMed

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.

  20. Integrated Analysis of the Roles of Long Noncoding RNA and Coding RNA Expression in Sheep (Ovis aries) Skin during Initiation of Secondary Hair Follicle.

    PubMed

    Yue, Yaojing; Guo, Tingting; Yuan, Chao; Liu, Jianbin; Guo, Jian; Feng, Ruilin; Niu, Chune; Sun, Xiaoping; Yang, Bohui

    2016-01-01

    Initiation of the hair follicle (HF) is the first and most important stage of HF morphogenesis. However, the precise molecular mechanism of HF initiation remains elusive. Previous studies paid more attention to the function of genes, while the roles of non-coding RNAs (such as long noncoding RNAs and microRNAs) were not described. Therefore, the roles of long noncoding RNA (lncRNA) and coding RNA in sheep skin during the initiation of the sheep secondary HF were integrated and analyzed using strand-specific RNA sequencing (ssRNA-seq). A total of 192 significantly differentially expressed genes were detected, including 67 up-regulated genes and 125 down-regulated genes, between stage 0 and stage 1 of HF morphogenesis during HF initiation. Among the Wnt, Shh, Notch and BMP signaling pathways, only Wnt2 and FGF20 were significantly differentially expressed. Further expression profiling of lncRNAs showed that 884 novel lncRNAs were discovered in the sheep skin expression profiles. A total of 15 lncRNAs with significant differential expression were detected, 6 up-regulated and 9 down-regulated. Among the differentially expressed genes and lncRNAs, the XLOC002437 lncRNA and its potential target gene COL6A6 were both significantly down-regulated in stage 1. Furthermore, according to RNAhybrid, XLOC005698 may act as a competing endogenous RNA that "sponges" oar-miR-3955-5p activity. Gene Ontology and KEGG pathway analyses indicated that the significantly enriched pathway was the peroxisome proliferator-activated receptor (PPAR) pathway (corrected P-value < 0.05), indicating that the PPAR pathway is likely to play significant roles during the initiation of the secondary HF. The results suggest that the key differentially expressed genes and lncRNAs may be considered potential candidate genes for further study of the molecular mechanisms of HF initiation, as well as supplying some potential value for understanding human hair disorders.

  1. Integrated Analysis of the Roles of Long Noncoding RNA and Coding RNA Expression in Sheep (Ovis aries) Skin during Initiation of Secondary Hair Follicle

    PubMed Central

    Liu, Jianbin; Guo, Jian; Feng, Ruilin; Niu, Chune; Sun, Xiaoping; Yang, Bohui

    2016-01-01

    Initiation of the hair follicle (HF) is the first and most important stage of HF morphogenesis. However, the precise molecular mechanism of HF initiation remains elusive. Previous studies paid more attention to the function of genes, while the roles of non-coding RNAs (such as long noncoding RNAs and microRNAs) were not described. Therefore, the roles of long noncoding RNA (lncRNA) and coding RNA in sheep skin during the initiation of the sheep secondary HF were integrated and analyzed using strand-specific RNA sequencing (ssRNA-seq). A total of 192 significantly differentially expressed genes were detected, including 67 up-regulated genes and 125 down-regulated genes, between stage 0 and stage 1 of HF morphogenesis during HF initiation. Among the Wnt, Shh, Notch and BMP signaling pathways, only Wnt2 and FGF20 were significantly differentially expressed. Further expression profiling of lncRNAs showed that 884 novel lncRNAs were discovered in the sheep skin expression profiles. A total of 15 lncRNAs with significant differential expression were detected, 6 up-regulated and 9 down-regulated. Among the differentially expressed genes and lncRNAs, the XLOC002437 lncRNA and its potential target gene COL6A6 were both significantly down-regulated in stage 1. Furthermore, according to RNAhybrid, XLOC005698 may act as a competing endogenous RNA that "sponges" oar-miR-3955-5p activity. Gene Ontology and KEGG pathway analyses indicated that the significantly enriched pathway was the peroxisome proliferator-activated receptor (PPAR) pathway (corrected P-value < 0.05), indicating that the PPAR pathway is likely to play significant roles during the initiation of the secondary HF. The results suggest that the key differentially expressed genes and lncRNAs may be considered potential candidate genes for further study of the molecular mechanisms of HF initiation, as well as supplying some potential value for understanding human hair disorders. PMID:27276011

  2. [Integrity].

    PubMed

    Gómez Rodríguez, Rafael Ángel

    2014-01-01

    To say that someone possesses integrity is to claim that that person is almost predictable in his or her responses to specific situations, and that he or she can judge prudently and act correctly. There is a close interrelationship between integrity and autonomy, and autonomy rests on the deeper moral claim of all humans to integrity of the person. Integrity has two senses of significance for medical ethics: one sense refers to the integrity of the person in its bodily, psychosocial and intellectual elements; in the second sense, integrity is a virtue. Another facet of integrity of the person is the integrity of the values we cherish and espouse. The physician must be a person of integrity if the integrity of the patient is to be safeguarded. Autonomy has reduced violations in the past, but the character and virtues of the physician are the ultimate safeguard of the autonomy of the patient. A very important field in medicine is scientific research. It is the character of the investigator that determines the moral quality of research. The problem arises when legitimate self-interests are replaced by selfish ones, particularly when human subjects are involved. The final safeguard of the moral quality of research is the character and conscience of the investigator. Teaching must be relevant in the scientific field, but the most effective way to teach virtue ethics is through the example of a respected scientist.

  3. DNA codes

    SciTech Connect

    Torney, D. C.

    2001-01-01

    We have begun to characterize a variety of codes, motivated by potential implementation as (quaternary) DNA n-sequences, with letters denoted A, C, G and T. The first codes we studied are the most reminiscent of conventional group codes. For these codes, Hamming similarity was generalized so that the score for matched letters takes more than one value, depending upon which letters are matched [2]. These codes consist of n-sequences satisfying an upper bound on the similarities, summed over the letter positions, of distinct codewords. We chose similarity 2 for matches of the letters A and T and 3 for matches of the letters C and G, providing a rough approximation to double-strand bond energies in DNA. An inherent novelty of DNA codes is 'reverse complementation'. The latter may be defined, as follows, not only for alphabets of size four but, more generally, for any even-size alphabet. All that is required is a matching of the letters of the alphabet: a partition into pairs. Then, the reverse complement of a codeword is obtained by reversing the order of its letters and replacing each letter by its match. For DNA, the matching is A/T and C/G because these are the Watson-Crick bonding pairs. Reversal arises because two DNA sequences form a double strand with opposite relative orientations. Thus, as will be described in detail, because in vitro decoding involves the formation of double-stranded DNA from two codewords, it is reasonable to assume - for universal applicability - that the reverse complement of any codeword is also a codeword. In particular, self-reverse-complementary codewords are expressly forbidden in reverse-complement codes. Thus, an appropriate distance between all pairs of codewords must, when large, effectively prohibit binding between the respective codewords to form a double strand. Only reverse-complement pairs of codewords should be able to bind. For most applications, a DNA code is to be bi-partitioned, such that the reverse-complementary pairs are separated
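
    The two ingredients just described, the weighted similarity (2 for A/T matches, 3 for C/G matches) and reverse complementation, translate directly into code:

        PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}   # Watson-Crick matching
        WEIGHT = {"A": 2, "T": 2, "C": 3, "G": 3}         # per-letter match scores

        def reverse_complement(word: str) -> str:
            return "".join(PAIR[b] for b in reversed(word))

        def similarity(u: str, v: str) -> int:            # position-wise, weighted
            return sum(WEIGHT[a] for a, b in zip(u, v) if a == b)

        print(reverse_complement("ACGGT"))                # ACCGT
        print(similarity("ACGGT", "ACGAT"))               # 2+3+3+0+2 = 10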

  4. Code of Ethics.

    ERIC Educational Resources Information Center

    American Sociological Association, Washington, DC.

    The American Sociological Association's code of ethics for sociologists is presented. For sociological research and practice, 10 requirements for ethical behavior are identified, including: maintaining objectivity and integrity; fully reporting findings and research methods, without omission of significant data; reporting fully all sources of…

  5. Embedded foveation image coding.

    PubMed

    Wang, Z; Bovik, A C

    2001-01-01

    The human visual system (HVS) is highly space-variant in sampling, coding, processing, and understanding. The spatial resolution of the HVS is highest around the point of fixation (foveation point) and decreases rapidly with increasing eccentricity. By taking advantage of this fact, it is possible to remove considerable high-frequency information redundancy from the peripheral regions and still reconstruct a perceptually good quality image. Great success has been obtained previously by a class of embedded wavelet image coding algorithms, such as the embedded zerotree wavelet (EZW) and the set partitioning in hierarchical trees (SPIHT) algorithms. Embedded wavelet coding not only provides very good compression performance, but also has the property that the bitstream can be truncated at any point and still be decoded to recreate a reasonably good quality image. In this paper, we propose an embedded foveation image coding (EFIC) algorithm, which orders the encoded bitstream to optimize foveated visual quality at arbitrary bit-rates. A foveation-based image quality metric, namely, foveated wavelet image quality index (FWQI), plays an important role in the EFIC system. We also developed a modified SPIHT algorithm to improve the coding efficiency. Experiments show that EFIC integrates foveation filtering with foveated image coding and demonstrates very good coding performance and scalability in terms of foveated image quality measurement.
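
    The foveation idea rests on the cutoff frequency of the HVS falling with eccentricity. The sketch below evaluates a commonly used contrast-sensitivity model of that falloff; the constants are typical literature values and are an assumption here, not necessarily those used in EFIC.

    import math

    ALPHA = 0.106    # spatial frequency decay constant (assumed typical value)
    E2 = 2.3         # half-resolution eccentricity, degrees (assumed typical value)
    CT0 = 1.0 / 64   # minimal contrast threshold (assumed typical value)

    def cutoff_frequency(ecc_deg: float) -> float:
        """Highest perceptible spatial frequency (cycles/degree) at a given eccentricity."""
        return E2 * math.log(1.0 / CT0) / (ALPHA * (ecc_deg + E2))

    for ecc in (0.0, 2.0, 10.0):
        print(f"eccentricity {ecc:4.1f} deg -> cutoff {cutoff_frequency(ecc):5.1f} cyc/deg")

    With these constants the cutoff drops from about 39 cycles/degree at fixation to roughly 7 at 10 degrees, which is what lets a foveated coder discard peripheral high-frequency information.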

  6. Sharing code

    PubMed Central

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing. PMID:25165519

  7. Integration

    ERIC Educational Resources Information Center

    Kalyn, Brenda

    2006-01-01

    Integrated learning is an exciting adventure for both teachers and students. It is not uncommon to observe the integration of academic subjects such as math, science, and language arts. However, educators need to recognize that movement experiences in physical education also can be linked to academic curricula and, may even lead the…

  8. Coding for Single-Line Transmission

    NASA Technical Reports Server (NTRS)

    Madison, L. G.

    1983-01-01

    Digital transmission code combines data and clock signals into single waveform. MADCODE needs four standard integrated circuits in generator and converter plus five small discrete components. MADCODE allows simple coding and decoding for transmission of digital signals over single line.
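
    The MADCODE waveform itself is not specified in this summary, so the sketch below uses Manchester encoding, a standard (and different) scheme that likewise combines data and clock in a single waveform, to make the idea concrete.

    def manchester_encode(bits):
        """IEEE 802.3 convention: 1 -> (low, high), 0 -> (high, low); every bit yields a transition."""
        return [level for b in bits for level in ((0, 1) if b else (1, 0))]

    def manchester_decode(levels):
        """A rising half-bit pair (low, high) decodes to 1, a falling pair to 0."""
        return [1 if levels[i] < levels[i + 1] else 0 for i in range(0, len(levels), 2)]

    data = [1, 0, 1, 1, 0]
    line = manchester_encode(data)
    assert manchester_decode(line) == data
    print(line)   # [0, 1, 1, 0, 0, 1, 0, 1, 1, 0]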

  9. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater based purely on a binary decision. Hence the end-to-end performance of a digital link is essentially independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service-provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters by analyzing the speech signal. In either case, the codes are transmitted to the distant end where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques and that is often used interchangeably with speech coding is the term voice coding. This term is more generic in the sense that the

  10. ITS version 5.0: the integrated TIGER series of coupled electron/photon Monte Carlo transport codes with CAD geometry.

    SciTech Connect

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2005-09-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  11. Nature's Code

    NASA Astrophysics Data System (ADS)

    Hill, Vanessa J.; Rowlands, Peter

    2008-10-01

    We propose that the mathematical structures related to the `universal rewrite system' define a universal process applicable to Nature, which we may describe as `Nature's code'. We draw attention here to such concepts as 4 basic units, 64- and 20-unit structures, symmetry-breaking and 5-fold symmetry, chirality, double 3-dimensionality, the double helix, the Van der Waals force and the harmonic oscillator mechanism, and our explanation of how they necessarily lead to self-aggregation, complexity and emergence in higher-order systems. Biological concepts, such as translation, transcription, replication, the genetic code and the grouping of amino acids appear to be driven by fundamental processes of this kind, and it would seem that the Platonic solids, pentagonal symmetry and Fibonacci numbers have significant roles in organizing `Nature's code'.

  12. Show Code.

    PubMed

    Shalev, Daniel

    2017-01-01

    "Let's get one thing straight: there is no such thing as a show code," my attending asserted, pausing for effect. "You either try to resuscitate, or you don't. None of this halfway junk." He spoke so loudly that the two off-service consultants huddled at computers at the end of the unit looked up… We did four rounds of compressions and pushed epinephrine twice. It was not a long code. We did good, strong compressions and coded this man in earnest until the end. Toward the final round, though, as I stepped up to do compressions, my attending looked at me in a deep way. It was a look in between willing me as some object under his command and revealing to me everything that lay within his brash, confident surface but could not be spoken. © 2017 The Hastings Center.

  13. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    ERIC Educational Resources Information Center

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  15. Astrophysics Source Code Library Enhancements

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.

    2015-09-01

    The Astrophysics Source Code Library (ASCL)1 is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.

  16. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  17. OpenSWPC: an open-source integrated parallel simulation code for modeling seismic wave propagation in 3D heterogeneous viscoelastic media

    NASA Astrophysics Data System (ADS)

    Maeda, Takuto; Takemura, Shunsuke; Furumura, Takashi

    2017-07-01

    We have developed an open-source software package, Open-source Seismic Wave Propagation Code (OpenSWPC), for parallel numerical simulations of seismic wave propagation in 3D and 2D (P-SV and SH) viscoelastic media based on the finite difference method at local-to-regional scales. This code is equipped with a frequency-independent attenuation model based on the generalized Zener body and an efficient perfectly matched layer for the absorbing boundary condition. A hybrid-style programming using OpenMP and the Message Passing Interface (MPI) is adopted for efficient parallel computation. OpenSWPC has wide applicability for seismological studies and great portability, allowing excellent performance on systems ranging from PC clusters to supercomputers. Without modifying the code, users can conduct seismic wave propagation simulations using their own velocity structure models and the necessary source representations by specifying them in an input parameter file. The code has various modes for different types of velocity structure model input and different source representations, such as single force, moment tensor and plane-wave incidence, which can easily be selected via the input parameters. Widely used binary data formats, the Network Common Data Form (NetCDF) and the Seismic Analysis Code (SAC), are adopted for the input of the heterogeneous structure model and the outputs of the simulation results, so users can easily handle the input/output datasets. All codes are written in Fortran 2003 and are available with detailed documents in a public repository.

  18. Comparison of measured responses in two spectrally-sensitive x-ray detectors to predictions obtained using the ITS (Integrated Tiger Series) radiation transport code

    SciTech Connect

    Carlson, G.A.; Beutler, D.E.; Seager, K.D.; Knott, D.P.

    1988-01-01

    Responses of a Ge detector and a filtered TLD array detector have been measured at a steady-state bremsstrahlung source (the Pelletron), at endpoint energies from 150 to 900 keV. Predictions of detector response using Monte Carlo ITS codes are found to be in excellent agreement with measured response for both detectors. These results extend the range of validity of the ITS codes. With calibration provided by these experiments and by ITS predictions, dose-depth data from the TLD arrays can be used to estimate flash x-ray source endpoint energies.

  19. Induction technology optimization code

    SciTech Connect

    Caporaso, G.J.; Brooks, A.L.; Kirbie, H.C.

    1992-08-21

    A code has been developed to evaluate relative costs of induction accelerator driver systems for relativistic klystrons. The code incorporates beam generation, transport and pulsed power system constraints to provide an integrated design tool. The code generates an injector/accelerator combination which satisfies the top-level requirements and all system constraints once a small number of design choices have been specified (rise time of the injector voltage and aspect ratio of the ferrite induction cores, for example). The code calculates dimensions of accelerator mechanical assemblies and values of all electrical components. Cost factors for machined parts, raw materials and components are applied to yield a total system cost. These costs are then plotted as a function of the two design choices to enable selection of an optimum design based on various criteria. The Induction Technology Optimization Study (ITOS) was undertaken to examine viable combinations of a linear induction accelerator and a relativistic klystron (RK) for high-power microwave production. It is proposed that microwaves from the RK will power a high-gradient accelerator structure for linear collider development. Previous work indicates that the RK will require a nominal 3-MeV, 3-kA electron beam with a 100-ns flat top. The proposed accelerator-RK combination will be a high average power system capable of sustained microwave output at a 300-Hz pulse repetition frequency. The ITOS code models many combinations of injector, accelerator, and pulsed power designs that will supply an RK with the beam parameters described above.
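
    The ITOS costing details are not given here; the following sketch only illustrates the workflow the summary describes: fix the two design choices, cost the resulting design at each grid point, and pick the optimum. The cost function below is entirely hypothetical.

    import itertools

    def system_cost(rise_time_ns: float, core_aspect_ratio: float) -> float:
        """Hypothetical stand-in for the machined-parts/materials/components costing."""
        ferrite_cost = 50.0 * core_aspect_ratio + 200.0 / core_aspect_ratio
        pulser_cost = 1000.0 / rise_time_ns + 2.0 * rise_time_ns
        return ferrite_cost + pulser_cost

    grid = itertools.product([20, 40, 60, 80], [0.5, 1.0, 1.5, 2.0])
    best = min(grid, key=lambda point: system_cost(*point))
    print("optimum (rise time ns, aspect ratio):", best, "cost:", round(system_cost(*best), 1))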

  20. Code Green.

    ERIC Educational Resources Information Center

    McMinn, John

    2002-01-01

    Assesses the integrated approach to green design in the new Computer Science Building at Toronto's York University. The building design fulfills the university's demand to combine an energy efficient design with sustainability. Floor and site plans are included. (GR)

  1. Generic reactive transport codes as flexible tools to integrate soil organic matter degradation models with water, transport and geochemistry in soils

    NASA Astrophysics Data System (ADS)

    Jacques, Diederik; Gérard, Fréderic; Mayer, Uli; Simunek, Jirka; Leterme, Bertrand

    2016-04-01

    A large number of organic matter degradation, CO2 transport and dissolved organic matter models have been developed during the last decades. However, organic matter degradation models are in many cases strictly hard-coded in terms of organic pools, degradation kinetics and dependency on environmental variables. The scientific input of the model user is typically limited to the adjustment of input parameters. In addition, the coupling with geochemical soil processes including aqueous speciation, pH-dependent sorption and colloid-facilitated transport are not incorporated in many of these models, strongly limiting the scope of their application. Furthermore, the most comprehensive organic matter degradation models are combined with simplified representations of flow and transport processes in the soil system. We illustrate the capability of generic reactive transport codes to overcome these shortcomings. The formulations of reactive transport codes include a physics-based continuum representation of flow and transport processes, while biogeochemical reactions can be described as equilibrium processes constrained by thermodynamic principles and/or kinetic reaction networks. The flexibility of these types of codes allows for straightforward extension of reaction networks, permits the inclusion of new model components (e.g., organic matter pools, rate equations, parameter dependency on environmental conditions) and in this way facilitates an application-tailored implementation of organic matter degradation models and related processes. A numerical benchmark involving two reactive transport codes (HPx and MIN3P) demonstrates how the process-based simulation of transient variably saturated water flow (Richards equation), solute transport (advection-dispersion equation), heat transfer and diffusion in the gas phase can be combined with a flexible implementation of a soil organic matter degradation model. The benchmark includes the production of leachable organic matter

  2. Generic reactive transport codes as flexible tools to integrate soil organic matter degradation models with water, transport and geochemistry in soils

    NASA Astrophysics Data System (ADS)

    Simunek, J.; Jacques, D.; Mayer, K. U.; Gerard, F.

    2016-12-01

    A large number of organic matter degradation, CO2 transport and dissolved organic matter models have been developed during the last decades. However, organic matter degradation models are in many cases hard-coded in terms of pools, kinetics and dependency on environmental variables. The input of the model user is typically limited to the adjustment of input parameters. In addition, the coupling with geochemical soil processes including aqueous speciation, sorption and colloid-facilitated transport are not incorporated in many of these models. Furthermore, these models are combined with simplified representations of flow and transport processes. We illustrate the capability of generic reactive transport codes to overcome these shortcomings. The formulations of reactive transport codes include a physics-based continuum representation of flow and transport processes, while biogeochemical reactions can be described as equilibrium processes and/or kinetic reaction networks. The flexibility of these types of codes allows for straightforward extension of reaction networks with new model components and in this way facilitates an application-tailored implementation of organic matter degradation models and related processes. A numerical benchmark involving two reactive transport codes (HPx and MIN3P) demonstrates how the process-based simulation of transient variably saturated water flow, solute transport, heat transfer and diffusion in the gas phase can be combined with a flexible implementation of a soil organic matter degradation model. The benchmark includes the production of leachable organic matter and inorganic carbon in the aqueous and gaseous phases, as well as different decomposition functions with first-order, linear dependence or nonlinear dependence on a biomass pool. In addition, we show how processes such as local bioturbation (biodiffusion) can be included implicitly through a Fickian formulation of transport of soil organic matter. Coupling soil organic
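
    As a minimal, code-agnostic sketch (not HPx or MIN3P), the following compares the two kinds of decomposition kinetics mentioned above: first-order decay of an organic matter pool versus a Monod-type rate with nonlinear dependence on a biomass pool. All parameter values are illustrative only.

    def simulate(first_order: bool, dt=0.1, t_end=50.0):
        C, B = 10.0, 1.0                          # organic matter pool, biomass pool
        k, mu_max, K, Y = 0.05, 0.2, 5.0, 0.3     # illustrative rate parameters
        t = 0.0
        while t < t_end:
            if first_order:
                rate = k * C                      # dC/dt = -k C
            else:
                rate = mu_max * B * C / (K + C)   # nonlinear (Monod) dependence on biomass
                B += dt * (Y * rate - 0.01 * B)   # biomass grows on C and decays slowly
            C -= dt * rate
            t += dt
        return C

    print("first-order residual pool:", round(simulate(True), 2))
    print("biomass-driven residual pool:", round(simulate(False), 2))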

  3. IMP: A performance code

    NASA Technical Reports Server (NTRS)

    Dauro, Vincent A., Sr.

    1991-01-01

    IMP (Integrated Mission Program) is a simulation language and code used to model present and future Earth, Moon, or Mars missions. The profile is user controlled through selection from a large menu of events and maneuvers. A Fehlberg 7/13 Runge-Kutta integrator with error and step size control is used to numerically integrate the differential equations of motion (DEQ) of three spacecraft, a main, a target, and an observer. Through selection, the DEQ's include guided thrust, oblate gravity, atmosphere drag, solar pressure, and Moon gravity effects. Guide parameters for thrust events and performance parameters of velocity changes (Delta-V) and propellant usage (maximum of five systems) are developed as needed. Print, plot, summary, and debug files are output.
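
    IMP's integrator is a high-order Fehlberg pair; the sketch below demonstrates the same error-and-step-size control idea with a much smaller embedded pair (Heun/Euler). This is not the scheme IMP uses, but it shows the accept/reject logic of an integrator with error and step size control.

    def integrate(f, y, t, t_end, h=0.1, tol=1e-6):
        while t < t_end:
            h = min(h, t_end - t)                      # land exactly on t_end
            k1 = f(t, y)
            k2 = f(t + h, y + h * k1)
            y_high = y + h * (k1 + k2) / 2             # Heun, 2nd order
            y_low = y + h * k1                         # Euler, 1st order
            err = abs(y_high - y_low)                  # embedded error estimate
            if err <= tol:
                t, y = t + h, y_high                   # accept the step
            # grow or shrink h toward the error target (0.9 = safety factor)
            h *= min(5.0, max(0.2, 0.9 * (tol / max(err, 1e-15)) ** 0.5))
        return y

    print(integrate(lambda t, y: -y, 1.0, 0.0, 5.0))   # ~exp(-5) = 0.0067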

  4. National Combustion Code: Parallel Performance

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa

    2001-01-01

    This report discusses the National Combustion Code (NCC). The NCC is an integrated system of codes for the design and analysis of combustion systems. The advanced features of the NCC meet designers' requirements for model accuracy and turn-around time. The fundamental features at the inception of the NCC were parallel processing and unstructured mesh. The design and performance of the NCC are discussed.

  5. Integration of a code for aeroelastic design of conventional and composite wings into ACSYNT, an aircraft synthesis program. [wing aeroelastic design (WADES)

    NASA Technical Reports Server (NTRS)

    Mullen, J., Jr.

    1976-01-01

    A comparison of program estimates of wing weight, material distribution, structural loads and elastic deformations with actual Northrop F-5A/B data is presented. Correlation coefficients obtained using data from a number of existing aircraft were computed for use in vehicle synthesis to estimate wing weights. The modifications necessary to adapt the WADES code for use in the ACSYNT program are described. Basic program flow and overlay structure is outlined. An example of the convergence of the procedure in estimating wing weights during the synthesis of a vehicle to satisfy F-5 mission requirements is given. A description of inputs required for use of the WADES program is included.

  6. A general multiblock Euler code for propulsion integration. Volume 2: User guide for BCON, pre-processor for grid generation and GMBE

    NASA Technical Reports Server (NTRS)

    Su, T. Y.; Appleby, R. A.; Chen, H. C.

    1991-01-01

    The BCON is a menu-driven graphics interface program whose input consists of strings or arrays of points generated from a computer aided design (CAD) tool or any other surface geometry source. The user needs to design the block topology and prepare the surface geometry definition and surface grids separately. The BCON generates input files that contain the block definitions and the block relationships required for generating a multiblock volume grid with the EAGLE grid generation package. The BCON also generates the block boundary conditions file which is used along with the block relationship file as input for the general multiblock Euler (GMBE) code (GMBE, volumes 1 and 3).

  7. Study Drugs and Academic Integrity: The Role of Beliefs about an Academic Honor Code in the Prediction of Nonmedical Prescription Drug Use for Academic Enhancement

    ERIC Educational Resources Information Center

    Reisinger, Kelsy B.; Rutledge, Patricia C.; Conklin, Sarah M.

    2016-01-01

    The role of beliefs about academic integrity in college students' decisions to use nonmedical prescription drugs (NMPDs) in academic settings was examined. In Spring 2012 the authors obtained survey data from 645 participants at a small, undergraduate, private liberal arts institution in the Northeastern United States. A broadcast e-mail message…

  9. Video coding with dynamic background

    NASA Astrophysics Data System (ADS)

    Paul, Manoranjan; Lin, Weisi; Lau, Chiew Tong; Lee, Bu-Sung

    2013-12-01

    Motion estimation (ME) and motion compensation (MC) using variable block size, sub-pixel search, and multiple reference frames (MRFs) are the major reasons for the improved coding performance of the H.264 video coding standard over other contemporary coding standards. The concept of MRFs is suitable for repetitive motion, uncovered background, non-integer pixel displacement, lighting change, etc. The requirement of index codes for the reference frames, computational time in ME & MC, and memory buffer for coded frames limits the number of reference frames used in practical applications. In typical video sequences, the previous frame is used as a reference frame in 68-92% of cases. In this article, we propose a new video coding method using a reference frame [i.e., the most common frame in scene (McFIS)] generated by dynamic background modeling. McFIS is more effective in terms of rate-distortion and computational time performance compared to the MRF techniques. It also has an inherent scene change detection (SCD) capability for adaptive group-of-pictures (GOP) size determination. As a result, we integrate SCD (for GOP determination) with reference frame generation. The experimental results show that the proposed coding scheme outperforms H.264 video coding with five reference frames and two relevant state-of-the-art algorithms by 0.5-2.0 dB with less computational time.
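
    The exact McFIS construction is not reproduced here; the following sketch shows a generic running-average background model of the kind used in dynamic background modeling to synthesize a stable reference frame from a stream of frames.

    import numpy as np

    def update_background(background, frame, alpha=0.05):
        """Blend each new frame into the model; a slow alpha keeps stable scene content."""
        return (1 - alpha) * background + alpha * frame

    rng = np.random.default_rng(0)
    background = np.zeros((4, 4))
    for _ in range(200):
        frame = 100 + rng.normal(0, 5, size=(4, 4))   # static scene plus noise
        frame[1, 1] += rng.choice([0, 80])            # an intermittently present object
        background = update_background(background, frame)

    print(np.round(background, 1))   # near 100 everywhere except where motion persists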

  10. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  11. Crossing Disciplinary Lines--Bar Codes and DNA Codes.

    ERIC Educational Resources Information Center

    Liao, Thomas T.

    1997-01-01

    Discusses strategies that enable students to learn ideas and concepts in the context of how modern communication technology is designed and operates. Describes a course that integrates the study of math, science, and technology into topics that are engaging to students. Presents an activity that introduces students to digital coding and compares…

  12. MELCOR computer code manuals

    SciTech Connect

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  13. OVERAERO-MPI: Parallel Overset Aeroelasticity Code

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Rizk, Yehia M.

    1999-01-01

    An overset modal structures analysis code was integrated with a parallel overset Navier-Stokes flow solver to obtain a code capable of static aeroelastic computations. The new code was used to compute the static aeroelastic deformation of an arrow-wing-body geometry and a complex, full aircraft configuration. For the simple geometry, the results were similar to the results obtained with the ENSAERO code and the PVM version of OVERAERO. The full potential of this code suite was illustrated in the complex, full aircraft computations.

  14. Integrating the intrinsic conformational preferences of non-coded α-amino acids modified at the peptide bond into the NCAD database

    PubMed Central

    Revilla-López, Guillem; Rodríguez-Ropero, Francisco; Curcó, David; Torras, Juan; Calaza, M. Isabel; Zanuy, David; Jiménez, Ana I.; Cativiela, Carlos; Nussinov, Ruth; Alemán, Carlos

    2011-01-01

    Recently, we reported a database (NCAD, Non-Coded Amino acids Database; http://recerca.upc.edu/imem/index.htm) that was built to compile information about the intrinsic conformational preferences of non-proteinogenic residues determined by quantum mechanical calculations, as well as bibliographic information about their synthesis, physical and spectroscopic characterization, the experimentally-established conformational propensities, and applications (J. Phys. Chem. B 2010, 114, 7413). The database initially contained the information available for α-tetrasubstituted α-amino acids. In this work, we extend NCAD to three families of compounds, which can be used to engineer peptides and proteins incorporating modifications at the –NHCO– peptide bond. Such families are: N-substituted α-amino acids, thio-α-amino acids, and diamines and diacids used to build retropeptides. The conformational preferences of these compounds have been analyzed and described based on the information captured in the database. In addition, we provide an example of the utility of the database and of the compounds it compiles in protein and peptide engineering. Specifically, the symmetry of a sequence engineered to stabilize the 310-helix with respect to the α-helix has been broken without perturbing significantly the secondary structure through targeted replacements using the information contained in the database. PMID:21491493

  15. The moving mesh code SHADOWFAX

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, B.; De Rijcke, S.

    2016-07-01

    We introduce the moving mesh code SHADOWFAX, which can be used to evolve a mixture of gas, subject to the laws of hydrodynamics and gravity, and any collisionless fluid only subject to gravity, such as cold dark matter or stars. The code is written in C++ and its source code is made available to the scientific community under the GNU Affero General Public Licence. We outline the algorithm and the design of our implementation, and demonstrate its validity through the results of a set of basic test problems, which are also part of the public version. We also compare SHADOWFAX with a number of other publicly available codes using different hydrodynamical integration schemes, illustrating the advantages and disadvantages of the moving mesh technique.

  16. ETR/ITER systems code

    SciTech Connect

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L.

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.

  17. Homological stabilizer codes

    SciTech Connect

    Anderson, Jonas T.

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. Highlights: • We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. • We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. • We find and classify all 2D homological stabilizer codes. • We find optimal codes among the homological stabilizer codes.
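
    To make the 4-valent toric-code case concrete, the sketch below builds the star (X-type) and plaquette (Z-type) stabilizer supports of Kitaev's toric code on a small torus and checks the even-overlap condition that guarantees all stabilizers commute; the edge-labeling convention is ours.

    import itertools

    L = 3                                      # qubits live on the 2*L*L edges of an L x L torus
    def h(x, y): return ("h", x % L, y % L)    # horizontal edge east of vertex (x, y)
    def v(x, y): return ("v", x % L, y % L)    # vertical edge north of vertex (x, y)

    def star(x, y):         # X-type stabilizer: the four edges incident to vertex (x, y)
        return {h(x, y), h(x - 1, y), v(x, y), v(x, y - 1)}

    def plaquette(x, y):    # Z-type stabilizer: the four edges bounding a face
        return {h(x, y), h(x, y + 1), v(x, y), v(x + 1, y)}

    stars = [star(x, y) for x, y in itertools.product(range(L), range(L))]
    plaqs = [plaquette(x, y) for x, y in itertools.product(range(L), range(L))]

    # X- and Z-type Pauli operators commute iff their supports share an even number of qubits.
    assert all(len(s & p) % 2 == 0 for s in stars for p in plaqs)
    print(len(stars), "star and", len(plaqs), "plaquette stabilizers all commute")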

  18. Implementation of a Variable Stepsize Variable Formula Method in the Time-Integration Part of a Code for Treatment of Long-Range Transport of Air Pollutants

    NASA Astrophysics Data System (ADS)

    Zlatev, Zahari; Berkowicz, Ruwim; Prahm, Lars P.

    1984-08-01

    A mathematical model consisting of two partial differential equations is used to study the long-range transport of sulphur dioxide and sulphate over Europe. The discretization of the first-order space derivatives (the advection terms) is carried out by a pseudospectral (Fourier) algorithm. A special technique is applied in the discretization of the second-order space derivatives (the diffusion terms). Two large systems of ordinary differential equations are solved at each time-step. It is shown that these systems can efficiently be treated by a variable stepsize variable formula method (VSVFM) based on the use of predictor-corrector schemes. The stepsize selection strategy and the formula selection strategy are discussed in detail. An attempt to carry out both an accuracy control and a stability control is made at each time-step. The great efficiency of the VSVFM implemented in our software as well as the reliability of the results are illustrated by numerical experiments, in which real meteorological data (for 1979) at the grid-points of a space domain covering the whole of Europe were used. The main ideas, implemented in the time-integration part, might be applied in many other situations, where the systems of ordinary differential equations arising after the space discretization are only moderately stiff (so that the stability requirements are dominant over the accuracy requirements on a large part of the time-interval but the use of implicit time-integration algorithms that require solving systems of algebraic equations at each time-step is not justified). As an illustration only it should be mentioned that such an application has been carried out in connection with models describing long-range transport of nitrogen pollutants over Europe.
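
    The VSVFM varies both the stepsize and the formula; the sketch below shows only the predictor-corrector core such a method builds on: a two-step Adams-Bashforth predictor with a trapezoidal (Adams-Moulton) corrector, using the predictor-corrector difference as the local error estimate that would drive the selection strategies.

    def pece(f, y, t, t_end, h):
        f_prev = f(t, y)
        y, t = y + h * f_prev, t + h                            # one Euler step primes the history
        while t < t_end - 1e-12:
            f_curr = f(t, y)
            y_pred = y + h * (1.5 * f_curr - 0.5 * f_prev)      # Adams-Bashforth 2 (predict)
            y_corr = y + 0.5 * h * (f_curr + f(t + h, y_pred))  # trapezoidal rule (correct)
            err_est = abs(y_corr - y_pred)                      # basis for step/formula selection
            t, y, f_prev = t + h, y_corr, f_curr
        return y, err_est

    y, err = pece(lambda t, y: -y, 1.0, 0.0, 5.0, 0.01)
    print(y, err)   # y ~ exp(-5); err is the last local error estimate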

  19. Model Children's Code.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  20. Coding of Neuroinfectious Diseases.

    PubMed

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  1. Diagnostic Coding for Epilepsy.

    PubMed

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  2. Phylogeny of genetic codes and punctuation codes within genetic codes.

    PubMed

    Seligmann, Hervé

    2015-03-01

    Punctuation codons (starts, stops) delimit genes and reflect translation apparatus properties. Most codon reassignments involve punctuation. Here two complementary approaches classify natural genetic codes: (A) properties of amino acids assigned to codons (classical phylogeny), coding stops as X (A1, antitermination/suppressor tRNAs insert unknown residues), or as gaps (A2, no translation, classical stop); and (B) considering only punctuation status (start, stop and other codons coded as -1, 0 and 1 (B1); 0, -1 and 1 (B2, reflects ribosomal translational dynamics); and 1, -1, and 0 (B3, starts/stops as opposites)). All methods separate most mitochondrial codes from most nuclear codes; Gracilibacteria consistently cluster with metazoan mitochondria; mitochondria co-hosted with chloroplasts cluster with nuclear codes. Method A1 clusters the euplotid nuclear code with metazoan mitochondria; A2 separates euplotids from mitochondria. The Firmicute bacteria Mycoplasma/Spiroplasma and protozoan (and lower metazoan) mitochondria share codon-amino acid assignments. A1 clusters them with mitochondria; under A2 they cluster with the standard genetic code: constraints on amino acid ambiguity versus punctuation-signaling produced the mitochondrial versus bacterial versions of this genetic code. Punctuation analysis B2 converges best with classical phylogenetic analyses, stressing the need for a unified theory of genetic code punctuation accounting for ribosomal constraints.
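
    A minimal sketch of coding scheme B1 as defined above: each of the 64 codons is scored -1 (start), 0 (stop), or 1 (other), giving one vector per genetic code that can then be compared across codes. The start/stop sets below are simplified (alternative start codons vary between organisms), so the distance is illustrative only.

    from itertools import product

    CODONS = ["".join(c) for c in product("TCAG", repeat=3)]   # all 64 codons

    def b1_vector(starts, stops):
        """Scheme B1: -1 for a start codon, 0 for a stop codon, 1 otherwise."""
        return [-1 if c in starts else 0 if c in stops else 1 for c in CODONS]

    standard = b1_vector({"ATG"}, {"TAA", "TAG", "TGA"})
    vertebrate_mito = b1_vector({"ATG", "ATA"}, {"TAA", "TAG", "AGA", "AGG"})

    differing = sum(a != b for a, b in zip(standard, vertebrate_mito))
    print("codons with different punctuation status:", differing)   # ATA, TGA, AGA, AGG -> 4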

  3. To Code or Not To Code?

    ERIC Educational Resources Information Center

    Parkinson, Brian; Sandhu, Parveen; Lacorte, Manel; Gourlay, Lesley

    1998-01-01

    This article considers arguments for and against the use of coding systems in classroom-based language research and touches on some relevant considerations from ethnographic and conversational analysis approaches. The four authors each explain and elaborate on their practical decision to code or not to code events or utterances at a specific point…

  4. PANEL CODE FOR PLANAR CASCADES

    NASA Technical Reports Server (NTRS)

    Mcfarland, E. R.

    1994-01-01

    The Panel Code for Planar Cascades was developed as an aid for the designer of turbomachinery blade rows. The effective design of turbomachinery blade rows relies on the use of computer codes to model the flow on blade-to-blade surfaces. Most of the currently used codes model the flow as inviscid, irrotational, and compressible with solutions being obtained by finite difference or finite element numerical techniques. While these codes can yield very accurate solutions, they usually require an experienced user to manipulate input data and control parameters. Also, they often limit a designer in the types of blade geometries, cascade configurations, and flow conditions that can be considered. The Panel Code for Planar Cascades accelerates the design process and gives the designer more freedom in developing blade shapes by offering a simple blade-to-blade flow code. Panel, or integral equation, solution techniques have been used for several years by external aerodynamicists who have developed and refined them into a primary design tool of the aircraft industry. The Panel Code for Planar Cascades adapts these same techniques to provide a versatile, stable, and efficient calculation scheme for internal flow. The code calculates the compressible, inviscid, irrotational flow through a planar cascade of arbitrary blade shapes. Since the panel solution technique is for incompressible flow, a compressibility correction is introduced to account for compressible flow effects. The analysis is limited to flow conditions in the subsonic and shock-free transonic range. Input to the code consists of inlet flow conditions, blade geometry data, and simple control parameters. Output includes flow parameters at selected control points. This program is written in FORTRAN IV for batch execution and has been implemented on an IBM 370 series computer with a central memory requirement of approximately 590K of 8 bit bytes. This program was developed in 1982.
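
    The abstract notes that a compressibility correction is applied to the incompressible panel solution; which rule this code uses is not stated, so the sketch below shows the classical Karman-Tsien correction as one standard example of such a rule.

    import math

    def karman_tsien(cp0: float, mach: float) -> float:
        """Correct an incompressible pressure coefficient for subsonic Mach number."""
        beta = math.sqrt(1.0 - mach ** 2)        # valid for M < 1
        return cp0 / (beta + (mach ** 2 / (1.0 + beta)) * cp0 / 2.0)

    print(karman_tsien(-0.5, 0.6))   # -0.667: the suction peak strengthens with Mach number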

  5. GalPot: Galaxy potential code

    NASA Astrophysics Data System (ADS)

    McMillan, Paul J.

    2016-11-01

    GalPot finds the gravitational potential associated with axisymmetric density profiles. The package includes code that performs transformations between commonly used coordinate systems for both positions and velocities (the class OmniCoords), and that integrates orbits in the potentials. GalPot is a stand-alone version of Walter Dehnen's GalaxyPotential C++ code taken from the falcON code in the NEMO Stellar Dynamics Toolbox (ascl:1010.051).
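
    GalPot itself is C++ and its API is not reproduced here; as a loose Python analogue, the sketch below integrates an orbit in a toy axisymmetric Miyamoto-Nagai disc potential (units G = M = 1), illustrating the kind of orbit integration the package performs.

    import numpy as np
    from scipy.integrate import solve_ivp

    A, B = 1.0, 0.3   # disc scale lengths (arbitrary units)

    def acceleration(pos):
        x, y, z = pos
        s = np.sqrt(z ** 2 + B ** 2)
        d = (x ** 2 + y ** 2 + (A + s) ** 2) ** 1.5
        return -np.array([x, y, z * (A + s) / s]) / d

    def rhs(t, w):                        # w = (position, velocity)
        return np.concatenate([w[3:], acceleration(w[:3])])

    w0 = [2.0, 0.0, 0.1, 0.0, 0.5, 0.0]
    sol = solve_ivp(rhs, (0.0, 100.0), w0, rtol=1e-9, atol=1e-9)
    print("final position:", np.round(sol.y[:3, -1], 3))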

  6. Molecular phylogeny of 21 tropical bamboo species reconstructed by integrating non-coding internal transcribed spacer (ITS1 and 2) sequences and their consensus secondary structure.

    PubMed

    Ghosh, Jayadri Sekhar; Bhattacharya, Samik; Pal, Amita

    2017-06-01

    The unavailability of reproductive structures and the unpredictability of vegetative characters for the identification and phylogenetic study of bamboo prompted the application of molecular techniques for greater resolution and consensus. We first employed internal transcribed spacer (ITS1, 5.8S rRNA and ITS2) sequences to construct the phylogenetic tree of 21 tropical bamboo species. While the sequence alone could grossly reconstruct the traditional phylogeny amongst the 21 tropical species studied, some anomalies were encountered that prompted a further refinement of the phylogenetic analyses. Therefore, we integrated the secondary structure of the ITS sequences to derive an individual sequence-structure matrix to gain more resolution in the phylogenetic reconstruction. The results showed that ITS sequence-structure is a reliable alternative to the conventional phenotypic method for the identification of bamboo species. The best-fit topology obtained by the sequence-structure-based phylogeny over the sequence-only one underscores closer clustering of all the studied Bambusa species (sub-tribe Bambusinae), while Melocanna baccifera, which belongs to sub-tribe Melocanneae, clusters disjointly as an out-group within the consensus phylogenetic tree. In this study, we demonstrated the dependability of the combined (ITS sequence+structure-based) approach over sequence-only analysis for phylogenetic relationship assessment of bamboo.

  7. Bar Code Reader

    NASA Astrophysics Data System (ADS)

    Clair, Jean J.

    1980-05-01

    The bar code system will be used in every market and supermarket. The code, which is standardized in the US and Europe (the EAN code), gives information on price, storage and the nature of goods, and allows real-time management of the shop.
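
    Since the record refers to the EAN code, the sketch below shows the standard EAN-13 check-digit rule: weight the first twelve digits alternately by 1 and 3, and choose the final digit that brings the weighted sum to a multiple of 10.

    def ean13_check_digit(first12: str) -> int:
        """Weight digits alternately 1, 3, 1, 3, ... from the left; return the balancing digit."""
        total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(first12))
        return (10 - total % 10) % 10

    def is_valid_ean13(code: str) -> bool:
        return len(code) == 13 and ean13_check_digit(code[:12]) == int(code[12])

    print(ean13_check_digit("400638133393"))   # 1
    print(is_valid_ean13("4006381333931"))     # True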

  8. Compact, Flexible Telemetry-Coding Circuits

    NASA Technical Reports Server (NTRS)

    Katz, Richard B.; Tooley, Matthew; Settles, Beverly

    1993-01-01

    Circuits encoding binary telemetry data designed to synthesize any number of selectable codes. Designed for use aboard spacecraft, with features also making them attractive for terrestrial applications: Simple and compact relative to prior coding circuits, built with commercial integrated circuits, and incorporate protective redundancy. Output distortions minimized, and spurious attenuated and/or abbreviated output pulses eliminated.

  9. CFD Code Development for Combustor Flows

    NASA Technical Reports Server (NTRS)

    Norris, Andrew

    2003-01-01

    During the lifetime of this grant, work has been performed in the areas of model development, code development, code validation and code application. For model development, this has included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics and assumed PDF work. Many of these models were then implemented in the code, and in addition many improvements were made to the code, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models and turbulence-chemistry interaction methodology. Validation of all new models and code improvements were also performed, while application of the code to the ZCET program and also the NPSS GEW combustor program were also performed. Several important items remain under development, including the NOx post processing, assumed PDF model development and chemical kinetic development. It is expected that this work will continue under the new grant.

  10. Production code control system for hydrodynamics simulations

    SciTech Connect

    Slone, D.M.

    1997-08-18

    We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent, applications into the PCCS framework without the need to modify any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, SMP, server, or desktop workstation. We will also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to essentially plug it in and make it usable to the hydrocode. PCCS has made it easier to link together disparate codes, since using Perl has removed the need to learn the idiosyncrasies of system or RPC programming. The text handling in Perl makes it easy to teach PCCS about new codes, or changes to existing codes.

  11. Generalized concatenated quantum codes

    SciTech Connect

    Grassl, Markus; Shor, Peter; Smith, Graeme; Smolin, John; Zeng Bei

    2009-05-15

    We discuss the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length but also asymptotically meet the quantum Hamming bound for large block length.

  12. Rotating-Pump Design Code

    NASA Technical Reports Server (NTRS)

    Walker, James F.; Chen, Shu-Cheng; Scheer, Dean D.

    2006-01-01

    Pump Design (PUMPDES) is a computer program for designing a rotating pump for liquid hydrogen, liquid oxygen, liquid nitrogen, water, methane, or ethane. Using realistic properties of these fluids provided by another program called GASPAK, this code performs a station-by-station, mean-line analysis along the pump flow path, obtaining thermodynamic properties of the pumped fluid at each station and evaluating hydraulic losses along the flow path. The variables at each station are obtained under constraints that are consistent with the underlying physical principles. The code evaluates the performance of each stage and the overall pump. In addition, by judiciously choosing the givens and the unknowns, the code can perform a geometric inverse design function: that is, it can compute a pump geometry that yields the closest approximation of a given design point. The code contains two major parts: one for an axial-rotor/inducer and one for a multistage centrifugal pump. The inducer and the centrifugal pump are functionally integrated. The code can be used in designing and/or evaluating the inducer/centrifugal-pump combination or the centrifugal pump alone. The code is written in standard Fortran 77.

  13. Spaceflight Validation of Hzetrn Code

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Shinn, J. L.; Singleterry, R. C.; Badavi, F. F.; Badhwar, G. D.; Reitz, G.; Beaujean, R.; Cucinotta, F. A.

    1999-01-01

    HZETRN is being developed as a fast deterministic radiation transport code applicable to neutrons, protons, and multiply charged ions in the space environment. It was recently applied to 50 hours of IMP8 data measured during the August 4, 1972 solar event to map the hourly exposures within the human body under several shield configurations. This calculation required only 18 hours on a VAX 4000 machine. A similar calculation using the Monte Carlo method would have required two years of dedicated computer time. The code has been benchmarked against well-documented and tested Monte Carlo proton transport codes with good success. The code will allow important trade studies to be made with relative ease due to its computational speed and will be useful in assessing design alternatives in an integrated system software environment. Since there are no well-tested Monte Carlo codes for HZE particles, we have been engaged in flight validation of the HZETRN results. To date we have made comparisons with TEPC, CR-39, charged-particle telescopes, and Bonner spheres. This broad range of detectors allows us to test a number of functions related to the differing physical processes which add to the complicated radiation fields within a spacecraft or the human body, functions that can be calculated by the HZETRN code system. In the present report we will review these results.

  14. Electromagnetic particle simulation codes

    NASA Technical Reports Server (NTRS)

    Pritchett, P. L.

    1985-01-01

    Electromagnetic particle simulations solve the full set of Maxwell's equations. They thus include the effects of self-consistent electric and magnetic fields, magnetic induction, and electromagnetic radiation. The algorithms for an electromagnetic code which works directly with the electric and magnetic fields are described. The fields and current are separated into transverse and longitudinal components. The transverse E and B fields are integrated in time using a leapfrog scheme applied to the Fourier components. The particle pushing is performed via the relativistic Lorentz force equation for the particle momentum. As an example, simulation results are presented for the electron cyclotron maser instability which illustrate the importance of relativistic effects on the wave-particle resonance condition and on wave dispersion.
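
    A standard way to push particles via the relativistic Lorentz force equation for the momentum is the Boris scheme; the sketch below implements it in units q = m = c = 1. The paper's code is not necessarily this exact pusher.

    import numpy as np

    def boris_push(x, u, E, B, dt):
        """One step for the normalized momentum u = gamma * v."""
        u_minus = u + 0.5 * dt * E                   # first half electric kick
        gamma = np.sqrt(1.0 + u_minus @ u_minus)
        t = 0.5 * dt * B / gamma                     # magnetic rotation vector
        s = 2.0 * t / (1.0 + t @ t)
        u_prime = u_minus + np.cross(u_minus, t)
        u_plus = u_minus + np.cross(u_prime, s)      # rotation preserves |u|
        u_new = u_plus + 0.5 * dt * E                # second half electric kick
        return x + dt * u_new / np.sqrt(1.0 + u_new @ u_new), u_new

    x, u = np.zeros(3), np.array([1.0, 0.0, 0.0])    # gamma ~ 1.41
    E, B = np.zeros(3), np.array([0.0, 0.0, 1.0])    # pure magnetic field
    for _ in range(1000):
        x, u = boris_push(x, u, E, B, 0.01)
    print(np.linalg.norm(u))   # |u| is conserved in a pure magnetic field (~1.0)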

  15. TTS Mapping: integrative WEB tool for analysis of triplex formation target DNA Sequences, G-quadruplets and non-protein coding regulatory DNA elements in the human genome

    PubMed Central

    2009-01-01

    IGF2 gene and bound double-strand nucleic acid TTSs forming natural triplex structures. Conclusion: TTS mapping provides comprehensive visual and analytical tools to help users find pTTSs, G-quadruplets and other regulatory DNA elements in various genome regions. TTS Mapping not only provides sequence visualization and statistical information, but also integrates knowledge about co-localization of TTSs with various DNA elements and facilitates data analysis. In particular, TTS Mapping reveals a complex structural-functional regulatory module of the gene IGF2, including a TF MZF1 binding site and the ncRNA precursor mir-483, formed by a highly complementary and evolutionarily conserved polypurine- and polypyrimidine-rich DNA pair. Such ncRNAs are capable of forming helical triplex structures with the polypurine strand of nucleic acid duplexes (DNA or RNA) via Hoogsteen or reverse Hoogsteen hydrogen bonds. Our web tool could be used to discover biologically meaningful genome modules and to optimize the experimental design of anti-gene treatment. PMID:19958507

  16. TTS mapping: integrative WEB tool for analysis of triplex formation target DNA sequences, G-quadruplets and non-protein coding regulatory DNA elements in the human genome.

    PubMed

    Jenjaroenpun, Piroon; Kuznetsov, Vladimir A

    2009-12-03

    double-strand nucleic acid TTSs forming natural triplex structures. TTS mapping provides comprehensive visual and analytical tools to help users find pTTSs, G-quadruplets and other regulatory DNA elements in various genome regions. TTS Mapping not only provides sequence visualization and statistical information, but also integrates knowledge about co-localization of TTSs with various DNA elements and facilitates data analysis. In particular, TTS Mapping reveals a complex structural-functional regulatory module of the gene IGF2, including a TF MZF1 binding site and the ncRNA precursor mir-483, formed by a highly complementary and evolutionarily conserved polypurine- and polypyrimidine-rich DNA pair. Such ncRNAs are capable of forming helical triplex structures with the polypurine strand of nucleic acid duplexes (DNA or RNA) via Hoogsteen or reverse Hoogsteen hydrogen bonds. Our web tool could be used to discover biologically meaningful genome modules and to optimize the experimental design of anti-gene treatment.

  17. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate' (ARA) codes. This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, so belief propagation can be used for iterative decoding of ARA codes on a graph. The encoder for this class can be viewed as a precoded Repeat Accumulate (RA) code or as a precoded Irregular Repeat Accumulate (IRA) code, where an accumulator is simply chosen as the precoder. Thus ARA codes have a simple and very fast encoder structure when represented as LDPC codes. Based on density evolution for LDPC codes, we show through some examples of ARA codes that for a maximum variable node degree of 5, a minimum bit SNR as low as 0.08 dB from channel capacity for rate 1/2 can be achieved as the block size goes to infinity. Thus, for a fixed low maximum variable node degree, its threshold outperforms not only the RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators, any desired high-rate code close to rate 1 can be obtained with thresholds that stay uniformly close to the channel capacity thresholds. Iterative decoding simulation results are provided. The ARA codes also have a projected graph or protograph representation that allows for high-speed decoder implementation.
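
    The encoder structure described above (an accumulator as precoder, followed by repetition, interleaving, and a second accumulator) is simple enough to sketch directly. The following toy encoder is a sketch under stated assumptions — a random interleaver, a fixed repetition factor, and no puncturing of the accumulators — and is not the authors' rate-optimized construction.

        import numpy as np

        rng = np.random.default_rng(0)

        def accumulate(bits):
            # Running XOR, i.e., a rate-1 1/(1+D) accumulator over GF(2).
            return np.cumsum(bits) % 2

        def ara_encode(info_bits, repeat=3):
            pre = accumulate(info_bits)           # accumulator as the precoder
            rep = np.repeat(pre, repeat)          # repetition stage
            perm = rng.permutation(rep.size)      # random interleaver (assumed)
            return accumulate(rep[perm])          # final accumulator

        parity = ara_encode(np.array([1, 0, 1, 1, 0]))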

  18. Coset Codes Viewed as Terminated Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1996-01-01

    In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C(sub 1), of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C(sub 1), while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

  19. Concatenated Coding Using Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Thompson, Michael W.

    1997-01-01

    In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage, thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted toward developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and RS coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, we see that TCM-based concatenated coding results in roughly 10-50% bandwidth expansion, compared to 70-150% expansion for similar concatenated schemes that use convolutional codes. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.

  20. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  2. Bar Codes for Libraries.

    ERIC Educational Resources Information Center

    Rahn, Erwin

    1984-01-01

    Discusses the evolution of standards for bar codes (series of printed lines and spaces that represent numbers, symbols, and/or letters of alphabet) and describes the two types most frequently adopted by libraries--Code-A-Bar and CODE 39. Format of the codes is illustrated. Six references and definitions of terminology are appended. (EJS)

  3. Manually operated coded switch

    DOEpatents

    Barnette, Jon H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  4. FLOWTRAN-TF code benchmarking

    SciTech Connect

    Flach, G.P.

    1990-12-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss Of Coolant Accident (LOCA). A description of the code is given by Flach et al. (1990). This report provides benchmarking results for the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit (Smith et al., 1990a; 1990b). Individual constitutive relations are benchmarked in Sections 2 through 5 while in Sections 6 and 7 integral code benchmarking results are presented. An overall assessment of FLOWTRAN-TF for its intended use in computing the ECS power limit completes the document.

  5. Interval coding. II. Dendrite-dependent mechanisms.

    PubMed

    Doiron, Brent; Oswald, Anne-Marie M; Maler, Leonard

    2007-04-01

    The rich temporal structure of neural spike trains provides multiple dimensions to code dynamic stimuli. Popular examples are spike trains from sensory cells where bursts and isolated spikes can serve distinct coding roles. In contrast to analyses of neural coding, the cellular mechanics of burst mechanisms are typically elucidated from the neural response to static input. Bridging the mechanics of bursting with coding of dynamic stimuli is an important step in establishing theories of neural coding. Electrosensory lateral line lobe (ELL) pyramidal neurons respond to static inputs with a complex dendrite-dependent burst mechanism. Here we show that in response to dynamic broadband stimuli, these bursts lack some of the electrophysiological characteristics observed in response to static inputs. A simple leaky integrate-and-fire (LIF)-style model with a dendrite-dependent depolarizing afterpotential (DAP) is sufficient to match both the output statistics and coding performance of experimental spike trains. We use this model to investigate a simplification of interval coding where the burst interspike interval (ISI) codes for the scale of a canonical upstroke rather than a multidimensional stimulus feature. Using this stimulus reduction, we compute a quantization of the burst ISIs and the upstroke scale to show that the mutual information rate of the interval code is maximized at a moderate DAP amplitude. The combination of a reduced description of ELL pyramidal cell bursting and a simplification of the interval code increases the generality of ELL burst codes to other sensory modalities.
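
    The model class named in the abstract (a leaky integrate-and-fire neuron augmented with a dendrite-dependent depolarizing afterpotential) can be sketched in a few lines. All parameter values, units, and the rectangular DAP pulse shape below are illustrative assumptions, not the fitted model from the paper.

        import numpy as np

        def lif_dap(I, dt=0.1, tau=10.0, v_th=1.0,
                    dap_amp=0.5, dap_delay=10, dap_width=20):
            """LIF voltage trace driven by input current array I; each spike
            feeds back a delayed depolarizing pulse (the DAP), which favors
            the short burst interspike intervals discussed above."""
            v = 0.0
            dap = np.zeros(I.size + dap_delay + dap_width)
            spikes = []
            for t, i_t in enumerate(I):
                v += dt * (-v / tau + i_t + dap[t])
                if v >= v_th:                          # threshold crossing
                    spikes.append(t)
                    v = 0.0                            # reset
                    dap[t + dap_delay:t + dap_delay + dap_width] += dap_amp
            return spikes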

  6. A draft model aggregated code of ethics for bioethicists.

    PubMed

    Baker, Robert

    2005-01-01

    Bioethicists function in an environment in which their peers--healthcare executives, lawyers, nurses, physicians--assert the integrity of their fields through codes of professional ethics. Is it time for bioethics to assert its integrity by developing a code of ethics? Answering in the affirmative, this paper lays out a case by reviewing the historical nature and function of professional codes of ethics. Arguing that professional codes are aggregative enterprises growing in response to a field's historical experiences, it asserts that bioethics now needs to assert its integrity and independence and has already developed a body of formal statements that could be aggregated to create a comprehensive code of ethics for bioethics. A Draft Model Aggregated Code of Ethics for Bioethicists is offered in the hope that analysis and criticism of this draft code will promote further discussion of the nature and content of a code of ethics for bioethicists.

  7. Code Seal v. 2.0

    SciTech Connect

    Chavez, Adrian; Solis, John Hector

    2015-04-19

    CodeSeal is a Sandia National Laboratories developed technology that provides a means of securely obfuscating finite state machines and ARM & x86 instruction sets in a mathematically provable way. The technology was developed in order to provide a solution for anti-reverse engineering, assured execution, and integrity of execution. CodeSeal accomplishes these goals by adding the concept of a trust anchor, a small piece of trust integrated into the system, to the model of code obfuscation. Code obfuscation is an active area of academic research, but most findings have merely demonstrated that general obfuscation is impossible. By modifying the security model such that we may rely on the presence of a small, tamper-protected device, however, Sandia has developed an effective method for obfuscating code.

  8. QR Codes 101

    ERIC Educational Resources Information Center

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  9. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA codes). Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.

  10. Predictive coding of multisensory timing

    PubMed Central

    Shi, Zhuanghua; Burr, David

    2016-01-01

    The sense of time is foundational for perception and action, yet it frequently departs significantly from physical time. In the paper we review recent progress on temporal contextual effects, multisensory temporal integration, temporal recalibration, and related computational models. We suggest that subjective time arises from minimizing prediction errors and adaptive recalibration, which can be unified in the framework of predictive coding, a framework rooted in Helmholtz’s ‘perception as inference’. PMID:27695705

  11. Driver Code for Adaptive Optics

    NASA Technical Reports Server (NTRS)

    Rao, Shanti

    2007-01-01

    A special-purpose computer code for a deformable-mirror adaptive-optics control system transmits pixel-registered control from (1) a personal computer running software that generates the control data to (2) a circuit board with 128 digital-to-analog converters (DACs) that generate voltages to drive the deformable-mirror actuators. This program reads control-voltage codes from a text file, then sends them, via the computer's parallel port, to a circuit board with four AD5535 (or equivalent) chips. Whereas a similar prior computer program was capable of transmitting data to only one chip at a time, this program can send data to four chips simultaneously. This program is in the form of C-language code that can be compiled and linked into an adaptive-optics software system. The program as supplied includes source code for integration into the adaptive-optics software, documentation, and a component that provides a demonstration of loading DAC codes from a text file. On a standard Windows desktop computer, the software can update 128 channels in 10 ms. On Real-Time Linux with a digital I/O card, the software can update 1024 channels (8 boards in parallel) every 8 ms.
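
    The program's input path (control-voltage codes read from a text file, then distributed across four DAC chips) is easy to mimic. The sketch below only parses and groups the codes; the whitespace-separated file format and the 14-bit range check are assumptions of this sketch, and the actual parallel-port protocol is not reproduced.

        def load_dac_codes(path, channels=128, chips=4):
            """Read one integer code per actuator and group them per chip."""
            codes = [int(tok) for tok in open(path).read().split()]
            assert len(codes) == channels, "expected one code per actuator"
            assert all(0 <= c < 1 << 14 for c in codes), "outside assumed 14-bit range"
            per_chip = channels // chips
            return [codes[i * per_chip:(i + 1) * per_chip] for i in range(chips)]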

  12. Estimation of 1945 to 1957 food consumption. Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Anderson, D.M.; Bates, D.J.; Marsh, T.L.

    1993-07-01

    This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945--1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.

  13. Estimation of 1945 to 1957 food consumption

    SciTech Connect

    Anderson, D.M.; Bates, D.J.; Marsh, T.L.

    1993-03-01

    This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945--1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.

  14. Estimation of 1945 to 1957 food consumption. Hanford Environmental Dose Reconstruction Project: Draft

    SciTech Connect

    Anderson, D.M.; Bates, D.J.; Marsh, T.L.

    1993-03-01

    This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945--1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.

  15. Uncertainty and Sensitivity Analyses Plan. Draft for Peer Review: Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  16. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  17. Efficient entropy coding for scalable video coding

    NASA Astrophysics Data System (ADS)

    Choi, Woong Il; Yang, Jungyoup; Jeon, Byeungwoo

    2005-10-01

    The standardization for the scalable extension of H.264 has called for additional functionality based on the H.264 standard to support combined spatio-temporal and SNR scalability. For the entropy coding of the H.264 scalable extension, the Context-based Adaptive Binary Arithmetic Coding (CABAC) scheme has been considered so far. In this paper, we present a new context modeling scheme that uses the inter-layer correlation between syntax elements. As a result, it improves the coding efficiency of entropy coding in the H.264 scalable extension. Simulation results of applying the proposed scheme to encoding the syntax element mb_type show that the improvement in coding efficiency is up to 16% in terms of bit saving, due to the estimation of a more adequate probability model.

  18. Integrated Coding and Waveform Design Study.

    DTIC Science & Technology

    1980-08-01

    [Only OCR fragments of this record's abstract survive; they appear to describe decoded bit-error-rate performance figures comparing an uncoded system, a binary inner decoder, and a channel measurement inner decoder, each paired with convolutional outer decoders at rates R = 0.5 and R = 0.25.]

  19. Speech coding research at Bell Laboratories

    NASA Astrophysics Data System (ADS)

    Atal, Bishnu S.

    2004-05-01

    The field of speech coding is now over 70 years old. It started from the desire to transmit voice signals over telegraph cables. The availability of digital computers in the mid 1960s made it possible to test complex speech coding algorithms rapidly. The introduction of linear predictive coding (LPC) started a new era in speech coding. The fundamental philosophy of speech coding went through a major shift, resulting in a new generation of low bit rate speech coders, such as multi-pulse and code-excited LPC. The semiconductor revolution produced faster and faster DSP chips and made linear predictive coding practical. Code-excited LPC has become the method of choice for low bit rate speech coding applications and is used in most voice transmission standards for cell phones. Digital speech communication is rapidly evolving from circuit-switched to packet-switched networks to provide integrated transmission of voice, data, and video signals. The new communication environment is also moving the focus of speech coding research from compression to low cost, reliable, and secure transmission of voice signals on digital networks, and provides the motivation for creating a new class of speech coders suitable for future applications.
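
    The LPC analysis at the heart of the coders described above reduces, in its textbook form, to solving the autocorrelation normal equations with the Levinson-Durbin recursion. Below is a minimal sketch of that step only; framing, windowing, quantization, and the code-excited search are all omitted, and the function assumes frame.size > order.

        import numpy as np

        def lpc(frame, order=10):
            """Predictor coefficients a[1..order] such that
            s[n] ~ sum_j a[j] * s[n-j], via Levinson-Durbin."""
            r = np.correlate(frame, frame, mode="full")[frame.size - 1:]
            a = np.zeros(order + 1)            # a[0] is unused
            err = r[0]
            for i in range(1, order + 1):
                k = (r[i] - np.dot(a[1:i], r[i - 1:0:-1])) / err  # reflection coeff.
                a_prev = a.copy()
                a[i] = k
                a[1:i] = a_prev[1:i] - k * a_prev[i - 1:0:-1]
                err *= 1.0 - k * k             # residual prediction error
            return a[1:], err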

  20. Roanoke College Student Conduct Code 1990-91.

    ERIC Educational Resources Information Center

    Roanoke Coll., VA.

    This Roanoke College (Virginia) 1990-91 conduct code manual is intended for distribution to students. A reproduction of the Academic Integrity and Student Conduct Code Form which all students must sign leads off the document. A section detailing the student conduct code explains the delegation of authority within the institution and describes the…

  1. Fast transform decoding of nonsystematic Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Truong, T. K.; Cheung, K.-M.; Reed, I. S.; Shiozaki, A.

    1989-01-01

    A Reed-Solomon (RS) code is considered to be a special case of a redundant residue polynomial (RRP) code, and a fast transform decoding algorithm to correct both errors and erasures is presented. This decoding scheme is an improvement of the decoding algorithm for the RRP code suggested by Shiozaki and Nishida, and can be realized readily on very large scale integration chips.

  2. What is Code Biology?

    PubMed

    Barbieri, Marcello

    2017-10-06

    Various independent discoveries have shown that many organic codes exist in living systems, and this implies that they came into being during the history of life and contributed to that history. The genetic code appeared in a population of primitive systems that has been referred to as the common ancestor, and it has been proposed that three distinct signal processing codes gave origin to the three primary kingdoms of Archaea, Bacteria and Eukarya. After the genetic code and the signal processing codes, on the other hand, only the ancestors of the eukaryotes continued to explore the coding space and gave origin to splicing codes, histone code, tubulin code, compartment codes and many others. A first theoretical consequence of this historical fact is the idea that the Eukarya became increasingly more complex because they maintained the potential to bring new organic codes into existence. A second theoretical consequence comes from the fact that the evolution of the individual rules of a code can take an extremely long time, but the origin of a new organic code corresponds to the appearance of a complete set of rules and from a geological point of view this amounts to a sudden event. The great discontinuities of the history of life, in other words, can be explained as the result of the appearance of new codes. A third theoretical consequence comes from the fact that the organic codes have been highly conserved in evolution, which shows that they are the great invariants of life, the sole entities that have gone intact through billions of years while everything else has changed. This tells us that the organic codes are fundamental components of life and their study - the new research field of Code Biology - is destined to become an increasingly relevant part of the life sciences.

  3. DIANE multiparticle transport code

    NASA Astrophysics Data System (ADS)

    Caillaud, M.; Lemaire, S.; Ménard, S.; Rathouit, P.; Ribes, J. C.; Riz, D.

    2014-06-01

    DIANE is the general Monte Carlo code developed at CEA-DAM. DIANE is a 3D multiparticle multigroup code. DIANE includes automated biasing techniques and is optimized for massive parallel calculations.

  4. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  5. Honesty and Honor Codes.

    ERIC Educational Resources Information Center

    McCabe, Donald; Trevino, Linda Klebe

    2002-01-01

    Explores the rise in student cheating and evidence that students cheat less often at schools with an honor code. Discusses effective use of such codes and creation of a peer culture that condemns dishonesty. (EV)

  6. Cellulases and coding sequences

    SciTech Connect

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-01-01

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  7. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  9. Practices in Code Discoverability

    NASA Astrophysics Data System (ADS)

    Teuben, P.; Allen, A.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Much of scientific progress now hinges on the reliability, falsifiability and reproducibility of computer source codes. Astrophysics in particular is a discipline that today leads other sciences in making useful scientific components freely available online, including data, abstracts, preprints, and fully published papers, yet even today many astrophysics source codes remain hidden from public view. We review the importance and history of source codes in astrophysics and previous efforts to develop ways in which information about astrophysics codes can be shared. We also discuss why some scientist coders resist sharing or publishing their codes, the reasons for and importance of overcoming this resistance, and alert the community to a reworking of one of the first attempts for sharing codes, the Astrophysics Source Code Library (ASCL). We discuss the implementation of the ASCL in an accompanying poster paper. We suggest that code could be given a similar level of referencing as data gets in repositories such as ADS.

  10. The optimal code searching method with an improved criterion of coded exposure for remote sensing image restoration

    NASA Astrophysics Data System (ADS)

    He, Lirong; Cui, Guangmang; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting

    2015-03-01

    Coded exposure photography makes motion de-blurring a well-posed problem. In coded exposure, the integration pattern of light is modulated by opening and closing the shutter within the exposure time, changing the traditional shutter's frequency spectrum into a wider frequency band in order to preserve more image information in the frequency domain. The method used to search for the optimal code is significant for coded exposure. In this paper, an improved criterion for the optimal code search is proposed by analyzing the relationship between the code length and the number of ones in the code, and by considering the effect of noise on code selection with an affine noise model. The optimal code is then obtained using a genetic search algorithm based on the proposed selection criterion. Experimental results show that the time required to search for the optimal code decreases with the presented method, and that the restored image has better subjective quality and superior objective evaluation values.
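
    The selection criterion lends itself to a compact illustration: a good shutter code keeps the minimum magnitude of its discrete Fourier transform large, so no spatial frequency is irretrievably lost. The sketch below scores candidate codes that way but uses plain random search rather than the paper's genetic algorithm, and it omits the noise-aware terms of the improved criterion; the code length and ones count are illustrative.

        import numpy as np

        rng = np.random.default_rng(1)

        def spectral_min(code):
            # Larger minimum |DFT| => better-conditioned de-blurring.
            return np.abs(np.fft.fft(code)).min()

        def search_code(length=31, ones=16, iters=5000):
            best, best_score = None, -1.0
            for _ in range(iters):
                code = np.zeros(length)
                code[rng.choice(length, ones, replace=False)] = 1.0
                score = spectral_min(code)
                if score > best_score:
                    best, best_score = code, score
            return best, best_score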

  11. STEEP32 computer code

    NASA Technical Reports Server (NTRS)

    Goerke, W. S.

    1972-01-01

    A manual is presented as an aid in using the STEEP32 code. The code is the EXEC 8 version of the STEEP code (STEEP is an acronym for shock two-dimensional Eulerian elastic plastic). The major steps in a STEEP32 run are illustrated in a sample problem. There is a detailed discussion of the internal organization of the code, including a description of each subroutine.

  12. Universal Noiseless Coding Subroutines

    NASA Technical Reports Server (NTRS)

    Schlutsmeyer, A. P.; Rice, R. F.

    1986-01-01

    Software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. The purpose of this type of coding is to achieve data compression in the sense that the coded data represent the original data perfectly (noiselessly) while taking fewer bits to do so. The routines are universal because they apply to virtually any "real-world" data source.
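
    These routines are based on the Rice algorithm, whose core primitive encodes a nonnegative integer as a unary-coded quotient plus k binary remainder bits. The sketch below shows that primitive with a fixed k; the real subroutines adaptively select k per block and operate on FORTRAN data structures, neither of which is reproduced here.

        def rice_encode(n, k):
            """Unary quotient (n >> k), a terminating 0, then the k
            low-order bits of the remainder."""
            bits = "1" * (n >> k) + "0"
            if k:
                bits += format(n & ((1 << k) - 1), "0%db" % k)
            return bits

        def rice_decode(bits, k):
            q = bits.index("0")                        # length of the unary part
            r = int(bits[q + 1:q + 1 + k], 2) if k else 0
            return (q << k) | r

        assert rice_decode(rice_encode(9, 2), 2) == 9  # '11001' round-trips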

  14. Morse Code Activity Packet.

    ERIC Educational Resources Information Center

    Clinton, Janeen S.

    This activity packet offers simple directions for setting up a Morse Code system appropriate to interfacing with any of several personal computer systems. Worksheets are also included to facilitate teaching Morse Code to persons with visual or other disabilities including blindness, as it is argued that the code is best learned auditorily. (PB)
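
    A minimal encoder makes the packet's auditory emphasis concrete: dots and dashes map naturally onto short and long tones with standard relative timing (a dash is three dot-lengths, letters are separated by three, words by seven). The table below is deliberately truncated; only the mapping pattern matters here.

        MORSE = {"E": ".", "T": "-", "A": ".-", "N": "-.",
                 "S": "...", "O": "---"}              # table truncated for brevity

        def to_morse(text):
            """Render text as Morse symbols, with '/' separating words."""
            words = text.upper().split()
            return " / ".join(
                " ".join(MORSE[ch] for ch in word if ch in MORSE)
                for word in words)

        print(to_morse("SOS"))   # ... --- ...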

  15. EMF wire code research

    SciTech Connect

    Jones, T.

    1993-11-01

    This paper examines the results of previous wire code research to determine the relationship among wire codes, electromagnetic fields, and childhood cancer. The paper suggests that, in the original Savitz study, the selection procedure created biases toward producing a false positive association between high wire codes and childhood cancer.

  16. Mapping Local Codes to Read Codes.

    PubMed

    Bonney, Wilfred; Galloway, James; Hall, Christopher; Ghattas, Mikhail; Tramma, Leandro; Nind, Thomas; Donnelly, Louise; Jefferson, Emily; Doney, Alexander

    2017-01-01

    Background & Objectives: Legacy laboratory test codes make it difficult to use clinical datasets for meaningful translational research, where populations are followed for disease risk and outcomes over many years. The Health Informatics Centre (HIC) at the University of Dundee hosts continuous biochemistry data from the clinical laboratories in Tayside and Fife dating back as far as 1987. However, the HIC-managed biochemistry dataset is coupled with incoherent sample types and unstandardised legacy local test codes, which increases the complexity of using the dataset for reasonable population health outcomes. The objective of this study was to map the legacy local test codes to the Scottish 5-byte Version 2 Read Codes using biochemistry data extracted from the repository of the Scottish Care Information (SCI) Store.

  17. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  19. Scholarly Integrity.

    PubMed

    Francisco, Joseph S; Hahn, Ulrike; Schwarz, Helmut

    2017-04-03

    "… Scholarly integrity is not only the foundational bedrock of scientific inquiry, it is also the prerequisite for a positive image of scholarship … For individuals, integrity is an aspect of moral character and experience. For institutions, it is about creating an environment that promotes responsible conduct … In the first instance, research institutions must provide guidelines and codes of practice on scholarly integrity …" Read more in the Editorial by J. S. Francisco, U. Hahn, and H. Schwarz.

  20. Coding for Electronic Mail

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  1. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore phenomena, and their uncertainty, which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  2. DLLExternalCode

    SciTech Connect

    Greg Flach, Frank Smith

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as the top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
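
    The linking pattern the DLL implements (write the caller's inputs to a file, run the external application, read its outputs back) is shown below in Python rather than C, purely as a sketch of the data flow. The executable name and both file names are placeholders, not part of DLLExternalCode.

        import subprocess
        from pathlib import Path

        def run_external(inputs, exe="external_app", workdir="run",
                         infile="inputs.txt", outfile="outputs.txt"):
            """Round-trip a list of numeric inputs through an external code."""
            wd = Path(workdir)
            wd.mkdir(exist_ok=True)
            (wd / infile).write_text("\n".join(str(v) for v in inputs))
            subprocess.run([exe], cwd=wd, check=True)   # run the external code
            return [float(tok) for tok in (wd / outfile).read_text().split()]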

  3. Defeating the coding monsters.

    PubMed

    Colt, Ross

    2007-02-01

    Accuracy in coding is rapidly becoming a required skill for military health care providers. Clinic staffing, equipment purchase decisions, and even reimbursement will soon be based on the coding data that we provide. Learning the complicated myriad of rules to code accurately can seem overwhelming. However, the majority of clinic visits in a typical outpatient clinic generally fall into two major evaluation and management codes, 99213 and 99214. If health care providers can learn the rules required to code a 99214 visit, then this will provide a 90% solution that can enable them to accurately code the majority of their clinic visits. This article demonstrates a step-by-step method to code a 99214 visit, by viewing each of the three requirements as a monster to be defeated.

  4. Code of ethics for dental researchers.

    PubMed

    2014-01-01

    The International Association for Dental Research, in 2009, adopted a code of ethics. The code applies to members of the association and is enforceable by sanction, with the stated requirement that members are expected to inform the association in cases where they believe misconduct has occurred. The IADR code goes beyond the Belmont and Helsinki statements by virtue of covering animal research. It also addresses issues of sponsorship of research and conflicts of interest, international collaborative research, duty of researchers to be informed about applicable norms, standards of publication (including plagiarism), and the obligation of "whistleblowing" for the sake of maintaining the integrity of the dental research enterprise as a whole. The code is organized, like the ADA code, into two sections. The IADR principles are stated, but not defined, and number 12, instead of the ADA's five. The second section consists of "best practices," which are specific statements of expected or interdicted activities. The short list of definitions is useful.

  5. MHDust: A 3-fluid dusty plasma code

    NASA Astrophysics Data System (ADS)

    Lazerson, Samuel

    MHDust is a next-generation 3-fluid magnetized dusty plasma code, treating the inertial dynamics of both the dust and ion components. The code, written in ANSI C, employs Leap-Frog and Dufort-Frankel integration schemes. Features include: nonlinear collisional terms, quasi-neutrality or continuity based electron densities, and dynamical dust charge number. Tests of wave-mode propagation (acoustic and electromagnetic) allow a comparison to linear wave mode theory. Additional nonlinear phenomena are presented, including magnetic reconnection and shear-flow instabilities. Relevant parameters for the space environment are considered, allowing a comparison to be made with previous dusty plasma codes (DENISIS). The utility of the code is expanded through the possibility of small dust mass, which allows MHDust to be used as a 2-ion plasma code. MHDust considerably expands the range of numerical investigations into nonlinear phenomena in the field of astrophysical dusty plasmas.

  6. CATHARE code development and assessment methodologies

    SciTech Connect

    Micaelli, J.C.; Barre, F.; Bestion, D.

    1995-12-31

    The CATHARE thermal-hydraulic code has been developed jointly by Commissariat a l'Energie Atomique (CEA), Electricite de France (EdF), and Framatome for safety analysis. Since the beginning of the project (September 1979), development and assessment activities have followed a methodology supported by two series of experimental tests: separate effects tests and integral effects tests. The purpose of this paper is to describe this methodology, the code assessment status, and the evolution to take into account two new components of this program: the modeling of three-dimensional phenomena and the requirements of code uncertainty evaluation.

  7. GeoPhysical Analysis Code

    SciTech Connect

    2011-05-21

    GPAC is a code that integrates open source libraries for element formulations, linear algebra, and I/O with two main LLNL-written components: (i) a set of standard finite element physics solvers for resolving Darcy fluid flow, explicit mechanics, implicit mechanics, and fluid-mediated fracturing, including resolution of contact both implicitly and explicitly, and (ii) an MPI-based parallelization implementation for use on generic HPC distributed memory architectures. The resultant code can be used alone for linearly elastic problems and problems involving hydraulic fracturing, where the mesh topology is dynamically changed. The key application domain is low-rate stimulation and fracture control in subsurface reservoirs (e.g., enhanced geothermal sites and unconventional shale gas stimulation). GPAC also has interfaces to call external libraries for, e.g., material models and equations of state; however, LLNL-developed EOS and material models will not be part of the current release.

  8. Incorporating Manual and Autonomous Code Generation

    NASA Technical Reports Server (NTRS)

    McComas, David

    1998-01-01

    Code can be generated manually or using code-generating software tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with autonomous code generation for attitude control flight software. Recent improvements in space flight computers are allowing software engineers to spend more time engineering the applications software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an automatic code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) defining the manual-to-automatic code interface; (2) applying object-oriented design to the manual flight code; (3) implementing the object-oriented design in C.

  9. DGPS ground station integrity monitoring

    NASA Technical Reports Server (NTRS)

    Skidmore, Trent A.; Vangraas, Frank

    1995-01-01

    This paper summarizes the development of a unique Differential Global Positioning System (DGPS) ground station integrity monitor which can offer improved availability over conventional code-differential monitoring systems. This monitoring technique, called code/carrier integrity monitoring (CCIM), uses the highly stable integrated Doppler measurement to smooth the relatively noisy code-phase measurements. The pseudorange correction is therefore comprised of the integrated Doppler measurement plus the CCIM offset. The design and operational results of a DGPS ground station integrity monitor are reported. A robust integrity monitor is realized which is optimized for applications such as the Special Category I (SCAT-I) defined in the RTCA Minimum Aviation System Performance Standards.
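
    The code/carrier blending described above is, in essence, a Hatch filter: each raw pseudorange is averaged against the previous smoothed value propagated forward by the carrier-phase (integrated Doppler) change. A minimal sketch follows, with the window length and variable names as assumptions of this sketch rather than the monitor's actual parameters.

        def hatch_filter(pr, dphi, n_max=100):
            """pr[i]: raw code pseudorange at epoch i (m); dphi[i]: carrier-
            derived range change from epoch i-1 to i (m, dphi[0] unused)."""
            smoothed = [pr[0]]
            n = 1
            for i in range(1, len(pr)):
                n = min(n + 1, n_max)                 # cap the averaging window
                predicted = smoothed[-1] + dphi[i]    # carrier-propagated estimate
                smoothed.append(predicted + (pr[i] - predicted) / n)
            return smoothed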

  10. More box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    A new investigation shows that, starting from the BCH (21,15;3) code represented as a 7 x 3 matrix and adding a row and column to add even parity, one obtains an 8 x 4 matrix (32,15;8) code. An additional dimension is obtained by specifying odd parity on the rows and even parity on the columns, i.e., adjoining to the 8 x 4 matrix, the matrix, which is zero except for the fourth column (of all ones). Furthermore, any seven rows and three columns will form the BCH (21,15;3) code. This box code has the same weight structure as the quadratic residue and BCH codes of the same dimensions. Whether there exists an algebraic isomorphism to either code is as yet unknown.
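
    The row-and-column parity extension is mechanical enough to show directly. The sketch below performs only that step (even parity on rows, then on columns) for an arbitrary binary matrix; it does not construct the underlying BCH codeword, and the odd-parity variant mentioned above is omitted.

        import numpy as np

        def extend_with_parity(m):
            """Append an even-parity column, then an even-parity row, e.g.,
            turning a 7 x 3 codeword array into an 8 x 4 array."""
            m = np.asarray(m) % 2
            with_col = np.column_stack([m, m.sum(axis=1) % 2])
            return np.vstack([with_col, with_col.sum(axis=0) % 2])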

  11. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  12. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  13. Breaking the Neural Code

    DTIC Science & Technology

    2015-05-21

    This seedling proposed to use advanced imaging techniques to break the neuronal code that links the firing of neurons in ... generating a closed-loop on-line experimental platform. We have completed all proposed tasks of the seedling and successfully completed preliminary ... [remainder of the record is unreadable report-form residue]

  14. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  15. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife to knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. The KTK labyrinth seal code handles straight or stepped seals. And DYSEAL provides dynamics for the seal geometry.

  16. Ptolemy Coding Style

    DTIC Science & Technology

    2014-09-05

    Ptolemy Coding Style. Christopher Brooks and Edward A. Lee, Electrical Engineering and Computer Sciences, University of California at Berkeley; technical report, 2014. [Only a fragment of the abstract is readable amid report-form residue:] "... constraints, so such constraints are not new to the academic community. This document describes the coding style used in Ptolemy II, a package with ..."

  18. Development of probabilistic multimedia multipathway computer codes.

    SciTech Connect

    Yu, C.; LePoire, D.; Gnanapragasam, E.; Arnish, J.; Kamboj, S.; Biwer, B. M.; Cheng, J.-J.; Zielen, A. J.; Chen, S. Y.; Mo, T.; Abu-Eid, R.; Thaggard, M.; Sallo, A., III.; Peterson, H., Jr.; Williams, W. A.; Environmental Assessment; NRC; EM

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.
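
    Steps (5) and (6) amount to wrapping the deterministic kernel in a sampling loop: draw each uncertain parameter from its assigned distribution, evaluate the deterministic dose model once per draw, and summarize the output distribution. The sketch below uses a placeholder one-line dose kernel and invented distributions; it illustrates the wrapper pattern only, not RESRAD's pathways or parameters.

        import numpy as np

        rng = np.random.default_rng(42)

        def probabilistic_dose(n=10_000):
            """Monte Carlo envelope around a toy deterministic dose kernel."""
            conc = rng.lognormal(mean=0.0, sigma=0.5, size=n)   # placeholder concentration
            intake = rng.triangular(0.05, 0.10, 0.20, size=n)   # placeholder intake rate
            dcf = 2.8e-7                                        # placeholder dose coefficient
            dose = conc * intake * 365.0 * dcf                  # toy kernel, dose per year
            return np.percentile(dose, [5, 50, 95])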

  19. Transonic airfoil codes

    NASA Technical Reports Server (NTRS)

    Garabedian, P. R.

    1979-01-01

    Computer codes for the design and analysis of transonic airfoils are considered. The design code relies on the method of complex characteristics in the hodograph plane to construct shockless airfoils. The analysis code uses artificial viscosity to calculate flows with weak shock waves at off-design conditions. Comparisons with experiments show that an excellent simulation of two-dimensional wind tunnel tests is obtained. The codes have been widely adopted by the aircraft industry as a tool for the development of supercritical wing technology.

  20. FAA Smoke Transport Code

    SciTech Connect

    Domino, Stefan; Luketa-Hanlin, Anay; Gallegos, Carlos

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, an analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  1. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  2. Topological subsystem codes

    SciTech Connect

    Bombin, H.

    2010-03-15

    We introduce a family of two-dimensional (2D) topological subsystem quantum error-correcting codes. The gauge group is generated by two-local Pauli operators, so that two-local measurements are enough to recover the error syndrome. We study the computational power of code deformation in these codes and show that boundaries cannot be introduced in the usual way. In addition, we give a general mapping connecting suitable classical statistical mechanical models to optimal error correction in subsystem stabilizer codes that suffer from depolarizing noise.

  3. GUIS for scientific code usage

    NASA Astrophysics Data System (ADS)

    Dionne, N.

    1993-12-01

    To achieve high-level functionality, an unadorned GUI based upon an enhanced version of the MIT X11 graphic routines has been integrated into SAIC's MASK code for keystroke-controlled, fully-interactive scientific application. Featured run-time capabilities include: a) buffered plot animation, b) mouse-driven data extraction, c) menu-driven parameter editing, d) postscript-based hard copy prints, e) run-state save, f) numerous plot display selections, and g) optional GUI exit/return. A 400-line fortran-to-X-library interface (written in C) lies at the core of this utility, permitting either serial or concurrent interfacial keystroke control.

  4. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding.

    PubMed

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. This paper presents innovative work on improving the coding tree to further reduce encoding time. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. First, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content change (CC). Second, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Finally, a CU coding tree probability update is proposed, aiming to address probabilistic model distortion problems caused by CC. Experimental results show that the proposed low complexity CU coding tree mechanism significantly reduces encoding time by 27% for lossy coding and 42% for visually lossless coding and lossless coding. The proposed mechanism thus improves coding performance under various application conditions.
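
    A rough sketch of the early-termination idea this abstract describes; the running-mean update and the threshold below are illustrative assumptions, not the authors' probability model.

        class CUSplitModel:
            """Tracks P(split) for one CU depth/QP bin."""

            def __init__(self, prior=0.5):
                self.p_split = prior
                self.n = 1.0

            def update(self, did_split):
                # Online update so the model follows content change (CC).
                self.n += 1.0
                self.p_split += (float(did_split) - self.p_split) / self.n

            def try_split(self, threshold=0.2):
                # Search the CU split only when it is probable enough.
                return self.p_split >= threshold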

  5. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    PubMed Central

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree brings extremely high computational complexity. This paper presents innovative work on improving the coding tree to further reduce encoding time. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. First, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content change (CC). Second, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Finally, a CU coding tree probability update is proposed, aiming to address probabilistic model distortion problems caused by CC. Experimental results show that the proposed low complexity CU coding tree mechanism significantly reduces encoding time by 27% for lossy coding and 42% for visually lossless coding and lossless coding. The proposed mechanism thus improves coding performance under various application conditions. PMID:26999741

  6. Insurance billing and coding.

    PubMed

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD-9-CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD-9-CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD-9-CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms.

  7. Coding Acoustic Metasurfaces.

    PubMed

    Xie, Boyang; Tang, Kun; Cheng, Hua; Liu, Zhengyou; Chen, Shuqi; Tian, Jianguo

    2017-02-01

    Coding acoustic metasurfaces can combine simple logical bits to acquire sophisticated functions in wave control. The acoustic logical bits can achieve a phase difference of exactly π and a perfect match of the amplitudes for the transmitted waves. By programming the coding sequences, acoustic metasurfaces with various functions, including creating peculiar antenna patterns and wave focusing, have been demonstrated.

  8. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
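
    As a hypothetical toy illustration of the annotation burden described above (plain Python, not AUTOBAYES output), the assertions below play the role of the loop invariant and safety condition that a verification condition generator would turn into proof obligations.

        def mean(xs):
            assert len(xs) > 0          # safety annotation: no division by zero
            total = 0.0
            for i, x in enumerate(xs):
                total += x
                # Loop invariant: total equals the sum of the first i+1 items.
                assert total == sum(xs[:i + 1])
            return total / len(xs)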

  9. Dress Codes for Teachers?

    ERIC Educational Resources Information Center

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  10. Pseudonoise code tracking loop

    NASA Technical Reports Server (NTRS)

    Laflame, D. T. (Inventor)

    1980-01-01

    A delay-locked loop is presented for tracking a pseudonoise (PN) reference code in an incoming communication signal. The loop is less sensitive to gain imbalances, which can otherwise introduce timing errors in the PN reference code formed by the loop.

  11. Modified JPEG Huffman coding.

    PubMed

    Lakhani, Gopal

    2003-01-01

    It is a well observed characteristic that when a DCT block is traversed in the zigzag order, the AC coefficients generally decrease in size and the run-lengths of zero coefficients increase in number. This article presents a minor modification to the Huffman coding of the JPEG baseline compression algorithm to exploit this redundancy. For this purpose, DCT blocks are divided into bands so that each band can be coded using a separate code table. Three implementations are presented, which all move the end-of-block marker up in the middle of the DCT block and use it to indicate the band boundaries. Experimental results are presented to compare the reduction in code size obtained by our methods with the JPEG sequential-mode Huffman coding and arithmetic coding methods. The average code reduction to the total image code size of one of our methods is 4%. Our methods can also be used for progressive image transmission and hence, experimental results are also given to compare them with two-, three-, and four-band implementations of the JPEG spectral selection method.
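
    A minimal sketch of the band idea in Python; the zigzag scan is the standard JPEG one, but the band boundaries below are illustrative, not the paper's choices.

        def zigzag_indices(n=8):
            # (row, col) pairs of an n x n block in JPEG zigzag order:
            # diagonals in order, alternating the direction of traversal.
            return sorted(((r, c) for r in range(n) for c in range(n)),
                          key=lambda rc: (rc[0] + rc[1],
                                          rc[0] if (rc[0] + rc[1]) % 2 else rc[1]))

        def band_split(block, boundaries=(6, 20)):
            # Split zigzag-ordered AC coefficients into three bands; each band
            # would then be coded with its own Huffman table.
            zz = [block[r][c] for r, c in zigzag_indices()]
            ac = zz[1:]   # drop the DC coefficient
            lo, hi = boundaries
            return ac[:lo], ac[lo:hi], ac[hi:]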

  12. Code of Ethics

    ERIC Educational Resources Information Center

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  14. Lichenase and coding sequences

    SciTech Connect

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-β-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  15. Computerized mega code recording.

    PubMed

    Burt, T W; Bock, H C

    1988-04-01

    A system has been developed to facilitate recording of advanced cardiac life support mega code testing scenarios. By scanning a paper "keyboard" using a bar code wand attached to a portable microcomputer, the person assigned to record the scenario can easily generate an accurate, complete, timed, and typewritten record of the given situations and the obtained responses.

  16. Energy Conservation Code Decoded

    SciTech Connect

    Cole, Pam C.; Taylor, Zachary T.

    2006-09-01

    Designing an energy-efficient, affordable, and comfortable home is a lot easier thanks to a slim, easier-to-read booklet, the 2006 International Energy Conservation Code (IECC), published in March 2006. States, counties, and cities have begun reviewing the new code as a potential upgrade to their existing codes. Maintained under the public consensus process of the International Code Council, the IECC is designed to do just what its title says: promote the design and construction of energy-efficient homes and commercial buildings. Homes in this case means traditional single-family homes, duplexes, condominiums, and apartment buildings having three or fewer stories. The U.S. Department of Energy, which played a key role in proposing the changes that resulted in the new code, is offering a free training course that covers the residential provisions of the 2006 IECC.

  17. Combustion chamber analysis code

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-01-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  18. Evolving genetic code

    PubMed Central

    OHAMA, Takeshi; INAGAKI, Yuji; BESSHO, Yoshitaka; OSAWA, Syozo

    2008-01-01

    In 1985, we reported that a bacterium, Mycoplasma capricolum, used a deviant genetic code, namely UGA, a “universal” stop codon, was read as tryptophan. This finding, together with the deviant nuclear genetic codes in not a few organisms and a number of mitochondria, shows that the genetic code is not universal, and is in a state of evolution. To account for the changes in codon meanings, we proposed the codon capture theory stating that all the code changes are non-disruptive without accompanied changes of amino acid sequences of proteins. Supporting evidence for the theory is presented in this review. A possible evolutionary process from the ancient to the present-day genetic code is also discussed. PMID:18941287
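
    A toy illustration of the codon reassignment described above (table entries abbreviated; only the UGA difference is shown).

        universal = {"UGG": "Trp", "UGA": "Stop"}
        mycoplasma = {**universal, "UGA": "Trp"}   # UGA captured as tryptophan

        def translate(rna, table):
            codons = [rna[i:i + 3] for i in range(0, len(rna) - 2, 3)]
            out = []
            for c in codons:
                aa = table.get(c, "?")
                if aa == "Stop":
                    break
                out.append(aa)
            return out

        print(translate("UGGUGAUGG", universal))    # ['Trp'] (stops at UGA)
        print(translate("UGGUGAUGG", mycoplasma))   # ['Trp', 'Trp', 'Trp']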

  19. Recent developments in the Los Alamos radiation transport code system

    SciTech Connect

    Forster, R.A.; Parsons, K.

    1997-06-01

    A brief progress report on updates to the Los Alamos Radiation Transport Code System (LARTCS) for solving criticality and fixed-source problems is provided. LARTCS integrates the Diffusion Accelerated Neutral Transport (DANT) discrete ordinates codes with the Monte Carlo N-Particle (MCNP) code. The LARTCS code is being developed with a graphical user interface for problem setup and analysis. Progress in the DANT system for criticality applications includes a two-dimensional module which can be linked to a mesh-generation code and a faster iteration scheme. Updates to MCNP Version 4A allow statistical checks of calculated Monte Carlo results.

  20. Quantum convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Yan, Tingsu; Huang, Xinmei; Tang, Yuansheng

    2014-12-01

    In this paper, three families of quantum convolutional codes are constructed. The first one and the second one can be regarded as a generalization of Theorems 3, 4, 7 and 8 [J. Chen, J. Li, F. Yang and Y. Huang, Int. J. Theor. Phys., doi:10.1007/s10773-014-2214-6 (2014)], in the sense that we drop the constraint q ≡ 1 (mod 4). Furthermore, the second one and the third one attain the quantum generalized Singleton bound.

  1. Patient misidentifications caused by errors in standard bar code technology.

    PubMed

    Snyder, Marion L; Carter, Alexis; Jenkins, Karen; Fantz, Corinne R

    2010-10-01

    Bar code technology has decreased transcription errors in many healthcare applications. However, we have found that linear bar code identification methods are not failsafe. In this study, we sought to identify the sources of bar code decoding errors that generated incorrect patient identifiers when bar codes were scanned for point-of-care glucose testing and to develop solutions to prevent their occurrence. We identified misread wristband bar codes, removed them from service, and rescanned them by using 5 different scanner models. Bar codes were reprinted in pristine condition for use as controls. We determined error rates for each bar code-scanner pair and manually calculated internal bar code data integrity checks. As many as 3 incorrect patient identifiers were generated from a single bar code. Minor bar code imperfections, failure to control for bar code scanner resolution requirements, and less than optimal printed bar code orientation were confirmed as sources of these errors. Of the scanner models tested, the Roche ACCU-CHEK® glucometer had the highest error rate. The internal data integrity check system did not detect these errors. Bar code-related patient misidentifications can occur. In the worst case, misidentified patient results could have been transmitted to the incorrect patient medical record. This report has profound implications not only for point-of-care testing but also for bar coded medication administration, transfusion recipient certification systems, and other areas where patient misidentifications can be life-threatening. Careful control of bar code scanning and printing equipment specifications will minimize this threat to patient safety. Ultimately, healthcare device manufacturers should adopt more robust and higher fidelity alternatives to linear bar code symbologies.
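
    For context, linear symbologies do carry an internal integrity check; here is a minimal sketch of the Code 128 check-symbol computation (code set B, where the symbol value of a printable ASCII character is its code point minus 32).

        def code128b_checksum(data: str, start_value: int = 104) -> int:
            # 104 is the Start B symbol; the check symbol is the
            # position-weighted sum of symbol values, modulo 103.
            total = start_value
            for position, ch in enumerate(data, start=1):
                total += position * (ord(ch) - 32)
            return total % 103

        # A single misread symbol changes the checksum, so a verifying
        # reader can reject the scan instead of emitting a wrong identifier.
        print(code128b_checksum("PATIENT123"))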

  2. An object-oriented electromagnetic PIC code

    NASA Astrophysics Data System (ADS)

    Verboncoeur, J. P.; Langdon, A. B.; Gladd, N. T.

    1995-05-01

    The object-oriented paradigm provides an opportunity for advanced PIC modeling, increased flexibility, and extensibility. Particle-in-cell codes for simulating plasmas are traditionally written in structured FORTRAN or C. This has resulted in large legacy codes which are difficult to maintain and extend with new models. In this ongoing research, we apply the object-oriented design technique to address these issues. The resulting code architecture, OOPIC (object-oriented particle-in-cell), is a two-dimensional relativistic electromagnetic PIC code. The object-oriented implementation of the algorithms is described, including an integral-form field solve, and a piecewise current deposition and particle position update. The architecture encapsulates key PIC algorithms and data into objects, simplifying extensions such as new boundary conditions and field algorithms.

  3. FERRET adjustment code: status/use

    SciTech Connect

    Schmittroth, F.A.

    1982-03-01

    The least-squares data analysis code FERRET is reviewed. Recent enhancements are discussed along with illustrative applications. Particular features noted include the use of differential as well as integral data, and additional user options for assigning and storing covariance matrices.
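
    A minimal sketch of the covariance-weighted least-squares adjustment such a code performs; the matrix names are generic, not FERRET's.

        import numpy as np

        def gls_adjust(A, y, C):
            # Minimize (y - A x)^T C^{-1} (y - A x), where A maps parameters
            # to the stacked differential and integral measurements y, and C
            # is the assigned measurement covariance matrix.
            Ci = np.linalg.inv(C)
            cov_x = np.linalg.inv(A.T @ Ci @ A)   # posterior parameter covariance
            x_hat = cov_x @ A.T @ Ci @ y          # adjusted parameter estimate
            return x_hat, cov_x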

  4. Benchmarking the Multidimensional Stellar Implicit Code MUSIC

    NASA Astrophysics Data System (ADS)

    Goffrey, T.; Pratt, J.; Viallet, M.; Baraffe, I.; Popov, M. V.; Walder, R.; Folini, D.; Geroux, C.; Constantino, T.

    2017-04-01

    We present the results of a numerical benchmark study for the MUltidimensional Stellar Implicit Code (MUSIC) based on widely applicable two- and three-dimensional compressible hydrodynamics problems relevant to stellar interiors. MUSIC is an implicit large eddy simulation code that uses implicit time integration, implemented as a Jacobian-free Newton Krylov method. A physics based preconditioning technique which can be adjusted to target varying physics is used to improve the performance of the solver. The problems used for this benchmark study include the Rayleigh-Taylor and Kelvin-Helmholtz instabilities, and the decay of the Taylor-Green vortex. Additionally we show a test of hydrostatic equilibrium, in a stellar environment which is dominated by radiative effects. In this setting the flexibility of the preconditioning technique is demonstrated. This work aims to bridge the gap between the hydrodynamic test problems typically used during development of numerical methods and the complex flows of stellar interiors. A series of multidimensional tests were performed and analysed. Each of these test cases was analysed with a simple, scalar diagnostic, with the aim of enabling direct code comparisons. As the tests performed do not have analytic solutions, we verify MUSIC by comparing it to established codes including ATHENA and the PENCIL code. MUSIC is able to both reproduce behaviour from established and widely-used codes as well as results expected from theoretical predictions. This benchmarking study concludes a series of papers describing the development of the MUSIC code and provides confidence in future applications.
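
    A minimal sketch of the Jacobian-free Newton-Krylov ingredient mentioned above (a generic illustration, not MUSIC's implementation): Krylov solvers need only Jacobian-vector products, which a finite difference of the residual supplies without ever forming the Jacobian.

        import numpy as np

        def jacobian_vector_product(F, u, v, eps=1e-7):
            # Approximate J(u) @ v, where J is the Jacobian of the residual F.
            norm_v = np.linalg.norm(v)
            if norm_v == 0.0:
                return np.zeros_like(u)
            h = eps / norm_v
            return (F(u + h * v) - F(u)) / h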

  5. Pyramid image codes

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.
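
    A toy square-lattice pyramid (deliberately not the hexagonal HOP transform itself) showing the layering the abstract describes: each coarser layer halves the resolution, so large features live in the small upper layers.

        import numpy as np

        def lowpass_pyramid(image, levels=4):
            # Build a pyramid by repeated 2x2 block averaging.
            layers = [image]
            for _ in range(levels - 1):
                img = layers[-1]
                h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
                img = img[:h, :w]                      # trim to even size
                layers.append(img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3)))
            return layers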

  6. Report number codes

    SciTech Connect

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  7. Superluminal Labview Code

    SciTech Connect

    Wheat, Robert; Marksteiner, Quinn; Quenzer, Jonathan; Higginson, Ian

    2012-03-26

    This labview code is used to set the phases and amplitudes on the 72 antennas of the superluminal machine, and to map out the radiation pattern from the superluminal antenna. Each antenna radiates a modulated signal consisting of two separate frequencies, in the range of 2 GHz to 2.8 GHz. The phases and amplitudes from each antenna are controlled by a pair of AD8349 vector modulators (VMs). These VMs set the phase and amplitude of a high frequency signal using a set of four DC inputs, which are controlled by Linear Technologies LTC1990 digital to analog converters (DACs). The labview code controls these DACs through an 8051 microcontroller. This code also monitors the phases and amplitudes of the 72 channels. Near each antenna, there is a coupler that channels a portion of the power into a binary network. Through a labview controlled switching array, any of the 72 coupled signals can be channeled in to the Tektronix TDS 7404 digital oscilloscope. Then the labview code takes an FFT of the signal, and compares it to the FFT of a reference signal in the oscilloscope to determine the magnitude and phase of each sideband of the signal. The code compensates for phase and amplitude errors introduced by differences in cable lengths. The labview code sets each of the 72 elements to a user determined phase and amplitude. For each element, the code runs an iterative procedure, where it adjusts the DACs until the correct phases and amplitudes have been reached.
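
    A minimal numpy sketch of the FFT comparison step described above; the single-bin readout and the names are assumptions for illustration, not the LabVIEW implementation.

        import numpy as np

        def sideband_phasor(signal, reference, fs, f_sideband):
            # Compare the signal's FFT bin at the sideband frequency against
            # the same bin of the reference channel to get relative amplitude
            # and phase.
            n = len(signal)
            k = int(round(f_sideband * n / fs))   # FFT bin of the sideband
            s = np.fft.rfft(signal)[k]
            r = np.fft.rfft(reference)[k]
            ratio = s / r
            return abs(ratio), np.angle(ratio)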

  8. Parallel CARLOS-3D code development

    SciTech Connect

    Putnam, J.M.; Kotulski, J.D.

    1996-02-01

    CARLOS-3D is a three-dimensional scattering code which was developed under the sponsorship of the Electromagnetic Code Consortium, and is currently used by over 80 aerospace companies and government agencies. The code has been extensively validated and runs on both serial workstations and parallel supercomputers such as the Intel Paragon. CARLOS-3D is a three-dimensional surface integral equation scattering code based on a Galerkin method of moments formulation employing Rao-Wilton-Glisson roof-top basis functions for triangular faceted surfaces. Fully arbitrary 3D geometries composed of multiple conducting and homogeneous bulk dielectric materials can be modeled. This presentation describes some of the extensions to the CARLOS-3D code, and how the operator structure of the code facilitated these improvements. Body of revolution (BOR) and two-dimensional geometries were incorporated by simply including new input routines, and the appropriate Galerkin matrix operator routines. Some additional modifications were required in the combined field integral equation matrix generation routine due to the symmetric nature of the BOR and 2D operators. Quadrilateral patched surfaces with linear roof-top basis functions were also implemented in the same manner. Quadrilateral facets and triangular facets can be used in combination to more efficiently model geometries with both large smooth surfaces and surfaces with fine detail such as gaps and cracks. Since the parallel implementation in CARLOS-3D is at a high level, these changes were independent of the computer platform being used. This approach minimizes code maintenance, while providing capabilities with little additional effort. Results are presented showing the performance and accuracy of the code for some large scattering problems. Comparisons between triangular faceted and quadrilateral faceted geometry representations will be shown for some complex scatterers.

  9. Coding for surgical audit.

    PubMed

    Pettigrew, R A; van Rij, A M

    1990-05-01

    A simple system of codes for operations, diagnoses and complications, developed specifically for computerized surgical audit, is described. This arose following a review of our established surgical audit in which problems in the retrieval of data from the database were identified. Evaluation of current methods of classification of surgical data highlighted the need for a dedicated coding system that was suitable for classifying surgical audit data, enabling rapid retrieval from large databases. After 2 years of use, the coding system has been found to fulfil the criteria of being sufficiently flexible and specific for computerized surgical audit, yet simple enough for medical staff to use.

  10. SASSYS LMFBR systems code

    SciTech Connect

    Dunn, F.E.; Prohammer, F.G.; Weber, D.P.

    1983-01-01

    The SASSYS LMFBR systems analysis code is being developed mainly to analyze the behavior of the shut-down heat-removal system and the consequences of failures in the system, although it is also capable of analyzing a wide range of transients, from mild operational transients through more severe transients leading to sodium boiling in the core and possible melting of clad and fuel. The code includes a detailed SAS4A multi-channel core treatment plus a general thermal-hydraulic treatment of the primary and intermediate heat-transport loops and the steam generators. The code can handle any LMFBR design, loop or pool, with an arbitrary arrangement of components. The code is fast running: usually faster than real time.

  11. Compressible Astrophysics Simulation Code

    SciTech Connect

    Howell, L.; Singer, M.

    2007-07-18

    This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.

  12. Code Disentanglement: Initial Plan

    SciTech Connect

    Wohlbier, John Greaton; Kelley, Timothy M.; Rockefeller, Gabriel M.; Calef, Matthew Thomas

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
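
    A minimal sketch of the levelization rule stated above: treat "uses" edges as a directed graph, reject cycles, and assign each package a level one higher than its deepest dependency.

        def levelize(uses):
            # uses: dict mapping package -> set of packages it uses.
            levels = {}

            def level_of(pkg, stack=()):
                if pkg in stack:
                    raise ValueError(f"cycle through {pkg}: not levelized")
                if pkg not in levels:
                    deps = uses.get(pkg, set())
                    levels[pkg] = 1 + max((level_of(d, stack + (pkg,))
                                           for d in deps), default=0)
                return levels[pkg]

            for p in uses:
                level_of(p)
            return levels

        print(levelize({"app": {"physics", "io"}, "physics": {"math"},
                        "io": {"math"}, "math": set()}))
        # e.g. {'math': 1, 'physics': 2, 'io': 2, 'app': 3}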

  13. Critical Care Coding for Neurologists.

    PubMed

    Nuwer, Marc R; Vespa, Paul M

    2015-10-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  14. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  15. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.

  16. Scalable motion vector coding

    NASA Astrophysics Data System (ADS)

    Barbarien, Joeri; Munteanu, Adrian; Verdicchio, Fabio; Andreopoulos, Yiannis; Cornelis, Jan P.; Schelkens, Peter

    2004-11-01

    Modern video coding applications require transmission of video data over variable-bandwidth channels to a variety of terminals with different screen resolutions and available computational power. Scalable video coding is needed to optimally support these applications. Recently proposed wavelet-based video codecs employing spatial domain motion compensated temporal filtering (SDMCTF) provide quality, resolution and frame-rate scalability while delivering compression performance comparable to that of the state-of-the-art non-scalable H.264-codec. These codecs require scalable coding of the motion vectors in order to support a large range of bit-rates with optimal compression efficiency. Scalable motion vector coding algorithms based on the integer wavelet transform followed by embedded coding of the wavelet coefficients were recently proposed. In this paper, a new and fundamentally different scalable motion vector codec (MVC) using median-based motion vector prediction is proposed. Extensive experimental results demonstrate that the proposed MVC systematically outperforms the wavelet-based state-of-the-art solutions. To be able to take advantage of the proposed scalable MVC, a rate allocation mechanism capable of optimally dividing the available rate among texture and motion information is required. Two rate allocation strategies are proposed and compared. The proposed MVC and rate allocation schemes are incorporated into an SDMCTF-based video codec and the benefits of scalable motion vector coding are experimentally demonstrated.
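
    For reference, a minimal sketch of the median-based prediction the proposed MVC builds on: predict each block's motion vector componentwise from its neighbours and code only the residual. (The left/top/top-right neighbour choice is the common convention, used here as an illustration rather than the paper's exact predictor.)

        def median_mv_predictor(left, top, top_right):
            # Each argument is an (mvx, mvy) tuple from a neighbouring block.
            xs = sorted(v[0] for v in (left, top, top_right))
            ys = sorted(v[1] for v in (left, top, top_right))
            return xs[1], ys[1]   # componentwise median

        mv = (5, -2)
        pred = median_mv_predictor((4, -1), (6, -2), (9, 0))
        residual = (mv[0] - pred[0], mv[1] - pred[1])
        print(pred, residual)   # (6, -1) (-1, -1)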

  17. A review of wind field models for atmospheric transport

    SciTech Connect

    Ramsdell, J.V. Jr.; Skyllingstad, E.D.

    1993-06-01

    The primary objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. The HEDR Project is developing a computer code to estimate these doses and their uncertainties. The code, known as the HEDR Integrated Code (HEDRIC), consists of four separate component codes. One of the component codes, called the Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET), combines meteorological and release data to estimate time-integrated air concentrations and surface contamination at specific locations in the vicinity of the Hanford Site. The RATCHET domain covers approximately 75,000 square miles, extending from the crest of the Cascade Mountains on the west to the eastern edge of the Idaho panhandle and from central Oregon on the south to the Canadian border. This letter report explains the procedures in RATCHET that transform observed wind data into the wind fields used in atmospheric transport calculations. It also describes and evaluates alternative procedures not selected for use in RATCHET.
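
    As one common illustration of turning station observations into a gridded wind field (inverse-distance weighting on the velocity components; RATCHET's actual procedures are those documented in the report), consider:

        import numpy as np

        def idw_wind(stations, grid_xy, power=2.0):
            # stations: list of (x, y, u, v); grid_xy: list of (x, y) points.
            # Interpolate u and v separately so direction averages correctly.
            out = np.zeros((len(grid_xy), 2))
            for i, (gx, gy) in enumerate(grid_xy):
                num, den = np.zeros(2), 0.0
                for x, y, u, v in stations:
                    d2 = (gx - x) ** 2 + (gy - y) ** 2
                    if d2 == 0.0:              # grid point sits on a station
                        num, den = np.array([u, v]), 1.0
                        break
                    w = 1.0 / d2 ** (power / 2)
                    num += w * np.array([u, v])
                    den += w
                out[i] = num / den
            return out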

  18. Code Seal v 1.0

    SciTech Connect

    Chavez, Adrian; & Anderson, William

    2009-12-11

    CodeSeal is a Sandia National Laboratories developed technology that provides a means of securely obfuscating finite state machines in a mathematically provable way. The technology was developed in order to provide a solution for anti-reverse engineering, assured execution, and integrity of execution. CodeSeal accomplishes these goals with the addition of the concept of a trust anchor, a small piece of trust integrated into the system, to the model of code obfuscation. Code obfuscation is an active area of academic research, but most findings have merely demonstrated that general obfuscation is impossible. By modifying the security model such that we may rely on the presence of a small, tamper-protected device, however, Sandia has developed an effective method for obfuscating code. An open publication describing the technology in more detail can be found at http://eprint.iacr.org/2008/184.pdf. Keywords: Independent Software/Hardware monitors, Use control, Supervisory Control And Data Acquisition (SCADA), Algorithm obfuscation.

  19. RAMONA-4B code for BWR systems analysis

    SciTech Connect

    Cheng, H.S.; Rohatgi, U.S.

    1996-12-31

    The RAMONA-4B code is a coupled thermal-hydraulic, 3D kinetics code for plant transient analyses of a complete Boiling Water Reactor (BWR) system including Reactor Pressure Vessel (RPV), Balance of Plant (BOP) and containment. The complete system representation enables an integrated and coupled systems analysis of a BWR without recourse to prescribed boundary conditions.

  20. 48 CFR 1801.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 6 2013-10-01 2013-10-01 false Publication and code arrangement. 1801.105-1 Section 1801.105-1 Federal Acquisition Regulations System NATIONAL AERONAUTICS AND SPACE ADMINISTRATION ... 1801.105-1 Publication and code arrangement. (b)(i) The NFS is an integrated document that contains both...

  1. 48 CFR 1801.105-1 - Publication and code arrangement.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 6 2011-10-01 2011-10-01 false Publication and code arrangement. 1801.105-1 Section 1801.105-1 Federal Acquisition Regulations System NATIONAL AERONAUTICS AND SPACE ADMINISTRATION ... 1801.105-1 Publication and code arrangement. (b)(i) The NFS is an integrated document that contains both...

  2. Telemetry advances in data compression and channel coding

    NASA Technical Reports Server (NTRS)

    Miller, Warner H.; Morakis, James C.; Yeh, Pen-Shu

    1990-01-01

    Addressed in this paper is the dependence of telecommunication channel, forward error correcting coding and source data compression coding on integrated circuit technology. Emphasis is placed on real time high speed Reed Solomon (RS) decoding using full custom VLSI technology. Performance curves of NASA's standard channel coder and a proposed standard lossless data compression coder are presented.

  3. Hominoid-Specific De Novo Protein-Coding Genes Originating from Long Non-Coding RNAs

    PubMed Central

    Liu, Chu-Jun; Zhou, Wei-Zhen; Li, Ying; Zhang, Mao; Zhang, Rongli; Wei, Liping; Li, Chuan-Yun

    2012-01-01

    Tinkering with pre-existing genes has long been known as a major way to create new genes. Recently, however, motherless protein-coding genes have been found to have emerged de novo from ancestral non-coding DNAs. How these genes originated is not well addressed to date. Here we identified 24 hominoid-specific de novo protein-coding genes with precise origination timing in vertebrate phylogeny. Strand-specific RNA–Seq analyses were performed in five rhesus macaque tissues (liver, prefrontal cortex, skeletal muscle, adipose, and testis), which were then integrated with public transcriptome data from human, chimpanzee, and rhesus macaque. On the basis of comparing the RNA expression profiles in the three species, we found that most of the hominoid-specific de novo protein-coding genes encoded polyadenylated non-coding RNAs in rhesus macaque or chimpanzee with a similar transcript structure and correlated tissue expression profile. According to the rule of parsimony, the majority of these hominoid-specific de novo protein-coding genes appear to have acquired a regulated transcript structure and expression profile before acquiring coding potential. Interestingly, although the expression profile was largely correlated, the coding genes in human often showed higher transcriptional abundance than their non-coding counterparts in rhesus macaque. The major findings we report in this manuscript are robust and insensitive to the parameters used in the identification and analysis of de novo genes. Our results suggest that at least a portion of long non-coding RNAs, especially those with active and regulated transcription, may serve as a birth pool for protein-coding genes, which are then further optimized at the transcriptional level. PMID:23028352

  4. Upgrades to the WIMS-ANL code.

    SciTech Connect

    Woodruff, W. L.

    1998-10-14

    The dusty old source code in WIMS-D4M has been completely rewritten to conform more closely with current FORTRAN coding practices. The revised code contains many improvements in appearance, error checking and in control of the output. The output is now tabulated to fit the typical 80-column window or terminal screen. The Segev method for resonance integral interpolation is now an option. Most of the dimension limitations have been removed and replaced with variable dimensions within a compile-time fixed container. The library is no longer restricted to the 69 energy group structure, and two new libraries have been generated for use with the code. The new libraries are both based on ENDF/B-VI data with one having the original 69 energy group structure and the second with a 172 group structure. The common source code can be used with PCs using both Windows 95 and NT, with a Linux based operating system and with UNIX based workstations. Comparisons of this version of the code to earlier evaluations with ENDF/B-V are provided, as well as comparisons with the new libraries.

  5. Boundary-Layer Code For Supersonic Combustion

    NASA Technical Reports Server (NTRS)

    Pinckney, S. Z.; Walton, J. T.

    1994-01-01

    HUD is an integral computer code based on the Spalding-Chi method for predicting development of boundary layers in laminar, transitional, and turbulent regions of flows on two-dimensional or axisymmetric bodies. Approximates nonequilibrium velocity profiles as well as local surface friction in presence of pressure gradient. Predicts transfer of heat in turbulent boundary layer in presence of high axial pressure gradient. Provides for pressure gradients both normal and lateral to surfaces. Also used to estimate requirements for cooling scramjet engines. Because of this capability, the HUD program has been incorporated into several scramjet-cycle-performance-analysis codes, including SCRAM (ARC-12338) and SRGULL (LEW-15093). Written in FORTRAN 77.

  6. Prioritized LT Codes

    NASA Technical Reports Server (NTRS)

    Woo, Simon S.; Cheng, Michael K.

    2011-01-01

    The original Luby Transform (LT) coding scheme is extended to account for data transmissions where some information symbols in a message block are more important than others. Prioritized LT codes provide unequal error protection (UEP) of data on an erasure channel by modifying the original LT encoder. The prioritized algorithm improves high-priority data protection without penalizing low-priority data recovery. Moreover, low-latency decoding is also obtained for high-priority data due to fast encoding. Prioritized LT codes only require a slight change in the original encoding algorithm, and no changes at all at the decoder. Hence, with a small complexity increase in the LT encoder, an improved UEP and low-decoding latency performance for high-priority data can be achieved. LT encoding partitions a data stream into fixed-sized message blocks each with a constant number of information symbols. To generate a code symbol from the information symbols in a message, the Robust-Soliton probability distribution is first applied in order to determine the number of information symbols to be used to compute the code symbol. Then, the specific information symbols are chosen uniform randomly from the message block. Finally, the selected information symbols are XORed to form the code symbol. The Prioritized LT code construction includes an additional restriction that code symbols formed by a relatively small number of XORed information symbols select some of these information symbols from the pool of high-priority data. Once high-priority data are fully covered, encoding continues with the conventional LT approach where code symbols are generated by selecting information symbols from the entire message block including all different priorities. Therefore, if code symbols derived from high-priority data experience an unusual high number of erasures, Prioritized LT codes can still reliably recover both high- and low-priority data. This hybrid approach decides not only "how to encode
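
    A minimal sketch of the modified symbol-selection step (the Robust-Soliton degree draw is taken as given, and the low-degree cutoff is an illustrative assumption):

        import random

        def encode_symbol(message, n_high, degree, low_degree_cutoff=3):
            # message: list of int information symbols; the first n_high are
            # high priority. Low-degree code symbols draw their neighbours
            # from the high-priority prefix only.
            pool = list(range(n_high if degree <= low_degree_cutoff
                              else len(message)))
            picks = random.sample(pool, min(degree, len(pool)))
            code = 0
            for i in picks:
                code ^= message[i]   # XOR the selected information symbols
            return code, picks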

  7. Network predicting drug's anatomical therapeutic chemical code.

    PubMed

    Wang, Yong-Cui; Chen, Shi-Long; Deng, Nai-Yang; Wang, Yong

    2013-05-15

    Discovering drug's Anatomical Therapeutic Chemical (ATC) classification rules at molecular level is of vital importance to understand a vast majority of drugs action. However, few studies attempt to annotate drug's potential ATC-codes by computational approaches. Here, we introduce drug-target network to computationally predict drug's ATC-codes and propose a novel method named NetPredATC. Starting from the assumption that drugs with similar chemical structures or target proteins share common ATC-codes, our method, NetPredATC, aims to assign drug's potential ATC-codes by integrating chemical structures and target proteins. Specifically, we first construct a gold-standard positive dataset from drugs' ATC-code annotation databases. Then we characterize ATC-code and drug by their similarity profiles and define kernel function to correlate them. Finally, we use a kernel method, support vector machine, to automatically predict drug's ATC-codes. Our method was validated on four drug datasets with various target proteins, including enzymes, ion channels, G-protein couple receptors and nuclear receptors. We found that both drug's chemical structure and target protein are predictive, and target protein information has better accuracy. Further integrating these two data sources revealed more experimentally validated ATC-codes for drugs. We extensively compared our NetPredATC with SuperPred, which is a chemical similarity-only based method. Experimental results showed that our NetPredATC outperforms SuperPred not only in predictive coverage but also in accuracy. In addition, database search and functional annotation analysis support that our novel predictions are worthy of future experimental validation. In conclusion, our new method, NetPredATC, can predict drug's ATC-codes more accurately by incorporating drug-target network and integrating data, which will promote drug mechanism understanding and drug repositioning and discovery. NetPredATC is available at http
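
    A minimal sketch of the kernel-combination idea (the weights and names are illustrative, not NetPredATC's): fuse the chemical-structure and target-protein similarity matrices into one kernel and train an SVM on it.

        import numpy as np
        from sklearn.svm import SVC

        def combined_kernel(K_chem, K_target, alpha=0.5):
            # Convex combination of two precomputed drug-drug similarity kernels.
            return alpha * K_chem + (1 - alpha) * K_target

        # K_chem, K_target: (n_drugs, n_drugs) similarity matrices; y: labels
        # for one ATC class. With kernel="precomputed", fit() takes the Gram
        # matrix of the training drugs directly:
        # clf = SVC(kernel="precomputed").fit(combined_kernel(K_chem, K_target), y)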

  8. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100um and 10um aperture hole diameters show resolutions matching the hole diameters.
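
    A minimal sketch of the MTF calculation mentioned above: differentiate the scanned edge to get the line spread function, then normalize the magnitude of its Fourier transform.

        import numpy as np

        def mtf_from_esf(esf):
            lsf = np.gradient(esf)           # LSF = derivative of the edge scan
            mtf = np.abs(np.fft.rfft(lsf))
            return mtf / mtf[0]              # normalize to 1 at zero frequency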

  9. Error coding simulations

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1993-01-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and to evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal-to-noise ratio needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
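
    As a concrete instance of the error-detection piece, a minimal sketch of the 16-bit CRC-CCITT computation (generator x^16 + x^12 + x^5 + 1, all-ones preset) of the kind CCSDS recommends:

        def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
            for byte in data:
                crc ^= byte << 8
                for _ in range(8):
                    # Shift left; on carry-out, fold in the generator 0x1021.
                    crc = ((crc << 1) ^ 0x1021) if (crc & 0x8000) else (crc << 1)
                    crc &= 0xFFFF
            return crc

        frame = b"telemetry frame payload"
        print(hex(crc16_ccitt(frame)))   # sender appends this; receiver recomputes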

  10. Coded source neutron imaging

    NASA Astrophysics Data System (ADS)

    Bingham, Philip; Santos-Villalobos, Hector; Tobin, Ken

    2011-03-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100μm and 10μm aperture hole diameters show resolutions matching the hole diameters.

  11. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and results compared. Current computational and experimental emphasis includes multiple connected cavity flows with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean sheet approach to engine design is advocated with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  12. SAC: Sheffield Advanced Code

    NASA Astrophysics Data System (ADS)

    Griffiths, Mike; Fedun, Viktor; Mumford, Stuart; Gent, Frederick

    2013-06-01

    The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, thereby allowing modification to easily add new physics.

  13. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but rather domain experts. Hence, they require a simple language to express custom rules.

  14. Autocatalysis, information and coding.

    PubMed

    Wills, P R

    2001-01-01

    Autocatalytic self-construction in macromolecular systems requires the existence of a reflexive relationship between structural components and the functional operations they perform to synthesise themselves. The possibility of reflexivity depends on formal, semiotic features of the catalytic structure-function relationship, that is, the embedding of catalytic functions in the space of polymeric structures. Reflexivity is a semiotic property of some genetic sequences. Such sequences may serve as the basis for the evolution of coding as a result of autocatalytic self-organisation in a population of assignment catalysts. Autocatalytic selection is a mechanism whereby matter becomes differentiated in primitive biochemical systems. In the case of coding self-organisation, it corresponds to the creation of symbolic information. Prions are present-day entities whose replication through autocatalysis reflects aspects of biological semiotics less obvious than genetic coding.

  15. GeoPhysical Analysis Code

    SciTech Connect

    2012-12-21

    GPAC is a code that integrates open source libraries for element formulations, linear algebra, and I/O with two main LLNL-written components: (i) a set of standard finite, discrete, and discontinuous displacement element physics solvers for resolving Darcy fluid flow, explicit mechanics, implicit mechanics, fault rupture and earthquake nucleation, and fluid-mediated fracturing, including resolution of physical behaviors both implicitly and explicitly, and (ii) an MPI-based parallelization implementation for use on generic HPC distributed memory architectures. The resultant code can be used alone for linearly elastic problems; problems involving hydraulic fracturing, where the mesh topology is dynamically changed; fault rupture modeling and seismic risk assessment; and general granular materials behavior. The key application domain is low-rate stimulation and fracture control in subsurface reservoirs (e.g., enhanced geothermal sites and unconventional shale gas stimulation). GPAC also has interfaces to call external libraries for, e.g., material models and equations of state; however, LLNL-developed EOS and material models will not be part of the current release. GPAC's secondary applications include modeling fault evolution for predicting the statistical distribution of earthquake events and capturing granular materials behavior under different load paths.

  16. The Clawpack Community of Codes

    NASA Astrophysics Data System (ADS)

    Mandli, K. T.; LeVeque, R. J.; Ketcheson, D.; Ahmadia, A. J.

    2014-12-01

    Clawpack, the Conservation Laws Package, has long been one of the standards for solving hyperbolic conservation laws, but over the years it has extended well beyond this role. Today a community of open-source codes has been developed that addresses a multitude of different needs, including non-conservative balance laws, high-order accurate methods, and parallelism, while remaining extensible and easy to use, largely through the judicious use of Python and the original Fortran codes that it wraps. This talk will present some of the recent developments in projects under the Clawpack umbrella, notably the GeoClaw and PyClaw projects. GeoClaw was originally developed as a tool for simulating tsunamis using adaptive mesh refinement but has since encompassed a large number of other geophysically relevant flows, including storm surge and debris flows. PyClaw originated as a Python version of the original Clawpack algorithms but has since become both a testing ground for new algorithmic advances in the Clawpack framework and an easily extensible framework for solving hyperbolic balance laws. Some of these extensions include the addition of WENO high-order methods, massively parallel capabilities, and adaptive mesh refinement technologies, made possible largely by the flexibility of the Python language and community libraries such as NumPy and PETSc. Because of the tight integration with Python technologies, both packages have also benefited from the focus on reproducibility in the Python community, notably IPython notebooks.

  17. NSCool: Neutron star cooling code

    NASA Astrophysics Data System (ADS)

    Page, Dany

    2016-09-01

    NSCool is a 1D (i.e., spherically symmetric) neutron star cooling code written in Fortran 77. The package also contains a series of EOSs (equations of state) to build stars, a series of pre-built stars, and a TOV (Tolman-Oppenheimer-Volkoff) integrator to build stars from an EOS. It can also handle “strange stars” that have a huge density discontinuity between the quark matter and the covering thin baryonic crust. NSCool solves the heat transport and energy balance equations in full general relativity, resulting in a time sequence of temperature profiles (and, in particular, a Teff-age curve). Several heating processes are included, and more can easily be incorporated. In particular, it can evolve a star undergoing accretion with the resulting deep crustal heating, under a steady or time-variable accretion rate. NSCool is robust, very fast, and highly modular, making it easy to add new subroutines for new processes.
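
    A TOV integrator of the kind bundled with NSCool builds a star by integrating the Tolman-Oppenheimer-Volkoff equations outward from a chosen central pressure until the pressure drops to zero. A schematic Python version under an assumed simple polytropic EOS (the EOS constants, step size, and stopping criterion are illustrative and are not NSCool's):

      import numpy as np

      G, c = 6.674e-8, 2.998e10          # cgs units
      K, Gamma = 5.4e9, 5.0 / 3.0        # toy nonrelativistic neutron polytrope

      def rho_of_P(P):                   # invert P = K * rho**Gamma
          return (P / K) ** (1.0 / Gamma)

      def tov_star(P_c, dr=1.0e3):
          """Euler integration of the TOV equations from central pressure P_c.
          Returns the stellar radius (cm) and gravitational mass (g)."""
          r, m, P = dr, 0.0, P_c
          while P > 1e-10 * P_c:
              rho = rho_of_P(P)
              dm = 4.0 * np.pi * r**2 * rho * dr
              dP = (-G * (rho + P / c**2)
                    * (m + 4.0 * np.pi * r**3 * P / c**2)
                    / (r**2 * (1.0 - 2.0 * G * m / (r * c**2)))) * dr
              m, P, r = m + dm, P + dP, r + dr
          return r, m

      R, M = tov_star(P_c=5.0e34)        # a toy central pressure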

  18. Multichannel Error Correction Code Decoder

    NASA Technical Reports Server (NTRS)

    1996-01-01

    NASA Lewis Research Center's Digital Systems Technology Branch has an ongoing program in modulation, coding, onboard processing, and switching. Recently, NASA completed a project to incorporate a time-shared decoder into the very-small-aperture terminal (VSAT) onboard-processing mesh architecture. The primary goal was to demonstrate a time-shared decoder for a regenerative satellite that uses asynchronous, frequency-division multiple access (FDMA) uplink channels, thereby identifying hardware and power requirements and fault-tolerant issues that would have to be addressed in an operational system. A secondary goal was to integrate and test, in a system environment, two NASA-sponsored, proof-of-concept hardware deliverables: the Harris Corp. high-speed Bose Chaudhuri-Hocquenghem (BCH) codec and the TRW multichannel demultiplexer/demodulator (MCDD). A beneficial byproduct of this project was the development of flexible, multichannel-uplink signal-generation equipment.

  19. Noiseless coding for the magnetometer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Lee, Jun-Ji

    1987-01-01

    Future unmanned space missions will continue to seek a full understanding of magnetic fields throughout the solar system. Severely constrained data rates during certain portions of these missions could limit the possible science return. This publication investigates the application of universal noiseless coding techniques to more efficiently represent magnetometer data without any loss in data integrity. Performance results indicated that compression factors of 2:1 to 6:1 can be expected. Feasibility for general deep space application was demonstrated by implementing a microprocessor breadboard coder/decoder using the Intel 8086 processor. The Comet Rendezvous Asteroid Flyby mission will incorporate these techniques in a buffer feedback, rate-controlled configuration. The characteristics of this system are discussed.

  20. C++ Coding Standards for the AMP Project

    SciTech Connect

    Evans, Thomas M; Clarno, Kevin T

    2009-09-01

    This document provides an initial starting point to define the C++ coding standards used by the AMP nuclear fuel performance integrated code project and a part of AMP's software development process. This document draws from the experiences, and documentation [1], of the developers of the Marmot Project at Los Alamos National Laboratory. Much of the software in AMP will be written in C++. The power of C++ can be abused easily, resulting in code that is difficult to understand and maintain. This document gives the practices that should be followed on the AMP project for all new code that is written. The intent is not to be onerous but to ensure that the code can be readily understood by the entire code team and serve as a basis for collectively defining a set of coding standards for use in future development efforts. At the end of the AMP development in fiscal year (FY) 2010, all developers will have experience with the benefits, restrictions, and limitations of the standards described and will collectively define a set of standards for future software development. External libraries that AMP uses do not have to meet these requirements, although we encourage external developers to follow these practices. For any code of which AMP takes ownership, the project will decide on any changes on a case-by-case basis. The practices that we are using in the AMP project have been in use in the Denovo project [2] for several years. The practices build on those given in References [3-5]; the practices given in these references should also be followed. Some of the practices given in this document can also be found in [6].

  1. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option

  2. Securing mobile code.

    SciTech Connect

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in depth analysis of a technique called 'white-boxing'. We put forth some new attacks and improvements

  3. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor

  4. Polar Code Validation

    DTIC Science & Technology

    1989-09-30

    [Garbled OCR fragment of the DTIC report documentation page and table of contents. Recoverable headings: Summary of POLAR Achievements; POLAR Code Physical Models; Structure of the Bipolar Plasma Sheath Generated by SPEAR I; The POLAR Code Wake Model: Comparison with In Situ Observations.]

  5. Parallelized tree-code for clusters of personal computers

    NASA Astrophysics Data System (ADS)

    Viturro, H. R.; Carpintero, D. D.

    2000-02-01

    We present a tree-code for integrating the equations of the motion of collisionless systems, which has been fully parallelized and adapted to run in several PC-based processors simultaneously, using the well-known PVM message passing library software. SPH algorithms, not yet included, may be easily incorporated to the code. The code is written in ANSI C; it can be freely downloaded from a public ftp site. Simulations of collisions of galaxies are presented, with which the performance of the code is tested.

  6. Coding for reliable satellite communications

    NASA Technical Reports Server (NTRS)

    Lin, S.

    1984-01-01

    Several error control coding techniques for reliable satellite communications were investigated to find algorithms for fast decoding of Reed-Solomon codes in terms of dual basis. The decoding of the (255,223) Reed-Solomon code, which is used as the outer code in the concatenated TDRSS decoder, was of particular concern.

  7. Criticality Code Validation Exercises with TSUNAMI

    SciTech Connect

    Rearden, Bradley T

    2007-01-01

    In the criticality code validation of common systems, many paths may exist to a correct bias, bias uncertainty, and upper subcritical limit. The challenge for the criticality analyst is to select an efficient, defensible, and safe methodology to consistently obtain the correct values. One method of testing criticality code validation techniques is to use a sample system with a known bias as a test application and determine whether the methods employed can reproduce the known bias. In this paper, a low-enriched uranium (LEU) lattice critical experiment with a known bias is used as the test application, and numerous other LEU experiments are used as the benchmarks for the criticality code validation exercises using traditional and advanced parametric techniques. The parameters explored are enrichment, energy of average lethargy causing fission (EALF), and the TSUNAMI integral index c_k, with experiments of varying degrees of similarity. This paper is an extension of a previously published summary.

  8. Sandia National Laboratories analysis code data base

    SciTech Connect

    Peterson, C.W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code "ownership" and release status, and references describing the physical models and numerical implementation.

  9. Multiple trellis coded modulation

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K. (Inventor); Divsalar, Dariush (Inventor)

    1990-01-01

    A technique for designing trellis codes to minimize bit error performance for a fading channel. The invention provides a criterion which may be used in the design of such codes which is significantly different from that used for additive white Gaussian noise channels. The method of multiple trellis coded modulation of the present invention comprises the steps of: (a) coding b bits of input data into s intermediate outputs; (b) grouping said s intermediate outputs into k groups of s_i intermediate outputs each, where the summation of all s_i is equal to s and k is equal to at least 2; (c) mapping each of said k groups of intermediate outputs into one of a plurality of symbols in accordance with a plurality of modulation schemes, one for each group, such that the first group is mapped in accordance with a first modulation scheme and the second group is mapped in accordance with a second modulation scheme; and (d) outputting each of said symbols to provide k output symbols for each b bits of input data.
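
    Steps (a)-(d) amount to a split-and-map pipeline: one encoder output is partitioned into k groups and each group drives its own constellation. A toy Python sketch of that pipeline, with a placeholder encoder and two PSK mappers standing in for the patent's generic modulation schemes (all parameters are illustrative):

      import numpy as np

      def psk_mapper(bits_per_symbol):
          """Map a tuple of bits to a point on an M-ary PSK circle."""
          M = 2 ** bits_per_symbol
          def mapper(bits):
              index = int("".join(str(b) for b in bits), 2)
              return np.exp(2j * np.pi * index / M)
          return mapper

      def mtcm_modulate(data_bits, encoder, group_sizes, mappers):
          """(a) code b bits into s outputs; (b) split into k groups;
          (c) map each group with its own scheme; (d) emit k symbols."""
          s_outputs = encoder(data_bits)
          symbols, start = [], 0
          for size, mapper in zip(group_sizes, mappers):
              symbols.append(mapper(s_outputs[start:start + size]))
              start += size
          return symbols

      # Toy "encoder": 2 data bits -> 5 coded bits (purely illustrative)
      toy_encoder = lambda b: [b[0], b[1], b[0] ^ b[1], b[0], b[1]]
      k_symbols = mtcm_modulate([1, 0], toy_encoder,
                                group_sizes=[2, 3],
                                mappers=[psk_mapper(2), psk_mapper(3)])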

  10. Electrical Circuit Simulation Code

    SciTech Connect

    Wix, Steven D.; Waters, Arlon J.; Shirley, David

    2001-08-09

    Massively-parallel electrical circuit simulation code. CHILESPICE is a massively-parallel, distributed-memory electrical circuit simulation tool that contains many enhanced radiation, time-based, and thermal features and models. Large-scale electronic circuit simulation. Shared memory, parallel processing, enhanced convergence. Sandia-specific device models.

  11. Code Optimization Techniques

    SciTech Connect

    MAGEE,GLEN I.

    2000-08-03

    Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data will be lost. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
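
    One of the standard optimizations for a Reed-Solomon encoder of this kind is to replace bit-by-bit Galois-field multiplication with log/antilog table lookups. A Python sketch of that particular technique (the field polynomial 0x11D is the common GF(2^8) choice; the abstract does not state AURA's actual parameters or which optimizations were employed):

      # Log/antilog tables for GF(2^8) with primitive polynomial 0x11D
      PRIM = 0x11D
      EXP, LOG = [0] * 512, [0] * 256
      x = 1
      for i in range(255):
          EXP[i], LOG[x] = x, i
          x <<= 1
          if x & 0x100:
              x ^= PRIM
      for i in range(255, 512):          # doubled table avoids a mod 255
          EXP[i] = EXP[i - 255]

      def gf_mul_slow(a, b):             # shift-and-add multiply (baseline)
          r = 0
          while b:
              if b & 1:
                  r ^= a
              a <<= 1
              if a & 0x100:
                  a ^= PRIM
              b >>= 1
          return r

      def gf_mul_fast(a, b):             # two lookups and one integer add
          if a == 0 or b == 0:
              return 0
          return EXP[LOG[a] + LOG[b]]

      assert all(gf_mul_slow(a, b) == gf_mul_fast(a, b)
                 for a in range(256) for b in range(256))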

  12. Dress Codes. Legal Brief.

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2000-01-01

    As illustrated by two recent decisions, the courts in the past decade have demarcated wide boundaries for school officials considering dress codes, whether in the form of selective prohibitions or required uniforms. Administrators must warn the community, provide legitimate justification and reasonable clarity, and comply with state law. (MLH)

  14. Dress Codes and Uniforms.

    ERIC Educational Resources Information Center

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices that adults agree with in their choice of school dress. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  15. Student Dress Codes.

    ERIC Educational Resources Information Center

    Uerling, Donald F.

    School officials see a need for regulations that prohibit disruptive and inappropriate forms of expression and attire; students see these regulations as unwanted restrictions on their freedom. This paper reviews court litigation involving constitutional limitations on school authority, dress and hair codes, state law constraints, and school…

  16. Video Coding for ESL.

    ERIC Educational Resources Information Center

    King, Kevin

    1992-01-01

    Coding tasks, a valuable technique for teaching English as a Second Language, are presented that enable students to look at patterns and structures of marital communication as well as objectively evaluate the degree of happiness or distress in the marriage. (seven references) (JL)

  17. Building Codes and Regulations.

    ERIC Educational Resources Information Center

    Fisher, John L.

    The hazard of fire is of great concern to libraries due to combustible books and new plastics used in construction and interiors. Building codes and standards can offer architects and planners guidelines to follow but these standards should be closely monitored, updated, and researched for fire prevention. (DS)

  18. Dual Coding in Children.

    ERIC Educational Resources Information Center

    Burton, John K.; Wildman, Terry M.

    The purpose of this study was to test the applicability of the dual coding hypothesis to children's recall performance. The hypothesis predicts that visual interference will have a small effect on the recall of visually presented words or pictures, but that acoustic interference will cause a decline in recall of visually presented words and…

  19. Coding for urologic office procedures.

    PubMed

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff.

  20. Accumulate Repeat Accumulate Coded Modulation

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, combined with high-level modulation. Thus, at the decoder, belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples to reliabilities of the bits.

  1. Validation of the G-PASS code : status report.

    SciTech Connect

    Vilim, R. B.; Nuclear Engineering Division

    2009-03-12

    Validation is the process of determining whether the models in a computer code can describe the important phenomena in applications of interest. This report describes past work and proposed future work for validating the Gas Plant Analyzer and System Simulator (G-PASS) code. The G-PASS code was developed for simulating gas reactor and chemical plant system behavior during operational transients and upset events. Results are presented comparing code properties, individual component models, and integrated system behavior against results from four other computer codes. Also identified are two experiment facilities nearing completion that will provide additional data for individual component and integrated system model validation. The main goal of the validation exercise is to ready a version of G-PASS for use as a tool in evaluating vendor designs and providing guidance to vendors on design directions in nuclear-hydrogen applications.

  2. Multifractal detrended cross-correlation analysis of coding and non-coding DNA sequences through chaos-game representation

    NASA Astrophysics Data System (ADS)

    Pal, Mayukha; Satish, B.; Srinivas, K.; Rao, P. Madhusudana; Manimaran, P.

    2015-10-01

    We propose a new approach combining the chaos game representation and the two-dimensional multifractal detrended cross-correlation analysis methods to examine multifractal behavior in power-law cross-correlation between any pair of nucleotide sequences of unequal lengths. In this work, we analyzed the characteristic behavior of coding and non-coding DNA sequences of eight prokaryotes. The results show the presence of strong multifractal nature between coding and non-coding sequences of all data sets. We found that this integrative approach helps us to consider complete DNA sequences for characterization, and further it may be useful for classification, clustering, identification of class affiliation of nucleotide sequences, etc., with high precision.
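
    Chaos-game representation turns a nucleotide string into a point set in the unit square by repeatedly moving halfway toward the corner assigned to each base; the 2-D histogram of that point set is the kind of surface the two-dimensional multifractal detrended cross-correlation analysis then operates on. A minimal sketch, assuming one common corner convention (conventions vary across papers):

      import numpy as np

      CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0),
                 "G": (1.0, 1.0), "T": (1.0, 0.0)}   # assumed convention

      def cgr_points(sequence):
          """Chaos-game trajectory of a DNA sequence in the unit square."""
          x, y = 0.5, 0.5                            # start at the center
          pts = []
          for base in sequence.upper():
              cx, cy = CORNERS[base]
              x, y = (x + cx) / 2.0, (y + cy) / 2.0  # halfway to the corner
              pts.append((x, y))
          return np.array(pts)

      def cgr_surface(sequence, k=6):
          """2^k x 2^k occupancy histogram of the CGR point set."""
          pts = cgr_points(sequence)
          hist, _, _ = np.histogram2d(pts[:, 0], pts[:, 1],
                                      bins=2 ** k, range=[[0, 1], [0, 1]])
          return hist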

  3. Coding Theory and Projective Spaces

    NASA Astrophysics Data System (ADS)

    Silberstein, Natalia

    2008-05-01

    The projective space of order n over a finite field F_q is the set of all subspaces of the vector space F_q^{n}. In this work, we consider error-correcting codes in the projective space, focusing mainly on constant dimension codes. We start with the different representations of subspaces in the projective space. These representations involve matrices in reduced row echelon form, associated binary vectors, and Ferrers diagrams. Based on these representations, we provide a new formula for the computation of the distance between any two subspaces in the projective space. We examine lifted maximum rank distance (MRD) codes, which are nearly optimal constant dimension codes. We prove that a lifted MRD code can be represented in such a way that it forms a block design known as a transversal design. The incidence matrix of the transversal design derived from a lifted MRD code can be viewed as a parity-check matrix of a linear code in the Hamming space. We find the properties of these codes, which can be viewed also as LDPC codes. We present new bounds and constructions for constant dimension codes. First, we present a multilevel construction for constant dimension codes, which can be viewed as a generalization of the lifted MRD code construction. This construction is based on a new type of rank-metric codes, called Ferrers diagram rank-metric codes. Then we derive upper bounds on the size of constant dimension codes which contain the lifted MRD code, and provide a construction for two families of codes that attain these upper bounds. We generalize the well-known concept of a punctured code for a code in the projective space to obtain large codes which are not constant dimension. We present efficient enumerative encoding and decoding techniques for the Grassmannian. Finally, we describe a search method for constant dimension lexicodes.
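
    The subspace distance underlying constant dimension codes is d(U, V) = dim U + dim V - 2 dim(U ∩ V), and dim(U ∩ V) follows from ranks alone because dim(U + V) is the rank of the stacked generator matrices. A small sketch over GF(2) (the thesis works over general F_q; binary is used here only for brevity):

      import numpy as np

      def rank_gf2(M):
          """Rank of a binary matrix over GF(2) via Gaussian elimination."""
          M = np.array(M, dtype=np.uint8) % 2
          rank = 0
          for col in range(M.shape[1]):
              pivot = next((r for r in range(rank, M.shape[0]) if M[r, col]), None)
              if pivot is None:
                  continue
              M[[rank, pivot]] = M[[pivot, rank]]       # swap pivot row up
              for r in range(M.shape[0]):
                  if r != rank and M[r, col]:
                      M[r] ^= M[rank]                   # eliminate the column
              rank += 1
          return rank

      def subspace_distance(U, V):
          """d(U, V) = dim U + dim V - 2 dim(U n V) for row spaces U, V."""
          dU, dV = rank_gf2(U), rank_gf2(V)
          d_sum = rank_gf2(np.vstack([U, V]))           # dim(U + V)
          return 2 * d_sum - dU - dV                    # same quantity, via ranks

      U = [[1, 0, 0, 0], [0, 1, 0, 0]]
      V = [[1, 0, 0, 0], [0, 0, 1, 0]]
      assert subspace_distance(U, V) == 2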

  4. Fundamentals of coding and reimbursement.

    PubMed

    Price, Paula

    2002-01-01

    After completing this introduction to radiology coding and reimbursement, readers will: Understand how health care reimbursement evolved over the past 50 years. Know the importance of documenting the patient's history. Have an overall picture of the standardized numerical coding system. Understand how accurate coding affects reimbursement. Understand coding functions as they pertain to regulatory compliance in the radiology department. Be familiar with the U.S. Justice Department's use of coding in tracking health care fraud.

  5. High Dimensional Trellis Coded Modulation

    DTIC Science & Technology

    2002-03-01

    [Garbled extraction fragments from the DTIC record. Recoverable content: the thesis concerns trellis-coded modulation and the decoding of turbo codes (parallel concatenated codes), which requires iteration between two permuted code sequences and exchange of the "extrinsic" information between constituent decoders, and is motivated by the development of turbo codes and the requirement of short-frame transmission.]

  6. Temporal Coding of Volumetric Imagery

    NASA Astrophysics Data System (ADS)

    Llull, Patrick Ryan

    'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the coded aperture compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively. The CACTI camera's ability to embed video volumes into images leads to exploration
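
    The CACTI measurement described here collapses T video frames into one coded snapshot: each frame is modulated by the translated aperture code, and the products are integrated on the detector. A toy NumPy version of that forward model (the mask statistics and one-pixel-per-frame shift are illustrative assumptions; recovering the frames from the snapshot is then a compressive-sensing inversion):

      import numpy as np

      rng = np.random.default_rng(0)

      def cacti_snapshot(video, mask):
          """Compress a (T, H, W) video into one (H, W) coded image.
          The mask translates by one row per frame, mimicking the
          physical translation of the coded aperture."""
          T, H, W = video.shape
          snap = np.zeros((H, W))
          for t in range(T):
              snap += np.roll(mask, t, axis=0) * video[t]
          return snap

      video = rng.random((8, 64, 64))                   # toy (T, H, W) volume
      mask = (rng.random((64, 64)) > 0.5).astype(float)
      y = cacti_snapshot(video, mask)                   # one detector readout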

  7. Free electron laser physical process code (FELPPC)

    SciTech Connect

    Thode, L.E.; Chan, K.C.D.; Schmitt, M.J.

    1995-02-01

    Even at the conceptual level, the strong coupling between subsystem elements complicates the understanding and design of a free electron laser (FEL). Given the requirements for high-performance FELs, the coupling between subsystems must be included to obtain a realistic picture of the potential operational capability. The concept of an Integrated Numerical Experiment (INEX) was implemented to accurately calculate the coupling between the FEL subsystems. During the late 1980s, the INEX approach was successfully applied to a large number of accelerator and FEL experiments. Unfortunately, because of significant manpower and computational requirements, the integrated approach is difficult to apply to trade-off and initial design studies. However, the INEX codes provided a base from which realistic accelerator, wiggler, optics, and control models could be developed. The Free Electron Laser Physical Process Code (FELPPC) includes models developed from the INEX codes, provides coupling between the subsystem models, and incorporates application models relevant to a specific study. In other words, FELPPC solves the complete physical process model using realistic physics and technology constraints. FELPPC can calculate complex FEL configurations including multiple accelerator and wiggler combinations. When compared with the INEX codes, the subsystem models have been found to be quite accurate over many orders of magnitude. As a result, FELPPC has been used for the initial design studies of a large number of FEL applications: high-average-power ground-, space-, plane-, and ship-based FELs; beacon and illuminator FELs; medical and compact FELs; and XUV FELs.

  8. LEGO: A Modular Accelerator Design Code

    NASA Astrophysics Data System (ADS)

    Cai, Y.; Irwin, J.

    1997-05-01

    An object-oriented accelerator design code has been designed and implemented in a simple and modular fashion. It contains all major features of its predecessors TRACY and DESPOT. All physics of single-particle dynamics is implemented based on the Hamiltonian in the local frame of the component. Components can be moved arbitrarily in three dimensional space. Several symplectic integrators are used to approximate the integration of the local Hamiltonians. A differential algebra class is introduced to extract a Taylor map up to an arbitrary order. Analysis of optics is done in the same way for both the linear and non-linear cases. Currently the code is used to design and simulate the lattices of the PEP-II. It will be used for the commissioning of the machines as well.
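
    The symplectic integrators mentioned here preserve the Hamiltonian phase-space structure, which is what keeps long tracking runs well behaved. A minimal kick-drift-kick (leapfrog) sketch for a separable Hamiltonian H = p^2/2 + V(q), shown purely as an illustration of the integrator class rather than LEGO's actual integrator set:

      def leapfrog(q, p, grad_V, dt, steps):
          """Second-order symplectic integration of
          dq/dt = p, dp/dt = -grad_V(q)."""
          for _ in range(steps):
              p -= 0.5 * dt * grad_V(q)   # half kick
              q += dt * p                 # full drift
              p -= 0.5 * dt * grad_V(q)   # half kick
          return q, p

      # Harmonic oscillator V(q) = q^2 / 2: energy stays bounded for long runs
      q, p = leapfrog(1.0, 0.0, lambda q: q, dt=0.1, steps=10000)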

  9. Cracking the code of change.

    PubMed

    Beer, M; Nohria, N

    2000-01-01

    Today's fast-paced economy demands that businesses change or die. But few companies manage corporate transformations as well as they would like. The brutal fact is that about 70% of all change initiatives fail. In this article, authors Michael Beer and Nitin Nohria describe two archetypes--or theories--of corporate transformation that may help executives crack the code of change. Theory E is change based on economic value: shareholder value is the only legitimate measure of success, and change often involves heavy use of economic incentives, layoffs, downsizing, and restructuring. Theory O is change based on organizational capability: the goal is to build and strengthen corporate culture. Most companies focus purely on one theory or the other, or haphazardly use a mix of both, the authors say. Combining E and O is directionally correct, they contend, but it requires a careful, conscious integration plan. Beer and Nohria present the examples of two companies, Scott Paper and Champion International, that used a purely E or purely O strategy to create change--and met with limited levels of success. They contrast those corporate transformations with that of UK-based retailer ASDA, which has successfully embraced the paradox between the opposing theories of change and integrated E and O. The lesson from ASDA? To thrive and adapt in the new economy, companies must make sure the E and O theories of business change are in sync at their own organizations.

  10. Frequency-coded quantum key distribution.

    PubMed

    Bloch, Matthieu; McLaughlin, Steven W; Merolla, Jean-Marc; Patois, Frédéric

    2007-02-01

    We report an intrinsically stable quantum key distribution scheme based on genuine frequency-coded quantum states. The qubits are efficiently processed without fiber interferometers by fully exploiting the nonlinear interaction occurring in electro-optic phase modulators. The system requires only integrated off-the-shelf devices and could be used with a true single-photon source. Preliminary experiments have been performed with weak laser pulses and have demonstrated the feasibility of this new setup.

  11. Quantum codes from linear codes over finite chain rings

    NASA Astrophysics Data System (ADS)

    Liu, Xiusheng; Liu, Hualu

    2017-10-01

    In this paper, we provide two methods of constructing quantum codes from linear codes over finite chain rings. The first one is derived from the Calderbank-Shor-Steane (CSS) construction applied to self-dual codes over finite chain rings. The second construction is derived from the CSS construction applied to Gray images of the linear codes over the finite chain ring F_{p^{2m}} + uF_{p^{2m}}. Good parameters of quantum codes from cyclic codes over finite chain rings are obtained.
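
    For reference, the field-level statement of the CSS construction applied to a self-orthogonal code, which the paper lifts to finite chain rings, is the following (standard form; the chain-ring version in the paper refines it):

      % CSS construction from a self-orthogonal classical code
      C \subseteq C^{\perp} \subseteq \mathbb{F}_q^{n}, \quad C \text{ an } [n,k]_q \text{ code}
      \;\Longrightarrow\;
      [[\, n,\; n - 2k,\; d \,]]_q \text{ with } d \ge d\!\left(C^{\perp}\right).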

  12. Acceleration of a Monte Carlo radiation transport code

    SciTech Connect

    Hochstedler, R.D.; Smith, L.M.

    1996-03-01

    Execution time for the Integrated TIGER Series (ITS) Monte Carlo radiation transport code has been reduced by careful re-coding of computationally intensive subroutines. Three test cases for the TIGER (1-D slab geometry), CYLTRAN (2-D cylindrical geometry), and ACCEPT (3-D arbitrary geometry) codes were identified and used to benchmark and profile program execution. Based upon these results, sixteen top time-consuming subroutines were examined and nine of them modified to accelerate computations with equivalent numerical output to the original. The results obtained via this study indicate that speedup factors of 1.90 for the TIGER code, 1.67 for the CYLTRAN code, and 1.11 for the ACCEPT code are achievable. © 1996 American Institute of Physics.

  13. The Nursing Code of Ethics: Its Value, Its History.

    PubMed

    Epstein, Beth; Turner, Martha

    2015-05-31

    To practice competently and with integrity, today's nurses must have in place several key elements that guide the profession, such as an accreditation process for education, a rigorous system for licensure and certification, and a relevant code of ethics. The American Nurses Association has guided and supported nursing practice through creation and implementation of a nationally accepted Code of Ethics for Nurses with Interpretive Statements. This article will discuss ethics in society, professions, and nursing and illustrate how a professional code of ethics can guide nursing practice in a variety of settings. We also offer a brief history of the Code of Ethics, discuss the modern Code of Ethics, and describe the importance of periodic revision, including the inclusive and thorough process used to develop the 2015 Code and a summary of recent changes. Finally, the article provides implications for practicing nurses to assure that this document is a dynamic, useful resource in a variety of healthcare settings.

  14. A coded modulation design for the INMARSAT geostationary GLONASS augmentation

    NASA Astrophysics Data System (ADS)

    Stein, B.; Tsang, W.

    A coded modulation scheme is proposed to carry out the Global Navigation Satellite System (GLONASS) geostationary augmentation, which includes both integrity and navigation functions, over the next-generation International Maritime Satellite Organization (INMARSAT) satellites. A baseline coded modulation scheme for the GLONASS augmentation broadcast proposes a forward error correction code over a differential phase shift keying (DPSK) modulation. The use of a concatenated code over the same signaling is considered. The proposed coded modulation design is more powerful and robust, yet not overly more complex in system implementation than the baseline scheme. Performance results of concatenated codes over DPSK signaling used in the design are presented. The sensitivity analysis methodology used in selecting the coded modulation scheme is also discussed.

  15. Computing Challenges in Coded Mask Imaging

    NASA Technical Reports Server (NTRS)

    Skinner, Gerald

    2009-01-01

    This slide presentation reviews the complications and challenges in developing computer systems for coded mask imaging telescopes. The coded mask technique is used when there is no other way to build the telescope (i.e., when wide fields of view are needed, energies are too high for focusing optics or too low for Compton/tracker techniques, and very good angular resolution is required). The coded mask telescope is described, and the mask is reviewed. The coded masks for the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) instruments are shown, and a chart showing the types of position-sensitive detectors used for coded mask telescopes is also reviewed. Slides describe the mechanism of recovering an image from the masked pattern. The correlation with the mask pattern is described. The matrix approach is reviewed, and other approaches to image reconstruction are described. Included in the presentation is a review of the Energetic X-ray Imaging Survey Telescope (EXIST) / High Energy Telescope (HET), with information about the mission, the operation of the telescope, a comparison of the EXIST/HET with the SWIFT/BAT, and details of the design of the EXIST/HET.
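
    The correlation step mentioned in the slides is the standard coded-mask reconstruction: cross-correlate the detector shadowgram with a decoding array derived from the mask, so each point source produces a peak at its sky position. A schematic NumPy version (a random 50%-open mask is used purely for illustration; real instruments such as INTEGRAL's use carefully designed URA-family patterns):

      import numpy as np
      from numpy.fft import fft2, ifft2

      rng = np.random.default_rng(1)
      mask = (rng.random((64, 64)) < 0.5).astype(float)   # 50%-open aperture
      decoder = 2.0 * mask - 1.0                          # balanced decoding array

      def shadowgram(sky, mask):
          """Each sky pixel casts a shifted copy of the mask pattern."""
          det = np.zeros_like(sky)
          for (i, j), flux in np.ndenumerate(sky):
              if flux:
                  det += flux * np.roll(np.roll(mask, i, 0), j, 1)
          return det

      def reconstruct(det, decoder):
          """Circular cross-correlation with the decoding array via FFTs."""
          return np.real(ifft2(fft2(det) * np.conj(fft2(decoder))))

      sky = np.zeros((64, 64))
      sky[10, 20], sky[40, 45] = 1.0, 0.5
      image = reconstruct(shadowgram(sky, mask), decoder)  # peaks at the sources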

  16. FESDIF -- Finite Element Scalar Diffraction theory code

    SciTech Connect

    Kraus, H.G.

    1992-09-01

    This document describes the theory and use of a powerful scalar diffraction theory based computer code for calculation of intensity fields due to diffraction of optical waves by two-dimensional planar apertures and lenses. This code is called FESDIF (Finite Element Scalar Diffraction). It is based upon both Fraunhofer and Kirchhoff scalar diffraction theories. Simplified routines for circular apertures are included. However, the real power of the code comes from its basis in finite element methods. These methods allow the diffracting aperture to be virtually any geometric shape, including the various secondary aperture obstructions present in telescope systems. Aperture functions, with virtually any phase and amplitude variations, are allowed in the aperture openings. Step change aperture functions are accommodated. The incident waves are considered to be monochromatic. Plane waves, spherical waves, or Gaussian laser beams may be incident upon the apertures. Both area and line integral transformations were developed for the finite element based diffraction transformations. There is some loss of aperture function generality in the line integral transformations which are typically many times more computationally efficient than the area integral transformations when applicable to a particular problem.

  17. A comparison of cosmological hydrodynamic codes

    NASA Technical Reports Server (NTRS)

    Kang, Hyesung; Ostriker, Jeremiah P.; Cen, Renyue; Ryu, Dongsu; Hernquist, Lars; Evrard, August E.; Bryan, Greg L.; Norman, Michael L.

    1994-01-01

    We present a detailed comparison of the simulation results of various hydrodynamic codes. Starting with identical initial conditions based on the cold dark matter scenario for the growth of structure, with parameters h = 0.5, Omega = Omega_b = 1, and sigma_8 = 1, we integrate from redshift z = 20 to z = 0 to determine the physical state within a representative volume of size L^3 where L = 64 h^-1 Mpc. Five independent codes are compared: three of them Eulerian mesh-based and two variants of the smooth particle hydrodynamics 'SPH' Lagrangian approach. The Eulerian codes were run at N^3 = 32^3, 64^3, 128^3, and 256^3 cells, the SPH codes at N^3 = 32^3 and 64^3 particles. Results were then rebinned to a 16^3 grid with the expectation that the rebinned data should converge, by all techniques, to a common and correct result as N approaches infinity. We find that global averages of various physical quantities do, as expected, tend to converge in the rebinned model, but that uncertainties in even primitive quantities such as <T> and <rho^2>^(1/2) persist at the 3%-17% level. The codes achieve comparable and satisfactory accuracy for comparable computer time in their treatment of the high-density, high-temperature regions as measured in the rebinned data; the variance among the five codes (at highest resolution) for the mean temperature (as weighted by rho^2) is only 4.5%. Examined at high resolution, we suspect that the density resolution is better in the SPH codes and the thermal accuracy in low-density regions better in the Eulerian codes. In the low-density, low-temperature regions the SPH codes have poor accuracy due to statistical effects, and the Jameson code gives temperatures which are too high, due to overuse of artificial viscosity in these high-Mach-number regions. Overall the comparison allows us to better estimate errors; it points to ways of improving this current generation of hydrodynamic codes.

  18. Binary coding for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Chang, Chein-I.; Chang, Chein-Chi; Lin, Chinsu

    2004-10-01

    Binary coding is one of the simplest ways to characterize spectral features. One commonly used method is a binary coding-based image software system, called Spectral Analysis Manager (SPAM), for remotely sensed imagery, developed by Mazer et al. For a given spectral signature, SPAM calculates its spectral mean and inter-band spectral difference and uses them as thresholds to generate a binary code word for this particular spectral signature. Such a coding scheme is generally effective and also very simple to implement. This paper revisits SPAM and further develops three new SPAM-based binary coding methods, called equal probability partition (EPP) binary coding, halfway partition (HP) binary coding, and median partition (MP) binary coding. These three binary coding methods along with SPAM will be evaluated for spectral discrimination and identification. In doing so, a new criterion, called a posteriori discrimination probability (APDP), is also introduced as a performance measure.
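
    As described, the SPAM code word thresholds each band against statistics of the signature; the MP variant simply swaps the mean for the median. A sketch of the mean-threshold part and the MP variant (the inter-band-difference bits and the EPP/HP partitions follow the same pattern and are omitted):

      import numpy as np

      def spam_code(spectrum):
          """Mean-threshold bit per band: value >= spectral mean -> 1."""
          return (spectrum >= spectrum.mean()).astype(np.uint8)

      def mp_code(spectrum):
          """Median-partition (MP) variant: threshold at the median."""
          return (spectrum >= np.median(spectrum)).astype(np.uint8)

      def hamming(a, b):
          """Code-word distance used for discrimination between signatures."""
          return int(np.count_nonzero(a != b))

      s1 = np.array([0.12, 0.35, 0.33, 0.80, 0.75])   # toy 5-band signatures
      s2 = np.array([0.60, 0.55, 0.20, 0.15, 0.70])
      d = hamming(spam_code(s1), spam_code(s2))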

  19. Epetra developers coding guidelines.

    SciTech Connect

    Heroux, Michael Allen; Sexton, Paul Michael

    2003-12-01

    Epetra is a package of classes for the construction and use of serial and distributed parallel linear algebra objects. It is one of the base packages in Trilinos. This document describes guidelines for Epetra coding style. The issues discussed here go beyond correct C++ syntax to address issues that make code more readable and self-consistent. The guidelines presented here are intended to aid current and future development of Epetra specifically. They reflect design decisions that were made in the early development stages of Epetra. Some of the guidelines are contrary to more commonly used conventions, but we choose to continue these practices for the purposes of self-consistency. These guidelines are intended to be complementary to policies established in the Trilinos Developers Guide.

  20. CTI Correction Code

    NASA Astrophysics Data System (ADS)

    Massey, Richard; Stoughton, Chris; Leauthaud, Alexie; Rhodes, Jason; Koekemoer, Anton; Ellis, Richard; Shaghoulian, Edgar

    2013-07-01

    Charge Transfer Inefficiency (CTI) due to radiation damage above the Earth's atmosphere creates spurious trailing in images from Charge-Coupled Device (CCD) imaging detectors. Radiation damage also creates unrelated warm pixels, which can be used to measure CTI. This code provides pixel-based correction for CTI and has proven effective on Hubble Space Telescope Advanced Camera for Surveys raw images, successfully reducing the CTI trails by a factor of ~30 everywhere in the CCD and at all flux levels. The core is written in Java for speed, and a front-end user interface is provided in IDL. The code operates on raw data by returning individual electrons to the pixels from which they were unintentionally dragged during readout. Correction takes about 25 minutes per ACS exposure, but is trivially parallelisable to multiple processors.
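
    Pixel-based CTI correction of this kind generally inverts a forward readout model: apply the model that adds trails, compare with the observed image, and update. A toy 1-D sketch of that fixed-point iteration, with a single-exponential trap species standing in for the code's actual multi-species CCD model:

      import numpy as np

      def add_trail(column, capture=0.05, release=0.3):
          """Toy forward model: traps capture a fraction of each pixel's
          charge and release it exponentially into trailing pixels."""
          out = column.astype(float).copy()
          trapped = 0.0
          for i in range(len(out)):          # readout order
              released = release * trapped
              captured = capture * out[i]
              out[i] += released - captured
              trapped += captured - released
          return out

      def correct_cti(observed, iterations=5):
          """Find x such that add_trail(x) ~= observed."""
          x = observed.copy()
          for _ in range(iterations):
              x += observed - add_trail(x)   # fixed-point update
          return x

      col = np.zeros(50)
      col[10] = 100.0                        # one bright pixel
      restored = correct_cti(add_trail(col)) # trail largely removed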

  1. The NIMROD Code

    NASA Astrophysics Data System (ADS)

    Schnack, D. D.; Glasser, A. H.

    1996-11-01

    NIMROD is a new code system that is being developed for the analysis of modern fusion experiments. It is being designed from the beginning to make the maximum use of massively parallel computer architectures and computer graphics. The NIMROD physics kernel solves the three-dimensional, time-dependent two-fluid equations with neo-classical effects in toroidal geometry of arbitrary poloidal cross section. The NIMROD system also includes a pre-processor, a grid generator, and a post processor. User interaction with NIMROD is facilitated by a modern graphical user interface (GUI). The NIMROD project is using Quality Function Deployment (QFD) team management techniques to minimize re-engineering and reduce code development time. This paper gives an overview of the NIMROD project. Operation of the GUI is demonstrated, and the first results from the physics kernel are given.

  2. WHPA Code available

    NASA Astrophysics Data System (ADS)

    The Wellhead Protection Area code is now available for distribution by the International Ground Water Modeling Center in Indianapolis, Ind. The WHPA code is a modular, semianalytical, groundwater flow model developed for the U.S. Environmental Protection Agency, Office of Ground Water Protection, designed to assist state and local technical staff with the task of Wellhead Protection Area (WHPA) delineation. A complete news item appeared in Eos, May 1, 1990, p. 690.The model consists of four independent, semianalytical modules that may be used to identify the areal extent of groundwater contribution to one or multiple pumping wells. One module is a general particle tracking program that may be used as a post-processor for two-dimensional, numerical models of groundwater flow. One module incorporates a Monte Carlo approach to investigate the effects of uncertain input parameters on capture zones. Multiple pumping and injection wells may be present and barrier or stream boundary conditions may be investigated.

  4. Sinusoidal transform coding

    NASA Technical Reports Server (NTRS)

    Mcaulay, Robert J.; Quatieri, Thomas F.

    1988-01-01

    It has been shown that an analysis/synthesis system based on a sinusoidal representation of speech leads to synthetic speech that is essentially perceptually indistinguishable from the original. Strategies for coding the amplitudes, frequencies and phases of the sine waves have been developed that have led to a multirate coder operating at rates from 2400 to 9600 bps. The encoded speech is highly intelligible at all rates with a uniformly improving quality as the data rate is increased. A real-time fixed-point implementation has been developed using two ADSP2100 DSP chips. The methods used for coding and quantizing the sine-wave parameters for operation at the various frame rates are described.
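
    Synthesis in a coder of this kind reconstructs each frame as a sum of sine waves from the coded amplitudes, frequencies, and phases. A minimal sketch of that synthesis step (frame overlap and parameter interpolation, which the real system handles carefully, are omitted):

      import numpy as np

      def synthesize_frame(amps, freqs_hz, phases, n_samples, fs=8000):
          """Sum-of-sinusoids reconstruction of one speech frame."""
          t = np.arange(n_samples) / fs
          frame = np.zeros(n_samples)
          for a, f, ph in zip(amps, freqs_hz, phases):
              frame += a * np.cos(2.0 * np.pi * f * t + ph)
          return frame

      # Toy voiced frame: 200 Hz fundamental plus two harmonics, 20 ms at 8 kHz
      frame = synthesize_frame([1.0, 0.5, 0.25], [200.0, 400.0, 600.0],
                               [0.0, 0.3, 0.7], n_samples=160)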

  5. Efficient convolutional sparse coding

    DOEpatents

    Wohlberg, Brendt

    2017-06-20

    Computationally efficient algorithms may be applied for fast dictionary learning solving the convolutional sparse coding problem in the Fourier domain. More specifically, efficient convolutional sparse coding may be derived within an alternating direction method of multipliers (ADMM) framework that utilizes fast Fourier transforms (FFT) to solve the main linear system in the frequency domain. Such algorithms may enable a significant reduction in computational cost over conventional approaches by implementing a linear solver for the most critical and computationally expensive component of the conventional iterative algorithm. The theoretical computational cost of the algorithm may be reduced from O(M^3 N) to O(MN log N), where N is the dimensionality of the data and M is the number of elements in the dictionary. This significant improvement in efficiency may greatly increase the range of problems that can practically be addressed via convolutional sparse representations.
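
    The efficiency claim rests on the convolution theorem: the reconstruction sum_m d_m * x_m, and the linear solves built on it, can be evaluated with pointwise products in the Fourier domain. A small NumPy sketch verifying that core identity (a full ADMM dictionary-learning loop is beyond this note):

      import numpy as np
      from numpy.fft import fft, ifft

      def circ_conv(a, b):
          """Naive circular convolution, O(N^2) per filter pair."""
          N = len(a)
          return np.array([sum(a[j] * b[(n - j) % N] for j in range(N))
                           for n in range(N)])

      rng = np.random.default_rng(2)
      N, M = 64, 8                            # signal length, dictionary size
      D = rng.standard_normal((M, N))         # zero-padded dictionary filters
      X = rng.standard_normal((M, N))         # coefficient maps

      # Frequency domain: pointwise products and one inverse FFT, O(MN log N)
      recon_fft = np.real(ifft((fft(D, axis=1) * fft(X, axis=1)).sum(axis=0)))
      recon_naive = sum(circ_conv(D[m], X[m]) for m in range(M))
      assert np.allclose(recon_fft, recon_naive)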

  6. Confocal coded aperture imaging

    DOEpatents

    Tobin, Jr., Kenneth William; Thomas, Jr., Clarence E.

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.

  7. The metaethics of nursing codes of ethics and conduct.

    PubMed

    Snelling, Paul C

    2016-10-01

    Nursing codes of ethics and conduct are features of professional practice across the world, and in the UK, the regulator has recently consulted on and published a new code. Initially part of a professionalising agenda, nursing codes have recently come to represent a managerialist and disciplinary agenda, and nursing can no longer be regarded as a self-regulating profession. This paper argues that codes of ethics and codes of conduct are significantly different in form and function, similar to the difference between ethics and law in everyday life. Some codes successfully integrate these two functions within the same document, while others, principally the UK Code, conflate them, resulting in an ambiguous document unable to fulfil its functions effectively. The paper analyses the differences between ethical-codes and conduct-codes by discussing titles, authorship, level, scope for disagreement, consequences of transgression, language and, finally and possibly most importantly, agent-centeredness. It is argued that conduct-codes cannot require nurses to be compassionate because compassion involves an emotional response. The concept of kindness provides a plausible alternative for conduct-codes as it is possible to understand it solely in terms of acts. But if kindness is required in conduct-codes, investigation and possible censure follows from its absence. Using examples, it is argued that there are at least five possible accounts of the absence of kindness. As well as being potentially problematic for disciplinary panels, difficulty in understanding the features of blameworthy absence of kindness may challenge UK nurses who, following a recently introduced revalidation procedure, are required to reflect on their practice in relation to The Code. It is concluded that closer attention to metaethical concerns by code writers will better support the functions of their issuing organisations.

  8. The Phantom SPH code

    NASA Astrophysics Data System (ADS)

    Price, Daniel; Wurster, James; Nixon, Chris

    2016-05-01

    I will present the capabilities of the Phantom SPH code for global simulations of dust and gas in protoplanetary discs. I will present our new algorithms for simulating both small and large grains in discs, as well as our progress towards simulating evolving grain populations and coupling with radiation. Finally, I will discuss our recent applications to HL Tau and the physics of dust gap opening.

  9. Status of MARS Code

    SciTech Connect

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator, and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, histogramming, the MAD-MARS Beam Line Builder, and a graphical user interface.

  10. The Tau Code

    PubMed Central

    Avila, Jesús

    2009-01-01

    In this short review, I will focus on how a unique tau gene may produce many tau isoforms through alternative splicing and how the phosphorylation of these isoforms by different kinases may affect their activity and behaviour. Indeed, each of the different tau isoforms may play a distinct role under both physiological and pathological conditions. Thus, I will discuss whether a tau code exists that might explain the involvement of different tau isoforms in different cellular functions. PMID:20552052

  11. Trajectory Code Studies, 1987

    SciTech Connect

    Poukey, J.W.

    1988-01-01

    The trajectory code TRAJ has been used extensively to study nonimmersed foilless electron diodes. The basic goal of the research is to design low-emittance injectors for electron linacs and propagation experiments. Systems studied during 1987 include Delphi, Recirc, and Troll. We also discuss a partly successful attempt to extend the same techniques to high currents (tens of kA). 7 refs., 30 figs.

  12. The PHARO Code.

    DTIC Science & Technology

    1981-11-24

    The PHARO code computes the visible and infrared radiation, including line and band transitions, produced by high-altitude nuclear events. Radiation (watts/sr) in arbitrary wavelength intervals is determined. The output consists of contour plots of radiative intensity (watts/cm² ster), or "isophot" plots, for arbitrarily placed cameras or sensors.

  13. HYCOM Code Development

    DTIC Science & Technology

    2003-02-10

    HYCOM code development. Alan J. Wallcraft, Naval Research Laboratory. Presented at the Layered Ocean Model Users' Workshop (LOM 2003), Miami, FL, February 10, 2003. Code developments include a Kraus-Turner mixed layer, an Energy-Loan (passive) ice model, high-frequency atmospheric forcing, a new I/O scheme (.a and .b files), and scalability.

  14. Reeds computer code

    NASA Technical Reports Server (NTRS)

    Bjork, C.

    1981-01-01

    The REEDS (rocket exhaust effluent diffusion single layer) computer code is used for the estimation of certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near real time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.

  15. Orthopedics coding and funding.

    PubMed

    Baron, S; Duclos, C; Thoreux, P

    2014-02-01

    The French tarification à l'activité (T2A) prospective payment system is a financial system in which a health-care institution's resources are based on performed activity. Activity is described via the PMSI medical information system (programme de médicalisation du système d'information). The PMSI classifies hospital cases by clinical and economic categories known as diagnosis-related groups (DRG), each with an associated price tag. Coding a hospital case involves giving as realistic a description as possible so as to categorize it in the right DRG and thus ensure appropriate payment. For this, it is essential to understand what determines the pricing of an inpatient stay: namely, the code for the surgical procedure, the patient's principal diagnosis (reason for admission), codes for comorbidities (everything that adds to the management burden), and the management of the length of inpatient stay. The PMSI is used to analyze the institution's activity and dynamism: change from the previous year, relation to target, and comparison with competing institutions based on indicators such as the mean length of stay performance indicator (MLS PI). The T2A system improves overall care efficiency. Quality of care, however, is not presently taken into account in the payment made to the institution, as there are no indicators for this; work needs to be done on this topic. Copyright © 2014. Published by Elsevier Masson SAS.

  16. Bar coded retroreflective target

    DOEpatents

    Vann, Charles S.

    2000-01-01

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and in up to six degrees of freedom. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam are measured to calculate the three-dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling calculation of the target position in all six degrees of freedom. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, lightweight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar-coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  17. The triple distribution of codes and ordered codes

    PubMed Central

    Trinker, Horst

    2011-01-01

    We study the distribution of triples of codewords of codes and ordered codes. Schrijver [A. Schrijver, New code upper bounds from the Terwilliger algebra and semidefinite programming, IEEE Trans. Inform. Theory 51 (8) (2005) 2859–2866] used the triple distribution of a code to establish a bound on the number of codewords based on semidefinite programming. In the first part of this work, we generalize this approach for ordered codes. In the second part, we consider linear codes and linear ordered codes and present a MacWilliams-type identity for the triple distribution of their dual code. Based on the non-negativity of this linear transform, we establish a linear programming bound and conclude with a table of parameters for which this bound yields better results than the standard linear programming bound. PMID:22505770

  18. Computer-Based Coding of Occupation Codes for Epidemiological Analyses

    PubMed Central

    Russ, Daniel E.; Ho, Kwan-Yuet; Johnson, Calvin A.; Friesen, Melissa C.

    2014-01-01

    Mapping job titles to standardized occupation classification (SOC) codes is an important step in evaluating changes in health risks over time as measured in inspection databases. However, manual SOC coding is cost-prohibitive for very large studies. Computer-based SOC coding systems can improve the efficiency of incorporating occupational risk factors into large-scale epidemiological studies. We present a novel method of mapping verbatim job titles to SOC codes using a large table of prior knowledge available in the public domain that includes detailed descriptions of the tasks and activities and their synonyms relevant to each SOC code. Job titles are compared to our knowledge base to find the closest-matching SOC code. A soft Jaccard index is used to measure the similarity between a previously unseen job title and the knowledge base. Additional information, such as standardized industrial codes, can be incorporated to improve the SOC code determination by providing additional context to break ties in matches. PMID:25221787
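
    A small sketch of one plausible "soft" Jaccard formulation: plain Jaccard counts exactly shared tokens, while the soft variant lets near-matching tokens contribute partial credit. The greedy pairing, difflib scoring, and 0.8 threshold are illustrative assumptions, not the authors' exact definition:

      from difflib import SequenceMatcher

      def token_sim(a, b):
          """Character-level similarity between two tokens, in [0, 1]."""
          return SequenceMatcher(None, a, b).ratio()

      def soft_jaccard(title_a, title_b, threshold=0.8):
          """Jaccard index in which tokens may match approximately."""
          A, B = title_a.lower().split(), title_b.lower().split()
          unused, inter = list(B), 0.0
          for a in A:
              if not unused:
                  break
              best = max(unused, key=lambda b: token_sim(a, b))
              s = token_sim(a, best)
              if s >= threshold:          # near-match adds partial credit
                  inter += s
                  unused.remove(best)
          return inter / (len(A) + len(B) - inter)

      print(soft_jaccard("registered nurse", "registered nursse"))  # high
      print(soft_jaccard("registered nurse", "truck driver"))       # near zero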

  19. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

    This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. The codes considered are APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code is described separately in the following section, along with its current modeling capabilities, level of validation, pre/post-processing, and future development and validation requirements. This report addresses only previously published validations of the codes; the codes have since been further developed to extend their capabilities.

  20. Design of convolutional tornado code

    NASA Astrophysics Data System (ADS)

    Zhou, Hui; Yang, Yao; Gao, Hongmin; Tan, Lu

    2017-09-01

    As a linear block code, the traditional tornado (tTN) code is inefficient in burst-erasure environments, and its multi-level structure may lead to high encoding/decoding complexity. This paper presents a convolutional tornado (cTN) code which is able to improve the burst-erasure protection capability by applying the convolution property to the tTN code, and to reduce computational complexity by abrogating the multi-level structure. The simulation results show that the cTN code can provide better packet-loss protection performance with lower computational complexity than the tTN code.

  1. Suboptimum decoding of block codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao

    1991-01-01

    This paper investigates a class of decomposable codes and their distance and structural properties. It is shown that this class includes several classes of well-known and efficient codes as subclasses. Several methods for constructing decomposable codes or decomposing codes are presented. A two-stage soft-decision decoding scheme for decomposable codes, their translates, or unions of translates is devised. This two-stage soft-decision decoding is suboptimum and provides an excellent trade-off between error performance and decoding complexity for codes of moderate and long block length.

  2. Construction of new quantum MDS codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

    Obtaining quantum maximum distance separable (MDS) codes from dual-containing classical constacyclic codes using the Hermitian construction has paved a path to undertake the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes have been constructed here. One set of parameters obtained in this paper achieves a much larger distance than earlier work. The remaining constructed parameters of quantum MDS codes have large minimum distance and had not been explored previously.

  3. FRAPCON-3: Integral assessment

    SciTech Connect

    Lanning, D.D.; Berna, G.A.

    1997-12-01

    An integral assessment has been performed for the U.S. Nuclear Regulatory Commission by Pacific Northwest National Laboratory to quantify the predictive capabilities of FRAPCON-3, a steady-state fuel behavior code designed to analyze fuel behavior from beginning-of-life to burnup levels of 65 GWd/MTU. FRAPCON-3 code calculations are shown to compare satisfactorily to a pre-selected set of experimental data with steady-state operating conditions. 30 refs., 27 figs., 18 tabs.

  4. Enhanced confidentiality using OCDM-based code scrambling and self-obscuration.

    PubMed

    Agarwal, A; Menendez, R; Toliver, P; Banwell, T; Jackel, J; Etemad, S

    2008-02-04

    Code scrambling combined with self-obscuration in a spectral phase encoded OCDM system shows promise for enhanced confidentiality in high data rate networks. We demonstrate code scrambling using reconfigurable ring-resonator-based integrated coders for obscuring a 20 Gb/s OCDM signal comprising four polarization-multiplexed coded tributaries.

  5. 21 CFR 11.300 - Controls for identification codes/passwords.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Controls for identification codes/passwords. 11... identification codes/passwords. Persons who use electronic signatures based upon use of identification codes in combination with passwords shall employ controls to ensure their security and integrity. Such controls...

  6. 21 CFR 11.300 - Controls for identification codes/passwords.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 1 2011-04-01 2011-04-01 false Controls for identification codes/passwords. 11... identification codes/passwords. Persons who use electronic signatures based upon use of identification codes in combination with passwords shall employ controls to ensure their security and integrity. Such controls...

  7. 21 CFR 11.300 - Controls for identification codes/passwords.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Controls for identification codes/passwords. 11... identification codes/passwords. Persons who use electronic signatures based upon use of identification codes in combination with passwords shall employ controls to ensure their security and integrity. Such controls...

  8. 21 CFR 11.300 - Controls for identification codes/passwords.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 1 2014-04-01 2014-04-01 false Controls for identification codes/passwords. 11... identification codes/passwords. Persons who use electronic signatures based upon use of identification codes in combination with passwords shall employ controls to ensure their security and integrity. Such controls...

  9. Convolutional coding techniques for data protection

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.

  10. The Integrated Hazard Analysis Integrator

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is a tendency to sometimes overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. As viewed from both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to these fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations who sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and sufficient qualifications of the integrator.

  11. A class of constacyclic BCH codes and new quantum codes

    NASA Astrophysics Data System (ADS)

    liu, Yang; Li, Ruihu; Lv, Liangdong; Ma, Yuena

    2017-03-01

    Constacyclic BCH codes have been widely studied in the literature and have been used to construct quantum codes in recent years. However, for the class of quantum codes of length n=q^{2m}+1 over F_{q^2} with q an odd prime power, only those of distance δ ≤ 2q^2 have been obtained in the literature. In this paper, through a detailed analysis of the properties of q^2-ary cyclotomic cosets, the maximum designed distance δ_{max} of a class of Hermitian dual-containing constacyclic BCH codes with length n=q^{2m}+1 is determined; this class of constacyclic codes has characteristics analogous to those of primitive BCH codes over F_{q^2}. We can then obtain a sequence of dual-containing constacyclic codes of designed distances 2q^2 < δ ≤ δ_{max}. Consequently, new quantum codes with distance d > 2q^2 can be constructed from these dual-containing codes via the Hermitian construction. These newly obtained quantum codes have better code rates than those constructed from primitive BCH codes.

  12. New optimal asymmetric quantum codes from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Zhang, Guanghui; Chen, Bocong; Li, Liangchen

    2014-06-01

    In this paper, we construct two classes of asymmetric quantum codes by using constacyclic codes. The first class is the asymmetric quantum codes with parameters [[q^2 + 1, q^2 + 1 - 2(t + k + 1), (2k + 2)/(2t + 2)]].

  13. New quantum MDS-convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Li, Fengwei; Yue, Qin

    2015-12-01

    In this paper, we utilize a family of Hermitian dual-containing constacyclic codes to construct classical and quantum MDS convolutional codes. Our classical and quantum convolutional codes are optimal in the sense that they attain the classical (quantum) generalized Singleton bound.

  14. Combinatorial neural codes from a mathematical coding theory perspective.

    PubMed

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
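
    A minimal sketch of the nearest-codeword (Hamming-distance) decoding used to probe a code's error-correcting ability; the toy binary codebook standing in for receptive field activity patterns is invented:

      import numpy as np

      # Toy binary codebook: each row is one codeword (minimum distance 3,
      # so any single bit flip is corrected).
      codebook = np.array([
          [0, 0, 0, 0, 0],
          [1, 1, 1, 0, 0],
          [0, 0, 1, 1, 1],
          [1, 1, 0, 1, 1],
      ])

      def decode(word):
          """Map a noisy binary word to the nearest codeword in Hamming distance."""
          dists = np.sum(codebook != word, axis=1)
          return codebook[np.argmin(dists)]

      noisy = np.array([1, 1, 1, 1, 0])  # codeword 1 with one bit flipped
      print(decode(noisy))               # -> [1 1 1 0 0]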

  15. Using Quick Response Codes in the Classroom: Quality Outcomes.

    PubMed

    Zurmehly, Joyce; Adams, Kellie

    2017-10-01

    With smart device technology emerging, educators are challenged with redesigning teaching strategies using technology to allow students to participate dynamically and provide immediate answers. To facilitate integration of technology and to actively engage students, quick response codes were included in a medical surgical lecture. Quick response codes are two-dimensional square patterns that enable the coding or storage of more than 7000 characters that can be accessed via a quick response code scanning application. The aim of this quasi-experimental study was to explore quick response code use in a lecture and measure students' satisfaction (met expectations, increased interest, helped understand, and provided practice and prompt feedback) and engagement (liked most, liked least, wanted changed, and kept involved), assessed using an investigator-developed instrument. Although there was no statistically significant correlation of quick response use to examination scores, satisfaction scores were high, and there was a small yet positive association between how students perceived their learning with quick response codes and overall examination scores. Furthermore, on open-ended survey questions, students responded that they were satisfied with the use of quick response codes, appreciated the immediate feedback, and planned to use them in the clinical setting. Quick response codes offer a way to integrate technology into the classroom to provide students with instant positive feedback.
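
    For instance, generating a quick response code that points students at a lecture resource takes only a few lines with the third-party Python qrcode package (an illustrative choice of library; the URL is hypothetical):

      import qrcode  # third-party package: pip install qrcode[pil]

      # Encode a link to lecture materials and save it as an image that can be
      # dropped onto a slide for students to scan.
      img = qrcode.make("https://example.edu/medsurg/lecture-quiz")
      img.save("lecture_quiz_qr.png")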

  16. A MCTF video coding scheme based on distributed source coding principles

    NASA Astrophysics Data System (ADS)

    Tagliasacchi, Marco; Tubaro, Stefano

    2005-07-01

    Motion Compensated Temporal Filtering (MCTF) has proved to be an efficient coding tool in the design of open-loop scalable video codecs. In this paper we propose an MCTF video coding scheme based on lifting, where the prediction step is implemented using PRISM (Power-efficient, Robust, hIgh compression Syndrome-based Multimedia coding), a video coding framework built on distributed source coding principles. We study the effect of integrating the update step at the encoder or at the decoder side. We show that the latter approach makes it possible to improve the quality of the side information exploited during decoding. We present the analytical results obtained by modeling the video signal along the motion trajectories as a first-order auto-regressive process. We show that the update step at the decoder halves the contribution of the quantization noise. We also include experimental results with real video data that demonstrate the potential of this approach when the video sequences are coded at low bitrates.

  17. Hanford meteorological station computer codes: Volume 9, The quality assurance computer codes

    SciTech Connect

    Burk, K.W.; Andrews, G.L.

    1989-02-01

    The Hanford Meteorological Station (HMS) was established in 1944 on the Hanford Site to collect and archive meteorological data and provide weather forecasts and related services for the Hanford Site. The HMS is located approximately 1/2 mile east of the 200 West Area and is operated by PNL for the US Department of Energy. Meteorological data are collected from various sensors and equipment located on and off the Hanford Site. These data are stored in databases on the Digital Equipment Corporation (DEC) VAX 11/750 at the HMS (hereafter referred to as the HMS computer). Files from those databases are routinely transferred to the Emergency Management System (EMS) computer at the Unified Dose Assessment Center (UDAC). To ensure the quality and integrity of the HMS data, a set of Quality Assurance (QA) computer codes has been written. The codes will be routinely used by the HMS system manager or the database custodian. The QA codes provide detailed output files that will be used in correcting erroneous data. The following sections in this volume describe the implementation and operation of the QA computer codes. The appendices contain detailed descriptions, flow charts, and source code listings of each computer code. 2 refs.

  18. Code CUGEL: A code to unfold Ge(Li) spectrometer polyenergetic gamma photon experimental distributions

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Born, U.

    1970-01-01

    A FORTRAN code was developed for the Univac 1108 digital computer to unfold polyenergetic gamma-photon experimental distributions from lithium-drifted germanium [Ge(Li)] semiconductor spectrometers. It was designed to analyze the combined continuous and monoenergetic gamma radiation field of radioisotope volumetric sources. The code generates the detector system response matrix function and applies it to monoenergetic spectral components discretely and to the continuum iteratively. It corrects for system drift, source decay, background, and detection efficiency. Results are presented in digital form for differential and integrated photon number and energy distributions, and for exposure dose.
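
    Unfolding amounts to inverting the detector response, measured = R @ true. A toy non-negative least-squares inversion, illustrating the idea only (CUGEL itself applies the response matrix to monoenergetic components discretely and to the continuum iteratively), might look like:

      import numpy as np
      from scipy.optimize import nnls

      rng = np.random.default_rng(2)

      # Toy response matrix: column j is the pulse-height response to a unit
      # flux in energy bin j (full-energy peak plus a flat low-energy tail).
      n = 20
      R = np.triu(0.02 * np.ones((n, n))) + np.diag(0.8 * np.ones(n))

      true_spectrum = np.zeros(n)
      true_spectrum[[5, 12]] = [100.0, 40.0]   # two monoenergetic lines

      measured = R @ true_spectrum + rng.normal(0.0, 0.5, n)

      # Unfold with a non-negativity constraint (photon counts cannot be negative).
      unfolded, _ = nnls(R, measured)
      print(np.round(unfolded, 1))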

  19. On lossless coding for HEVC

    NASA Astrophysics Data System (ADS)

    Gao, Wen; Jiang, Minqiang; Yu, Haoping

    2013-02-01

    In this paper, we first review the lossless coding mode in version 1 of the HEVC standard, which was recently finalized. We then provide a performance comparison between the lossless coding modes in the HEVC and MPEG-AVC/H.264 standards and show that HEVC lossless coding has limited coding efficiency. To improve the performance of the lossless coding mode, several new coding tools that were contributed to JCT-VC but not adopted in version 1 of the HEVC standard are introduced. In particular, we discuss sample-based intra prediction and coding of residual coefficients in more detail. At the end, we briefly address a new class of coding tools, i.e., a dictionary-based coder, that is efficient in encoding screen content including graphics and text.

  20. Summary of 1990 Code Conference

    SciTech Connect

    Cooper, R.K.; Chan, Kwok-Chi D.

    1990-01-01

    The Conference on Codes and the Linear Accelerator Community was held in Los Alamos in January 1990 and had approximately 100 participants. This conference was the second in a series whose goal is the exchange of information about codes and code practices among those writing and actually using these codes for the design and analysis of linear accelerators and their components. The first conference was held in San Diego in January 1988 and concentrated on beam dynamics codes and Maxwell solvers. This most recent conference concentrated on 3-D codes and techniques to handle the large amounts of data required for three-dimensional problems. In addition to descriptions of codes, their algorithms, and implementations, there were a number of papers describing the use of many of the codes. Proceedings of both these conferences are available. 3 refs., 2 tabs.

  1. ENSDF ANALYSIS AND UTILITY CODES.

    SciTech Connect

    BURROWS, T.

    2005-04-04

    The ENSDF analysis and checking codes are briefly described, along with their uses with various types of ENSDF datasets. For more information on the programs see "Read Me" entries and other documentation associated with each code.

  2. Chemical Laser Computer Code Survey,

    DTIC Science & Technology

    1980-12-01

    DOCUMENTATION: Resonator Geometry Synthesis Code Requirement (V. L. Gamiz); Incorporate General Resonator into Ray Trace Code (W. H. Southwell); Synthesis Code Development (L. R. Stidham); Optimization Algorithms and Equations (W. H. Southwell). CATEGORY: optics, kinetics, gasdynamics. LEVEL: simple Fabry-Perot; simple saturated gain.

  3. On quantum codes obtained from cyclic codes over A2

    NASA Astrophysics Data System (ADS)

    Dertli, Abdullah; Cengellenmis, Yasemin; Eren, Senol

    2015-05-01

    In this paper, quantum codes from cyclic codes over A2 = F2 + uF2 + vF2 + uvF2, with u^2 = u, v^2 = v, uv = vu, have been constructed for arbitrary length n. It is shown that if C is self-orthogonal over A2, then so is Ψ(C), where Ψ is a Gray map. A necessary and sufficient condition for cyclic codes over A2 that contain their duals has also been given. Finally, the parameters of quantum error-correcting codes are obtained from cyclic codes over A2.

  4. Code stroke in Asturias.

    PubMed

    Benavente, L; Villanueva, M J; Vega, P; Casado, I; Vidal, J A; Castaño, B; Amorín, M; de la Vega, V; Santos, H; Trigo, A; Gómez, M B; Larrosa, D; Temprano, T; González, M; Murias, E; Calleja, S

    2016-04-01

    Intravenous thrombolysis with alteplase is an effective treatment for ischaemic stroke when applied during the first 4.5 hours, but less than 15% of patients have access to this technique. Mechanical thrombectomy is more frequently able to recanalise proximal occlusions in large vessels, but the infrastructure it requires makes it even less available. We describe the implementation of code stroke in Asturias, as well as the process of adapting various existing resources for urgent stroke care in the region. By considering these resources, and the demographic and geographic circumstances of our region, we examine ways of reorganising the code stroke protocol that would optimise treatment times and provide the most appropriate treatment for each patient. We distributed the 8 health districts in Asturias so as to permit referral of candidates for reperfusion therapies to either of the 2 hospitals with 24-hour stroke units and on-call neurologists and providing IV fibrinolysis. Hospitals were assigned according to proximity and stroke severity; the most severe cases were immediately referred to the hospital with on-call interventional neurology care. Patient triage was provided by pre-hospital emergency services according to the NIHSS score. Modifications to code stroke in Asturias have allowed us to apply reperfusion therapies with good results, while emphasising equitable care and managing the severity-time ratio to offer the best and safest treatment for each patient as soon as possible. Copyright © 2015 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.

  5. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  6. Improved code-tracking loop

    NASA Technical Reports Server (NTRS)

    Laflame, D. T.

    1980-01-01

    Delay-locked loop tracks pseudonoise codes without introducing dc timing errors, because it is not sensitive to gain imbalance between signal processing arms. "Early" and "late" reference codes pass in combined form through both arms, and each arm acts on both codes. Circuit accommodates 1 dB weaker input signals with tracking ability equal to that of tau-dither loops.
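
    A sketch of the generic early-late discriminator that delay-locked loops are built on (not this circuit's combined-arm variant): the sign of the early-minus-late correlation tells the loop which way to slide its local code replica:

      import numpy as np

      rng = np.random.default_rng(3)
      code = rng.choice([-1.0, 1.0], size=1023)   # stand-in PN code

      def correlate_at(received, delay):
          return np.dot(received, np.roll(code, delay)) / code.size

      # Received signal: the code delayed by 3 chips, plus noise.
      received = np.roll(code, 3) + 0.3 * rng.standard_normal(code.size)

      delay_est, spacing = 0, 1                   # +/- 1 chip early/late spacing
      for _ in range(20):
          early = correlate_at(received, delay_est + spacing)
          late = correlate_at(received, delay_est - spacing)
          delay_est += int(np.sign(early - late)) # slide toward the stronger arm
      print(delay_est)                            # settles near 3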

  7. Validation of the BEPLATE code

    SciTech Connect

    Giles, G.E.; Bullock, J.S.

    1997-11-01

    The electroforming simulation code BEPLATE (Boundary Element-PLATE) has been developed and validated for specific applications at Oak Ridge. New areas of application are opening up and more validations are being performed. This paper reports the validation experience of the BEPLATE code on two types of electroforms and describes some recent applications of the code.

  8. Coding Major Fields of Study.

    ERIC Educational Resources Information Center

    Bobbitt, L. G.; Carroll, C. D.

    The National Center for Education Statistics conducts surveys which require the coding of the respondent's major field of study. This paper presents a new system for the coding of major field of study. It operates on-line in a Computer Assisted Telephone Interview (CATI) environment and allows conversational checks to verify coding directly from…

  9. Energy Codes and Standards: Facilities

    SciTech Connect

    Bartlett, Rosemarie; Halverson, Mark A.; Shankle, Diana L.

    2007-01-01

    Energy codes and standards play a vital role in the marketplace by setting minimum requirements for energy-efficient design and construction. They outline uniform requirements for new buildings as well as additions and renovations. This article covers basic knowledge of codes and standards; development processes of each; adoption, implementation, and enforcement of energy codes and standards; and voluntary energy efficiency programs.

  10. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  11. Generating Customized Verifiers for Automatically Generated Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2008-01-01

    Program verification using Hoare-style techniques requires many logical annotations. We have previously developed a generic annotation inference algorithm that weaves in all annotations required to certify safety properties for automatically generated code. It uses patterns to capture generator- and property-specific code idioms and property-specific meta-program fragments to construct the annotations. The algorithm is customized by specifying the code patterns and integrating them with the meta-program fragments for annotation construction. However, this is difficult since it involves tedious and error-prone low-level term manipulations. Here, we describe an annotation schema compiler that largely automates this customization task using generative techniques. It takes a collection of high-level declarative annotation schemas tailored towards a specific code generator and safety property, and generates all customized analysis functions and glue code required for interfacing with the generic algorithm core, thus effectively creating a customized annotation inference algorithm. The compiler raises the level of abstraction and simplifies schema development and maintenance. It also takes care of some more routine aspects of formulating patterns and schemas, in particular handling of irrelevant program fragments and irrelevant variance in the program structure, which reduces the size, complexity, and number of different patterns and annotation schemas that are required. The improvements described here make it easier and faster to customize the system to a new safety property or a new generator, and we demonstrate this by customizing it to certify frame safety of space flight navigation code that was automatically generated from Simulink models by MathWorks' Real-Time Workshop.

  12. Quantum Codes From Cyclic Codes Over The Ring R2

    NASA Astrophysics Data System (ADS)

    Altinel, Alev; Güzeltepe, Murat

    2016-10-01

    Let R2 denote the ring F2 + μF2 + υF2 + μυF2 + wF2 + μwF2 + υwF2 + μυwF2. In this study, we construct quantum codes from cyclic codes over the ring R2, for arbitrary length n, with the restrictions μ^2 = 0, υ^2 = 0, w^2 = 0, μυ = υμ, μw = wμ, υw = wυ and μ(υw) = (μυ)w. Also, we give a necessary and sufficient condition for cyclic codes over R2 that contain their duals. As a final point, we obtain the parameters of quantum error-correcting codes from cyclic codes over R2 and we give an example of quantum error-correcting codes from cyclic codes over R2.

  13. Structured error recovery for code-word-stabilized quantum codes

    NASA Astrophysics Data System (ADS)

    Li, Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-01

    Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3^t times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.

  14. The random coding bound is tight for the average code.

    NASA Technical Reports Server (NTRS)

    Gallager, R. G.

    1973-01-01

    The random coding bound of information theory provides a well-known upper bound to the probability of decoding error for the best code of a given rate and block length. The bound is constructed by upperbounding the average error probability over an ensemble of codes. The bound is known to give the correct exponential dependence of error probability on block length for transmission rates above the critical rate, but it gives an incorrect exponential dependence at rates below a second lower critical rate. Here we derive an asymptotic expression for the average error probability over the ensemble of codes used in the random coding bound. The result shows that the weakness of the random coding bound at rates below the second critical rate is due not to upperbounding the ensemble average, but rather to the fact that the best codes are much better than the average at low rates.
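
    For reference, the bound under discussion is usually written (standard textbook form, with Q the input distribution and P(j|k) the channel transition probabilities) as:

      \bar{P}_e \le e^{-N E_r(R)}, \qquad
      E_r(R) = \max_{0 \le \rho \le 1} \max_{Q} \bigl[ E_0(\rho, Q) - \rho R \bigr],

      E_0(\rho, Q) = -\ln \sum_{j} \Bigl( \sum_{k} Q(k)\, P(j \mid k)^{1/(1+\rho)} \Bigr)^{1+\rho}.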

  15. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given according to Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  16. Structured error recovery for code-word-stabilized quantum codes

    SciTech Connect

    Li Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-15

    Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3^t times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.

  17. Low Density Parity Check Codes: Bandwidth Efficient Channel Coding

    NASA Technical Reports Server (NTRS)

    Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu

    2003-01-01

    Low Density Parity Check (LDPC) Codes provide near-Shannon-capacity performance for NASA missions. These codes have high coding rates R = 0.82 and 0.875 with moderate code lengths, n = 4096 and 8176. Their decoders have inherently parallel structures which allow for high-speed implementation. Two codes based on Euclidean Geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure. This results in power and size benefits. These codes also have a large minimum distance, as much as d_min = 65, giving them powerful error-correcting capabilities and very low BER error floors. This paper will present development of the LDPC flight encoder and decoder, its applications and status.
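
    A minimal illustration of the parity-check structure underlying LDPC codes: a binary word x is a codeword exactly when H x = 0 (mod 2), and a nonzero syndrome flags errors. The tiny H below is illustrative, not one of the Euclidean-geometry flight codes:

      import numpy as np

      # Small sparse parity-check matrix (illustrative only).
      H = np.array([
          [1, 1, 0, 1, 0, 0],
          [0, 1, 1, 0, 1, 0],
          [1, 0, 1, 0, 0, 1],
      ])

      def syndrome(x):
          """All-zero syndrome means x satisfies every parity check."""
          return H @ x % 2

      codeword = np.array([1, 0, 1, 1, 1, 0])
      print(syndrome(codeword))     # [0 0 0] -> valid codeword

      received = codeword.copy()
      received[2] ^= 1              # flip one bit in transit
      print(syndrome(received))     # nonzero checks point to the error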

  18. FAST GYROSYNCHROTRON CODES

    SciTech Connect

    Fleishman, Gregory D.; Kuznetsov, Alexey A.

    2010-10-01

    Radiation produced by charged particles gyrating in a magnetic field is highly significant in the astrophysics context. Persistently increasing resolution of astrophysical observations calls for corresponding three-dimensional modeling of the radiation. However, available exact equations are prohibitively slow in computing a comprehensive table of high-resolution models required for many practical applications. To remedy this situation, we develop approximate gyrosynchrotron (GS) codes capable of quickly calculating the GS emission (in the non-quantum regime) from both isotropic and anisotropic electron distributions in non-relativistic, mildly relativistic, and ultrarelativistic energy domains, applicable throughout a broad range of source parameters including dense or tenuous plasmas and weak or strong magnetic fields. The computation time is reduced by several orders of magnitude compared with the exact GS algorithm. The new algorithm's performance can gradually be adjusted to the user's needs depending on whether precision or computation speed is to be optimized for a given model. The codes are made available for users as a supplement to this paper.

  1. Genetic code for sine

    NASA Astrophysics Data System (ADS)

    Abdullah, Alyasa Gan; Wah, Yap Bee

    2015-02-01

    The computation of approximate values of the trigonometric sines was discovered by Bhaskara I (c. 600-c. 680), a seventh-century Indian mathematician, and is known as Bhaskara I's sine approximation formula. The formula is given in his treatise titled Mahabhaskariya. In the 14th century, Madhava of Sangamagrama, a Kerala mathematician-astronomer, constructed a table of trigonometric sines of various angles. Madhava's table gives the measure of angles in arcminutes, arcseconds and sixtieths of an arcsecond. The search for more accurate formulas led to the discovery of the power series expansion by Madhava of Sangamagrama (c. 1350-c. 1425), the founder of the Kerala school of astronomy and mathematics. In 1715, the Taylor series was introduced by Brook Taylor, an English mathematician. If the Taylor series is centered at zero, it is called a Maclaurin series, named after the Scottish mathematician Colin Maclaurin. Some of the important Maclaurin series expansions include trigonometric functions. This paper introduces the genetic code of the sine of an angle without using power series expansion. The genetic code using the square root approach reveals the pattern in the signs (plus, minus) and sequence of numbers in the sine of an angle. The square root approach complements the Pythagoras method, provides a better understanding of calculating an angle and will be useful for teaching the concepts of angles in trigonometry.
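
    Bhaskara I's approximation itself is compact enough to state and test directly: for x in degrees with 0 <= x <= 180, sin(x) is approximated by 4x(180 - x) / (40500 - x(180 - x)). A quick check in Python:

      import math

      def bhaskara_sin(x_deg):
          """Bhaskara I's 7th-century rational approximation to sin(x), x in degrees."""
          t = x_deg * (180 - x_deg)
          return 4 * t / (40500 - t)

      for x in (30, 45, 60, 90):
          print(x, bhaskara_sin(x), math.sin(math.radians(x)))
      # The maximum absolute error over [0, 180] is below 0.002.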

  2. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  3. Determinate-state convolutional codes

    NASA Technical Reports Server (NTRS)

    Collins, O.; Hizlan, M.

    1991-01-01

    A determinate-state convolutional code is formed from a conventional convolutional code by pruning away some of the possible state transitions in the decoding trellis. The type of staged power transfer used in determinate-state convolutional codes proves to be an extremely efficient way of enhancing the performance of a concatenated coding system. The decoder complexity and the free distances of these new codes are analyzed, and extensive simulation results are provided on their performance at the low signal-to-noise ratios where a real communication system would operate. Concise, practical examples are provided.

  4. Coding for reliable satellite communications

    NASA Technical Reports Server (NTRS)

    Gaarder, N. T.; Lin, S.

    1986-01-01

    This research project was set up to study various kinds of coding techniques for error control in satellite and space communications for NASA Goddard Space Flight Center. During the project period, researchers investigated the following areas: (1) decoding of Reed-Solomon codes in terms of dual basis; (2) concatenated and cascaded error control coding schemes for satellite and space communications; (3) use of hybrid coding schemes (error correction and detection incorporated with retransmission) to improve system reliability and throughput in satellite communications; (4) good codes for simultaneous error correction and error detection, and (5) error control techniques for ring and star networks.

  5. Recent advances in the CONTAIN code

    SciTech Connect

    Bergeron, K.D.; Carroll, D.E.; Gelbard, F.; Murata, K.K.; Valdez, G.D.; Washington, K.E.

    1987-10-01

    An update is given on very recent developments involving CONTAIN, the USNRC's principal mechanistic code for severe accident containment analysis. First, the features of two major new releases of CONTAIN are outlined. Revision 1.06 was released in February; the major improvements include full integration of the CORCON and VANESA models for debris-concrete interactions and concomitant aerosol generation, more detailed and more flexible radiation heat transfer options, and a number of minor improvements. The most recent new version of the code is CONTAIN 1.1, which was released in October. The principal new features relate to the Boiling Water Reactor. In particular, working models are included for Pressure Suppression Pools and Safety Relief Valves. In addition, this version of the code has a much-improved treatment of fission product hosting, user-defined material property options, and a number of other improvements. A second major area of progress involves the aerosol models. Previously, numerical diffusion limited the accuracy of the calculation of the concentration of the smallest particles, and there was no accounting for the effects of soluble salts or surface tension on the equilibrium water vapor pressure. All of these problems have now been solved with a stand-alone aerosol modeling code which uses a suite of new numerical approaches. The new methods have been incorporated into CONTAIN. Example calculations are presented. 7 refs., 4 figs., 1 tab.

  6. Circular codes, symmetries and transformations.

    PubMed

    Fimmel, Elena; Giannerini, Simone; Gonzalez, Diego Luis; Strüngmann, Lutz

    2015-06-01

    Circular codes, putative remnants of primeval comma-free codes, have gained considerable attention in recent years. In fact they represent a second kind of genetic code potentially involved in detecting and maintaining the normal reading frame in protein coding sequences. The discovery of a universal code across species has suggested many theoretical and experimental questions. However, there is a key aspect that relates circular codes to symmetries and transformations that remains to a large extent unexplored. In this article we aim at addressing the issue by studying the symmetries and transformations that connect different circular codes. The main result is that the class of 216 C3 maximal self-complementary codes can be partitioned into 27 equivalence classes defined by a particular set of transformations. We show that such transformations can be put in a group-theoretic framework with an intuitive geometric interpretation. More general mathematical results about symmetry transformations which are valid for any kind of circular codes are also presented. Our results pave the way to the study of the biological consequences of the mathematical structure behind circular codes and contribute to shed light on the evolutionary steps that led to the observed symmetries of present codes.
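
    As a concrete aid, self-complementarity (one of the symmetries defining the class of codes discussed) is mechanical to check: the reverse complement of every codon must itself belong to the code. The small codon sets below are invented for illustration and are not among the 216 C3 codes:

      COMP = {"A": "T", "T": "A", "C": "G", "G": "C"}

      def reverse_complement(codon):
          return "".join(COMP[b] for b in reversed(codon))

      def is_self_complementary(code):
          """True when the reverse complement of every codon is also in the code."""
          return all(reverse_complement(c) in code for c in code)

      print(is_self_complementary({"AAC", "GTT", "GAG", "CTC"}))  # True
      print(is_self_complementary({"AAC", "GAG"}))                # False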

  7. Integrated Analysis of Long Non-coding RNAs (LncRNAs) and mRNA Expression Profiles Reveals the Potential Role of LncRNAs in Skeletal Muscle Development of the Chicken

    PubMed Central

    Li, Zhenhui; Ouyang, Hongjia; Zheng, Ming; Cai, Bolin; Han, Peigong; Abdalla, Bahareldin A.; Nie, Qinghua; Zhang, Xiquan

    2017-01-01

    Long non-coding RNAs (lncRNAs) play important roles in transcriptional and post-transcriptional regulation. However, little is currently known about the mechanisms by which they regulate skeletal muscle development in the chicken. In this study, we used RNA sequencing to profile the leg muscle transcriptome (lncRNA and mRNA) at three stages of skeletal muscle development in the chicken: embryonic day 11 (E11), embryonic day 16 (E16), and 1 day after hatching (D1). In total, 129, 132, and 45 differentially expressed lncRNAs, and 1798, 3072, and 1211 differentially expressed mRNAs were identified in comparisons of E11 vs. E16, E11 vs. D1, and E16 vs. D1, respectively. Moreover, we identified the cis- and trans-regulatory target genes of differentially expressed lncRNAs, and constructed lncRNA-gene interaction networks. In total, 126 and 200 cis-targets, and two and three trans-targets were involved in lncRNA-gene interaction networks that were constructed based on the E11 vs. E16, and E11 vs. D1 comparisons, respectively. The comparison of the E16 vs. D1 lncRNA-gene network comprised 25 cis-targets. We determined that lncRNA target genes are potentially involved in cellular development, and cellular growth and proliferation using Ingenuity Pathway Analysis. The gene networks identified for the E11 vs. D1 comparison were involved in embryonic development, organismal development and tissue development. The present study provides an RNA sequencing based evaluation of lncRNA function during skeletal muscle development in the chicken. Comprehensive analysis facilitated the identification of lncRNAs and target genes that might contribute to the regulation of different stages of skeletal muscle development. PMID:28119630

  8. Subspace-Aware Index Codes

    DOE PAGES

    Kailkhura, Bhavya; Theagarajan, Lakshmi Narasimhan; Varshney, Pramod K.

    2017-04-12

    In this paper, we generalize the well-known index coding problem to exploit the structure in the source data to improve system throughput. In many applications (e.g., multimedia), the data to be transmitted may lie (or can be well approximated) in a low-dimensional subspace. We exploit this low-dimensional structure of the data using an algebraic framework to solve the index coding problem (referred to as subspace-aware index coding) as opposed to the traditional index coding problem, which is subspace-unaware. Also, we propose an efficient algorithm based on the alternating minimization approach to obtain near-optimal index codes for both the subspace-aware and -unaware cases. In conclusion, our simulations indicate that under certain conditions, a significant throughput gain (about 90%) can be achieved by subspace-aware index codes over conventional subspace-unaware index codes.
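
    The gain that conventional index coding exploits is easiest to see in the two-client case: if client 1 wants x2 but has x1 cached, and client 2 wants x1 but has x2, a single broadcast of x1 XOR x2 serves both. A toy sketch:

      import numpy as np

      rng = np.random.default_rng(4)
      x1 = rng.integers(0, 2, 8)   # packet wanted by client 2, cached at client 1
      x2 = rng.integers(0, 2, 8)   # packet wanted by client 1, cached at client 2

      broadcast = x1 ^ x2          # one coded transmission instead of two

      recovered_by_1 = broadcast ^ x1   # client 1 cancels its cached x1 to get x2
      recovered_by_2 = broadcast ^ x2   # client 2 cancels its cached x2 to get x1
      assert (recovered_by_1 == x2).all() and (recovered_by_2 == x1).all()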

  9. Coded Apertures in Mass Spectrometry.

    PubMed

    Amsden, Jason J; Gehm, Michael E; Russell, Zachary E; Chen, Evan X; Di Dona, Shane T; Wolter, Scott D; Danell, Ryan M; Kibelka, Gottfried; Parker, Charles B; Stoner, Brian R; Brady, David J; Glass, Jeffrey T

    2017-06-12

    The use of coded apertures in mass spectrometry can break the trade-off between throughput and resolution that has historically plagued conventional instruments. Despite their very early stage of development, coded apertures have been shown to increase throughput by more than one order of magnitude, with no loss in resolution in a simple 90-degree magnetic sector. This enhanced throughput can increase the signal level with respect to the underlying noise, thereby significantly improving sensitivity to low concentrations of analyte. Simultaneous resolution can be maintained, preventing any decrease in selectivity. Both one- and two-dimensional (2D) codes have been demonstrated. A 2D code can provide increased measurement diversity and therefore improved numerical conditioning of the mass spectrum that is reconstructed from the coded signal. This review discusses the state of development, the applications where coding is expected to provide added value, and the various instrument modifications necessary to implement coded apertures in mass spectrometers.
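
    A toy one-dimensional analogue of the coded-aperture principle may help: each open slit of a binary code projects a shifted copy of the spectrum onto the detector, multiplying throughput, and the spectrum is then recovered by least squares. The code, geometry, and noise level below are invented for illustration; real instruments use carefully designed codes and calibrated forward models.

        # 1D coded-aperture toy: measure a spectrum through a binary slit code,
        # then reconstruct it by least squares. Illustrative parameters only.
        import numpy as np

        rng = np.random.default_rng(1)
        n = 64
        spectrum = np.zeros(n)
        spectrum[[10, 25, 40]] = [1.0, 0.6, 0.8]        # three mass peaks

        code = np.array([1, 0, 1, 1, 0, 0, 1], float)   # hypothetical slit pattern
        A = np.zeros((n + len(code) - 1, n))            # detector sums shifted copies
        for k, open_slit in enumerate(code):
            if open_slit:
                A[k:k + n, :] += np.eye(n)

        y = A @ spectrum + 0.01 * rng.normal(size=A.shape[0])   # coded, noisy data
        x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
        print("recovered peak heights:", x_hat[[10, 25, 40]].round(2))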

  10. Visual pattern image sequence coding

    NASA Technical Reports Server (NTRS)

    Silsbee, Peter; Bovik, Alan C.; Chen, Dapang

    1990-01-01

    The visual pattern image coding (VPIC) configurable digital image-coding process is capable of coding with visual fidelity comparable to the best available techniques, at compression ratios (30-40:1) that exceed those of all other technologies. These capabilities are associated with unprecedented coding efficiencies: coding and decoding operations are entirely linear with respect to image size and run 1-2 orders of magnitude faster than any previous high-compression technique. The visual pattern image sequence coding considered here exploits all the advantages of static VPIC while also reducing information along the additional, temporal dimension, achieving unprecedented image sequence coding performance.
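
    A sketch in the spirit of the block-pattern idea behind VPIC: each small block is coded either as its mean (uniform block) or as its mean plus the index of the best-matching edge pattern from a tiny codebook. The two-pattern codebook and threshold here are hypothetical; the actual VPIC codebook and decision rules differ.

        # Toy block-pattern coder: mean-only for flat blocks, mean + pattern
        # index + gradient estimate for edge blocks. Not the real VPIC codebook.
        import numpy as np

        B = 4
        patterns = np.stack([
            np.repeat([[0.], [0.], [1.], [1.]], B, axis=1),  # horizontal edge
            np.repeat([[0., 0., 1., 1.]], B, axis=0),        # vertical edge
        ])

        def code_block(block, threshold=10.0):
            mean = block.mean()
            residual = block - mean
            if np.abs(residual).max() < threshold:
                return ("uniform", mean)                 # flat block: mean only
            # Pick the pattern whose shape best correlates with the residual.
            scores = [(residual * (p - p.mean())).sum() for p in patterns]
            best = int(np.argmax(np.abs(scores)))
            return ("edge", mean, best, scores[best] / (B * B))

        rng = np.random.default_rng(2)
        block = np.vstack([np.full((2, B), 50.0), np.full((2, B), 150.0)])
        print(code_block(block + rng.normal(size=(B, B))))  # horizontal-edge block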

  11. Orthogonal coding of object location.

    PubMed

    Knutsen, Per Magne; Ahissar, Ehud

    2009-02-01

    It has long been debated whether internal representations are encoded using a single universal code ('the neural code') or multiple codes. Here, we review a series of experiments demonstrating that tactile encoding of object location via whisking employs an orthogonal, triple-code scheme. Rats and other rodents actively move their whiskers back and forth to localize and identify objects. Neural recordings from primary sensory afferents, along with behavioral observations, demonstrate that the vertical coordinates of contacted objects are encoded by the identity of activated afferents, horizontal coordinates by the timing of activation, and radial coordinates by the intensity of activation. Because these codes are mutually independent, the three-dimensional location of an object could, in principle, be encoded by individual afferents during single whisker-object contacts. One advantage of such a same-neuron-different-codes scheme over the traditionally assumed same-code-different-neurons scheme is a reduction in code ambiguity that, in turn, simplifies decoding circuits.
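
    The orthogonality of the three codes can be made concrete with a schematic decoder: a single afferent's identity, spike timing, and firing intensity are read out independently as vertical, horizontal, and radial coordinates. All mappings and units below are invented for illustration; they are not fits to the reviewed data.

        # Schematic triple-code decoder; all gains and labels are hypothetical.
        from dataclasses import dataclass

        @dataclass
        class ContactEvent:
            afferent_id: int        # which afferent fired  -> vertical coordinate
            spike_time_ms: float    # timing in whisk cycle -> horizontal coordinate
            spike_rate_hz: float    # response intensity    -> radial coordinate

        def decode_location(event: ContactEvent):
            vertical = {0: "dorsal", 1: "middle", 2: "ventral"}[event.afferent_id]
            horizontal_deg = 0.5 * event.spike_time_ms        # made-up gain
            radial_mm = 30.0 / max(event.spike_rate_hz, 1.0)  # stronger = closer (toy)
            return vertical, horizontal_deg, radial_mm

        print(decode_location(ContactEvent(1, 40.0, 15.0)))
        # -> ('middle', 20.0, 2.0)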

  12. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now contains over 340 codes and continues to grow; in 2011 it added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  13. Making your code citable with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2016-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feed for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.

  14. Color-Coded Organelles.

    ERIC Educational Resources Information Center

    McLaughlin, Esther; And Others

    1994-01-01

    Describes how red beets can be used to demonstrate a variety of membrane phenomena. Some of the activities include observation of vacuoles; vacuoles in intact cells; isolation of vacuoles in physiological studies; demonstration of membrane integrity; and demonstration of ion diffusion and active transport with purified vacuoles. (ZWH)

  16. The CMA Code of Ethics: more room for reflection.

    PubMed

    Kenny, N P

    1996-10-15

    Codes of ethics stand as a promise to society about the integrity of a profession in return for the power and authority given to that profession by society. The revised CMA Code of Ethics (see pages 1176A to 1176B) is timely and significant and should be applauded and supported by all physicians. It speaks clearly to competence, high standards of practice and communication, and the importance of informed patient choice. Nonetheless, the code provides no systematic justification for the principles it asserts. Although these principles are helpful tools, they are insufficient to resolve major ethical dilemmas. The code provides no means of ordering ethical priorities and fails to address issues such as peer review and conflict of interest. It is deafeningly silent on both abortion and euthanasia. In view of these limitations, the code must be seen as an important but unfinished reflection on the essence of being a good physician.

  17. The Application of the PEBBED Code Suite to the PBMR-400 Coupled Code Benchmark - FY 2006 Annual Report

    SciTech Connect

    Not Available

    2006-09-01

    This document describes the recent developments of the PEBBED code suite and its application to the PBMR-400 Coupled Code Benchmark. This report addresses an FY2006 Level 2 milestone under the NGNP Design and Evaluation Methods Work Package. The milestone states "Complete a report describing the results of the application of the integrated PEBBED code package to the PBMR-400 coupled code benchmark". The report describes the current state of the PEBBED code suite, provides an overview of the Benchmark problems to which it was applied, discusses the code developments achieved in the past year, and states some of the results attained. Results of the steady state problems generated by the PEBBED fuel management code compare favorably to the preliminary results generated by codes from other participating institutions and to similar non-Benchmark analyses. Partial transient analysis capability has been achieved through the acquisition of the NEM-THERMIX code from Penn State University. Phase I of the task has been achieved through the development of a self-consistent set of tools for generating cross sections for design and transient analysis and in the successful execution of the steady state benchmark exercises.

  18. A Simple Tight Bound on Error Probability of Block Codes with Application to Turbo Codes

    NASA Astrophysics Data System (ADS)

    Divsalar, D.

    1999-07-01

    A simple bound on the probability of decoding error for block codes is derived in closed form. This bound is based on the bounding techniques developed by Gallager. We obtained upper bounds on both the word-error probability and the bit-error probability of block codes. The bound is simple, since it does not require any integration or optimization in its final version. The bound is tight, since it works for signal-to-noise ratios (SNRs) very close to the Shannon capacity limit. The bound uses only the weight distribution of the code. For nonrandom codes, the bound is tighter than the original Gallager bound and its newer versions derived by Sason and Shamai and by Viterbi and Viterbi. It is also tighter than the recent simpler bound by Viterbi and Viterbi, and simpler than the bound by Duman and Salehi, which requires two-parameter optimization. For long blocks, it competes well with more complex bounds that involve integration and parameter optimization, such as the tangential sphere bound by Poltyrev, elaborated by Sason and Shamai and investigated by Viterbi and Viterbi, and the geometry bound by Dolinar, Ekroot, and Pollara. We also obtained a closed-form expression for the minimum SNR threshold, which can serve as a tight upper bound on the maximum-likelihood capacity of nonrandom codes. We have also shown that this minimum SNR threshold of our bound is the same as that of the tangential sphere bound of Poltyrev. We applied this simple bound to turbo-like codes.
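
    For context, the simplest bound of this weight-distribution type is the standard union bound for BPSK signaling over an AWGN channel (the abstract's Gallager-style bound is tighter than this near the capacity limit):

        P_w \le \sum_{d = d_{\min}}^{n} A_d \, Q\!\left(\sqrt{\frac{2 d R E_b}{N_0}}\right)

    where A_d is the number of codewords of Hamming weight d, R is the code rate, and Q(\cdot) is the Gaussian tail function.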

  19. Peripheral coding of taste

    PubMed Central

    Liman, Emily R.; Zhang, Yali V.; Montell, Craig

    2014-01-01

    Five canonical tastes, bitter, sweet, umami (amino acid), salty, and sour (acid), are detected by animals as diverse as fruit flies and humans, consistent with a near-universal drive to consume fundamental nutrients and to avoid toxins or other harmful compounds. Surprisingly, despite this strong conservation of basic taste qualities between vertebrates and invertebrates, the receptors and signaling mechanisms that mediate taste in each are highly divergent. The identification over the last two decades of receptors and other molecules that mediate taste has led to stunning advances in our understanding of the basic mechanisms of transduction and coding of information by the gustatory systems of vertebrates and invertebrates. In this review, we discuss recent advances in taste research, mainly from the fly and mammalian systems, and we highlight principles that are common across species, despite stark differences in receptor types. PMID:24607224

  20. Surface code quantum communication.

    PubMed

    Fowler, Austin G; Wang, David S; Hill, Charles D; Ladd, Thaddeus D; Van Meter, Rodney; Hollenberg, Lloyd C L

    2010-05-07

    Quantum communication typically involves a linear chain of repeater stations, each capable of reliable local quantum computation and connected to their nearest neighbors by unreliable communication links. The communication rate of existing protocols is low as two-way classical communication is used. By using a surface code across the repeater chain and generating Bell pairs between neighboring stations with probability of heralded success greater than 0.65 and fidelity greater than 0.96, we show that two-way communication can be avoided and quantum information can be sent over arbitrary distances with arbitrarily low error at a rate limited only by the local gate speed. This is achieved by using the unreliable Bell pairs to measure nonlocal stabilizers and feeding heralded failure information into post-transmission error correction. Our scheme also applies when the probability of heralded success is arbitrarily low.
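
    A minimal Monte Carlo sketch of the heralded-link idea: each repeater link attempts Bell-pair generation every local gate cycle and, because failure is heralded, simply retries until it succeeds, so the whole chain is ready after only a few cycles. The chain length and success probability below are arbitrary, and the protocol's stabilizer measurements and error correction are not modeled.

        # Heralded Bell-pair generation across a repeater chain (toy model).
        import random

        random.seed(3)
        LINKS, P_SUCCESS = 20, 0.65   # chain length and heralded success probability

        def cycles_until_chain_ready():
            # Count local gate cycles until every link holds a Bell pair,
            # retrying heralded failures independently on each link.
            ready = [False] * LINKS
            cycles = 0
            while not all(ready):
                cycles += 1
                for i in range(LINKS):
                    if not ready[i] and random.random() < P_SUCCESS:
                        ready[i] = True
            return cycles

        trials = [cycles_until_chain_ready() for _ in range(1000)]
        print("mean cycles to ready the chain:", sum(trials) / len(trials))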