Science.gov

Sample records for integrated codes hedric

  1. Parameters used in the environmental pathways and radiological dose modules (DESCARTES, CIDER, and CRD codes) of the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC)

    SciTech Connect

    Snyder, S.F.; Farris, W.T.; Napier, B.A.; Ikenberry, T.A.; Gilbert, R.O.

    1994-05-01

    This letter report is a description of work performed for the Hanford Environmental Dose Reconstruction (HEDR) Project. The HEDR Project was established to estimate the radiation doses to individuals resulting from releases of radionuclides from the Hanford Site during the period of 1944 to 1992. This work is being done by staff at Battelle, Pacific Northwest Laboratories under a contract with the Centers for Disease Control and Prevention with technical direction provided by an independent Technical Steering Panel (TSP).

  2. Parameters used in the environmental pathways (DESCARTES) and radiological dose (CIDER) modules of the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC) for the air pathway

    SciTech Connect

    Snyder, S.F.; Farris, W.T.; Napier, B.A.; Ikenberry, T.A.; Gilbert, R.O.

    1992-09-01

    This letter report is a description of work performed for the Hanford Environmental Dose Reconstruction (HEDR) Project. The HEDR Project was established to estimate the radiation doses to individuals resulting from releases of radionuclides from the Hanford Site since 1944. This work is being done by staff at Battelle, Pacific Northwest Laboratories (Battelle) under a contract with the Centers for Disease Control (CDC) with technical direction provided by an independent Technical Steering Panel (TSP). The objective of this report is to document the environmental accumulation and dose-assessment parameters that will be used to estimate the impacts of past Hanford Site airborne releases. During 1993, dose estimates made by staff at Battelle will be used by the Fred Hutchinson Cancer Research Center as part of the Hanford Thyroid Disease Study (HTDS). This document contains information on parameters that are specific to the airborne release of the radionuclide iodine-131. Future versions of this document will include parameter information pertinent to other pathways and radionuclides.

  3. Integrated Codes for Estimating Environmental Accumulation and Individual Dose from Past Hanford Atmospheric Releases: Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Ikenberry, T. A.; Burnett, R. A.; Napier, B. A.; Reitz, N. A.; Shipler, D. B.

    1992-02-01

    Preliminary radiation doses were estimated and reported during Phase I of the Hanford Environmental Dose Reconstruction (HEDR) Project. As the project has progressed, additional information regarding the magnitude and timing of past radioactive releases has been developed, and the general scope of the required calculations has been enhanced. The overall HEDR computational model for computing doses attributable to atmospheric releases from Hanford Site operations is called HEDRIC (Hanford Environmental Dose Reconstruction Integrated Codes). It consists of four interrelated models: source term, atmospheric transport, environmental accumulation, and individual dose. The source term and atmospheric transport models are documented elsewhere. This report describes the initial implementation of the design specifications for the environmental accumulation model and computer code, called DESCARTES (Dynamic EStimates of Concentrations and Accumulated Radionuclides in Terrestrial Environments), and the individual dose model and computer code, called CIDER (Calculation of Individual Doses from Environmental Radionuclides). The computations required of these models and the design specifications for their codes were documented in Napier et al. (1992). Revisions to the original specifications and the basis for modeling decisions are explained. This report is not the final code documentation but gives the status of the model and code development to date. Final code documentation is scheduled to be completed in FY 1994 following additional code upgrades and refinements. The user's guide included in this report describes the operation of the environmental accumulation and individual dose codes and associated pre- and post-processor programs. A programmer's guide describes the logical structure of the programs and their input and output files.

  4. Verification of FANTASTIC integrated code

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1987-01-01

    FANTASTIC is an acronym for Failure Analysis Nonlinear Thermal and Structural Integrated Code. This program was developed by Failure Analysis Associates, Palo Alto, Calif., for MSFC to improve the accuracy of solid rocket motor nozzle analysis. FANTASTIC has three modules: FACT - thermochemical analysis; FAHT - heat transfer analysis; and FAST - structural analysis. All modules have keywords for data input. Work is in progress on the verification of the FAHT module, which is done by using data for various problems with known solutions as inputs. The information obtained is used to identify problem areas of the code and is passed on to the developer for debugging purposes. Failure Analysis Associates has revised the first version of the FANTASTIC code, and a new, improved version has been released to the Thermal Systems Branch.

  5. CBP PHASE I CODE INTEGRATION

    SciTech Connect

    Smith, F.; Brown, K.; Flach, G.; Sarkar, S.

    2011-09-30

    The goal of the Cementitious Barriers Partnership (CBP) is to develop a reasonable and credible set of software tools to predict the structural, hydraulic, and chemical performance of cement barriers used in nuclear applications over extended time frames (greater than 100 years for operating facilities and greater than 1000 years for waste management). The simulation tools will be used to evaluate and predict the behavior of cementitious barriers used in near surface engineered waste disposal systems including waste forms, containment structures, entombments, and environmental remediation. These cementitious materials are exposed to dynamic environmental conditions that cause changes in material properties via (i) aging, (ii) chloride attack, (iii) sulfate attack, (iv) carbonation, (v) oxidation, and (vi) primary constituent leaching. A set of state-of-the-art software tools has been selected as a starting point to capture these important aging and degradation phenomena. Integration of existing software developed by the CBP partner organizations was determined to be the quickest method of meeting the CBP goal of providing a computational tool that improves the prediction of the long-term behavior of cementitious materials. These partner codes were selected based on their maturity and ability to address the problems outlined above. The GoldSim Monte Carlo simulation program (GTG 2010a, GTG 2010b) was chosen as the code integration platform (Brown & Flach 2009b). GoldSim (current Version 10.5) is a Windows based graphical object-oriented computer program that provides a flexible environment for model development (Brown & Flach 2009b). The linking of GoldSim to external codes has previously been successfully demonstrated (Eary 2007, Mattie et al. 2007). GoldSim is capable of performing deterministic and probabilistic simulations and of modeling radioactive decay and constituent transport. As part of the CBP project, a general Dynamic Link Library (DLL) interface was

  6. The Fireball integrated code package

    SciTech Connect

    Dobranich, D.; Powers, D.A.; Harper, F.T.

    1997-07-01

    Many deep-space satellites contain a plutonium heat source. An explosion, during launch, of a rocket carrying such a satellite offers the potential for the release of some of the plutonium. The fireball following such an explosion exposes any released plutonium to a high-temperature chemically-reactive environment. Vaporization, condensation, and agglomeration processes can alter the distribution of plutonium-bearing particles. The Fireball code package simulates the integrated response of the physical and chemical processes occurring in a fireball and the effect these processes have on the plutonium-bearing particle distribution. This integrated treatment of multiple phenomena represents a significant improvement in the state of the art for fireball simulations. Preliminary simulations of launch-second scenarios indicate: (1) most plutonium vaporization occurs within the first second of the fireball; (2) large non-aerosol-sized particles contribute very little to plutonium vapor production; (3) vaporization and both homogeneous and heterogeneous condensation occur simultaneously; (4) homogeneous condensation transports plutonium down to the smallest-particle sizes; (5) heterogeneous condensation precludes homogeneous condensation if sufficient condensation sites are available; and (6) agglomeration produces larger-sized particles but slows rapidly as the fireball grows.

  7. The Integrated TIGER Series Codes

    SciTech Connect

    Kensek, Ronald P.; Franke, Brian C.; Laub, Thomas W.

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  8. MINET (momentum integral network) code documentation

    SciTech Connect

    Van Tuyle, G J; Nepsee, T C; Guppy, J G

    1989-12-01

    The MINET computer code, developed for the transient analysis of fluid flow and heat transfer, is documented in this four-part reference. In Part 1, the MINET models, which are based on a momentum integral network method, are described. The various aspects of utilizing the MINET code are discussed in Part 2, The User's Manual. The third part is a code description, detailing the basic code structure and the various subroutines and functions that make up MINET. In Part 4, example input decks, as well as recent validation studies and applications of MINET are summarized. 32 refs., 36 figs., 47 tabs.

  9. Parameters used in the environmental pathways (DESCARTES) and radiological dose (CIDER) modules of the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC) for the air pathway. Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Snyder, S.F.; Farris, W.T.; Napier, B.A.; Ikenberry, T.A.; Gilbert, R.O.

    1992-09-01

    This letter report is a description of work performed for the Hanford Environmental Dose Reconstruction (HEDR) Project. The HEDR Project was established to estimate the radiation doses to individuals resulting from releases of radionuclides from the Hanford Site since 1944. This work is being done by staff at Battelle, Pacific Northwest Laboratories (Battelle) under a contract with the Centers for Disease Control (CDC) with technical direction provided by an independent Technical Steering Panel (TSP). The objective of this report is to document the environmental accumulation and dose-assessment parameters that will be used to estimate the impacts of past Hanford Site airborne releases. During 1993, dose estimates made by staff at Battelle will be used by the Fred Hutchinson Cancer Research Center as part of the Hanford Thyroid Disease Study (HTDS). This document contains information on parameters that are specific to the airborne release of the radionuclide iodine-131. Future versions of this document will include parameter information pertinent to other pathways and radionuclides.

  10. An integrated radiation physics computer code system.

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Harris, D. W.

    1972-01-01

    An integrated computer code system for the semi-automatic and rapid analysis of experimental and analytic problems in gamma photon and fast neutron radiation physics is presented. Such problems as the design of optimum radiation shields and radioisotope power source configurations may be studied. The system codes allow for the unfolding of complex neutron and gamma photon experimental spectra. Monte Carlo and analytic techniques are used for the theoretical prediction of radiation transport. The system includes a multichannel pulse-height analyzer scintillation and semiconductor spectrometer coupled to an on-line digital computer with appropriate peripheral equipment. The system is geometry generalized as well as self-contained with respect to material nuclear cross sections and the determination of the spectrometer response functions. Input data may be either analytic or experimental.

  11. The UNC Charlotte Code of Student Academic Integrity.

    ERIC Educational Resources Information Center

    Toenjes, Richard H.

    The University of North Carolina at Charlotte has developed a Code of Student Academic Integrity as a solution for problems of academic dishonesty in large, young, public, urban universities where traditional honor codes will not work. The Code defines violations and gives typical examples. It provides a friendly settlement procedure in which a…

  12. Integrating Proactive Discipline Practices into Codes of Conduct

    ERIC Educational Resources Information Center

    Fenning, Pamela; Theodos, Jennifer; Benner, Courtney; Bohanon-Edmonson, Hank

    2004-01-01

    The purpose of this article is to advocate for proactive content in discipline codes of conduct. Proactive discipline codes integrate Positive Behavior Support (PBS) strategies (Sugai & Horner, 2002) and attend to the academic needs of students (McEvoy & Welker, 2000). Proactive discipline codes of conduct have meaningful participation by key…

  13. Definite Integrals, Some Involving Residue Theory Evaluated by Maple Code

    SciTech Connect

    Bowman, Kimiko o

    2010-01-01

    The calculus of residues is applied to evaluate certain integrals over the range $(-\infty,\infty)$ using the Maple symbolic code. These integrals are of the form $\int_{-\infty}^{\infty}\cos(x)/[(x^{2}+a^{2})(x^{2}+b^{2})(x^{2}+c^{2})]\,dx$ and similar extensions. The Maple code is also applied to expressions for maximum likelihood estimator moments when sampling from the negative binomial distribution. In general the Maple code approach to the integrals gives correct answers to specified decimal places, but the symbolic result may be extremely long and complex.
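
    As an illustration of the kind of closed form involved (a standard residue computation, not quoted from the report), closing the contour in the upper half-plane and summing the residues at $x = ia$, $ib$, $ic$ gives, for distinct positive $a$, $b$, $c$:

    $$\int_{-\infty}^{\infty}\frac{\cos x\,dx}{(x^{2}+a^{2})(x^{2}+b^{2})(x^{2}+c^{2})}
      = \pi\left[\frac{e^{-a}}{a\,(b^{2}-a^{2})(c^{2}-a^{2})}
      + \frac{e^{-b}}{b\,(a^{2}-b^{2})(c^{2}-b^{2})}
      + \frac{e^{-c}}{c\,(a^{2}-c^{2})(b^{2}-c^{2})}\right].$$

    A symbolic engine such as Maple reproduces this form, although, as the abstract notes, the raw symbolic output for related integrals can be much longer before simplification.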

  14. Boundary layer integral matrix procedure code modifications and verifications

    NASA Technical Reports Server (NTRS)

    Evans, R. M.; Morse, H. L.

    1974-01-01

    A summary of modifications to Aerotherm's Boundary Layer Integral Matrix Procedure (BLIMP) code is presented. These modifications represent a preliminary effort to make BLIMP compatible with other JANNAF codes and to adjust the code for specific application to rocket nozzle flows. Results of the initial verification of the code for prediction of rocket nozzle type flows are discussed. For those cases in which measured free stream flow conditions were used as input to the code, the boundary layer predictions and measurements are in excellent agreement. In two cases, with free stream flow conditions calculated by another JANNAF code (TDK) for use as input to BLIMP, the predictions and the data were in fair agreement for one case and in poor agreement for the other case. The poor agreement is believed to result from failure of the turbulent model in BLIMP to account for laminarization of a turbulent flow. Recommendations for further code modifications and improvements are also presented.

  15. A new integrated symmetrical table for genetic codes.

    PubMed

    Shu, Jian-Jun

    2017-01-01

    Degeneracy is a salient feature of genetic codes, because there are more codons than amino acids. The conventional table for genetic codes suffers from an inability of illustrating a symmetrical nature among genetic base codes. In fact, because the conventional wisdom avoids the question, there is little agreement as to whether the symmetrical nature actually even exists. A better understanding of symmetry and an appreciation for its essential role in the genetic code formation can improve our understanding of nature's coding processes. Thus, it is worth formulating a new integrated symmetrical table for genetic codes, which is presented in this paper. It could be very useful to understand the Nobel laureate Crick's wobble hypothesis - how one transfer ribonucleic acid can recognize two or more synonymous codons, which is an unsolved fundamental question in biological science.

  16. Progress in Advanced Spray Combustion Code Integration

    NASA Technical Reports Server (NTRS)

    Liang, Pak-Yan

    1993-01-01

    A multiyear project to assemble a robust, multiphase spray combustion code is now underway and gradually building up to full speed. The overall effort involves several university and government research teams as well as Rocketdyne. The first part of this paper will give an overview of the respective roles of the different participants involved, the master strategy, the evolutionary milestones, and an assessment of the state-of-the-art of various key components. The second half of this paper will highlight the progress made to date in extending the baseline Navier-Stokes solver to handle multiphase, multispecies, chemically reactive sub- to supersonic flows. The major hurdles to overcome in order to achieve significant speed-ups are delineated and the approaches to overcoming them will be discussed.

  17. Foundational development of an advanced nuclear reactor integrated safety code.

    SciTech Connect

    Clarno, Kevin; Lorber, Alfred Abraham; Pryor, Richard J.; Spotz, William F.; Schmidt, Rodney Cannon; Belcourt, Kenneth; Hooper, Russell Warren; Humphries, Larry LaRon

    2010-02-01

    This report describes the activities and results of a Sandia LDRD project whose objective was to develop and demonstrate foundational aspects of a next-generation nuclear reactor safety code that leverages advanced computational technology. The project scope was directed towards the systems-level modeling and simulation of an advanced, sodium cooled fast reactor, but the approach developed has a more general applicability. The major accomplishments of the LDRD are centered around the following two activities. (1) The development and testing of LIME, a Lightweight Integrating Multi-physics Environment for coupling codes that is designed to enable both 'legacy' and 'new' physics codes to be combined and strongly coupled using advanced nonlinear solution methods. (2) The development and initial demonstration of BRISC, a prototype next-generation nuclear reactor integrated safety code. BRISC leverages LIME to tightly couple the physics models in several different codes (written in a variety of languages) into one integrated package for simulating accident scenarios in a liquid sodium cooled 'burner' nuclear reactor. Other activities and accomplishments of the LDRD include (a) further development, application and demonstration of the 'non-linear elimination' strategy to enable physics codes that do not provide residuals to be incorporated into LIME, (b) significant extensions of the RIO CFD code capabilities, (c) complex 3D solid modeling and meshing of major fast reactor components and regions, and (d) an approach for multi-physics coupling across non-conformal mesh interfaces.

  18. CBP TOOLBOX VERSION 2.0: CODE INTEGRATION ENHANCEMENTS

    SciTech Connect

    Smith, F.; Flach, G.; BROWN, K.

    2013-06-01

    This report describes enhancements made to code integration aspects of the Cementitious Barriers Project (CBP) Toolbox as a result of development work performed at the Savannah River National Laboratory (SRNL) in collaboration with Vanderbilt University (VU) in the first half of fiscal year 2013. Code integration refers to the interfacing of standalone CBP partner codes, used to analyze the performance of cementitious materials, with the CBP Software Toolbox. The most significant enhancements are: 1) Improved graphical display of model results. 2) Improved error analysis and reporting. 3) Increase in the default maximum model mesh size from 301 to 501 nodes. 4) The ability to set the LeachXS/Orchestra simulation times through the GoldSim interface. These code interface enhancements have been included in a new release (Version 2.0) of the CBP Toolbox.

  19. Integrating Renewable Energy Requirements Into Building Energy Codes

    SciTech Connect

    Kaufmann, John R.; Hand, James R.; Halverson, Mark A.

    2011-07-01

    This report evaluates how and when to best integrate renewable energy requirements into building energy codes. The basic goals were to: (1) provide a rough guide of where we’re going and how to get there; (2) identify key issues that need to be considered, including a discussion of various options with pros and cons, to help inform code deliberations; and (3) to help foster alignment among energy code-development organizations. The authors researched current approaches nationally and internationally, conducted a survey of key stakeholders to solicit input on various approaches, and evaluated the key issues related to integration of renewable energy requirements and various options to address those issues. The report concludes with recommendations and a plan to engage stakeholders. This report does not evaluate whether the use of renewable energy should be required on buildings; that question involves a political decision that is beyond the scope of this report.

  20. Faculty and Academic Integrity: The Influence of Current Honor Codes and Past Honor Code Experiences.

    ERIC Educational Resources Information Center

    McCabe, Donald L.; Butterfield, Kenneth D.; Trevino, Linda Klebe

    2003-01-01

    Found that faculty at honor-code schools have more positive attitudes toward their schools' academic integrity policies and allow the system to take care of monitoring and disciplinary activities. Faculty in noncode institutions have less positive attitudes and are more likely to take personal actions designed to deal with cheaters. Faculty in…

  1. Second Generation Integrated Composite Analyzer (ICAN) Computer Code

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.; Ginty, Carol A.; Sanfeliz, Jose G.

    1993-01-01

    This manual updates the original 1986 NASA TP-2515, Integrated Composite Analyzer (ICAN) Users and Programmers Manual. The various enhancements and newly added features are described to enable the user to prepare the appropriate input data to run this updated version of the ICAN code. For reference, the micromechanics equations are provided in an appendix and should be compared to those in the original manual for modifications. A complete output for a sample case is also provided in a separate appendix. The input to the code includes constituent material properties, factors reflecting the fabrication process, and laminate configuration. The code performs micromechanics, macromechanics, and laminate analyses, including the hygrothermal response of polymer-matrix-based fiber composites. The output includes the various ply and composite properties, the composite structural response, and the composite stress analysis results with details on failure. The code is written in FORTRAN 77 and can be used efficiently as a self-contained package (or as a module) in complex structural analysis programs. The input-output format has changed considerably from the original version of ICAN and is described extensively through the use of a sample problem.

  2. Hydrodynamic Instability, Integrated Code, Laboratory Astrophysics, and Astrophysics

    NASA Astrophysics Data System (ADS)

    Takabe, Hideaki

    2016-10-01

    This is an article for the Edward Teller Medal memorial lecture, presented at the IFSA03 conference held on September 12, 2003, at Monterey, CA. The author focuses on his main contributions to fusion science and its extension to astrophysics in the field of theory and computation by picking up five topics. The first one is the anomalous resistivity to hot electrons penetrating the over-dense region through the ion wave turbulence driven by the return current compensating the current flow by the hot electrons. It is concluded that almost the same value of potential as the average kinetic energy of the hot electrons is realized to prevent the penetration of the hot electrons. The second is the ablative stabilization of the Rayleigh-Taylor instability at the ablation front and its dispersion relation, the so-called Takabe formula. This formula gave a principal guideline for stable target design. The author has developed an integrated code ILESTA (1D & 2D) for analyses and design of laser-produced plasmas including implosion dynamics. It is also applied to design high gain targets. The third is the development of the integrated code ILESTA. The fourth is on Laboratory Astrophysics with intense lasers. This consists of two parts; one is a review of its historical background and the other is on how we relate laser plasma to wide-ranging astrophysics and the purposes for promoting such research. In relation to one purpose, I give a comment on anomalous transport of relativistic electrons in the Fast Ignition laser fusion scheme. Finally, I briefly summarize recent activity in relation to application of the author's experience to the development of an integrated code for studying extreme phenomena in astrophysics.

  3. Hydrodynamic Instability, Integrated Code, Laboratory Astrophysics, and Astrophysics

    NASA Astrophysics Data System (ADS)

    Takabe, Hideaki

    This is an article for the Edward Teller Medal memorial lecture, presented at the IFSA03 conference held on September 12, 2003, at Monterey, CA. The author focuses on his main contributions to fusion science and its extension to astrophysics in the field of theory and computation by picking up five topics. The first one is the anomalous resistivity to hot electrons penetrating the over-dense region through the ion wave turbulence driven by the return current compensating the current flow by the hot electrons. It is concluded that almost the same value of potential as the average kinetic energy of the hot electrons is realized to prevent the penetration of the hot electrons. The second is the ablative stabilization of the Rayleigh-Taylor instability at the ablation front and its dispersion relation, the so-called Takabe formula. This formula gave a principal guideline for stable target design. The author has developed an integrated code ILESTA (1D & 2D) for analyses and design of laser-produced plasmas including implosion dynamics. It is also applied to design high gain targets. The third is the development of the integrated code ILESTA. The fourth is on Laboratory Astrophysics with intense lasers. This consists of two parts; one is a review of its historical background and the other is on how we relate laser plasma to wide-ranging astrophysics and the purposes for promoting such research. In relation to one purpose, I give a comment on anomalous transport of relativistic electrons in the Fast Ignition laser fusion scheme. Finally, I briefly summarize recent activity in relation to application of the author's experience to the development of an integrated code for studying extreme phenomena in astrophysics.

  4. Automated Code Engine for Graphical Processing Units: Application to the Effective Core Potential Integrals and Gradients.

    PubMed

    Song, Chenchen; Wang, Lee-Ping; Martínez, Todd J

    2016-01-12

    We present an automated code engine (ACE) that automatically generates optimized kernels for computing integrals in electronic structure theory on a given graphical processing unit (GPU) computing platform. The code generator in ACE creates multiple code variants with different memory and floating point operation trade-offs. A graph representation is created as the foundation of the code generation, which allows the code generator to be extended to various types of integrals. The code optimizer in ACE determines the optimal code variant and GPU configurations for a given GPU computing platform by scanning over all possible code candidates and then choosing the best-performing code candidate for each kernel. We apply ACE to the optimization of effective core potential integrals and gradients. It is observed that the best code candidate varies with differing angular momentum, floating point precision, and type of GPU being used, which shows that the ACE may be a powerful tool in adapting to fast evolving GPU architectures.
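
    The selection loop described above can be pictured with a small sketch. The Python below is a hypothetical illustration only: generate_variants, benchmark, and the kernel names are invented stand-ins for the ACE code generator and its GPU timing harness, not the actual interfaces.

    ```python
    # Hypothetical sketch of the "scan all candidates, keep the fastest" step
    # described above; none of these names come from the ACE software itself.
    import time
    from typing import Callable, Dict, List


    def generate_variants(kernel_name: str) -> List[Callable[[], int]]:
        """Stand-in for the code generator: return code variants of one kernel
        that trade memory use against floating-point recomputation."""
        def recompute_heavy() -> int:
            return sum(x * x for x in range(10_000))

        def memory_heavy() -> int:
            squares = [x * x for x in range(10_000)]
            return sum(squares)

        return [recompute_heavy, memory_heavy]


    def benchmark(variant: Callable[[], int], repeats: int = 5) -> float:
        """Average wall-clock time of one variant; a real harness would launch
        and synchronize the GPU kernel instead."""
        start = time.perf_counter()
        for _ in range(repeats):
            variant()
        return (time.perf_counter() - start) / repeats


    def select_best(kernels: List[str]) -> Dict[str, str]:
        """For each kernel, benchmark every generated variant and keep the fastest."""
        return {name: min(generate_variants(name), key=benchmark).__name__
                for name in kernels}


    if __name__ == "__main__":
        # Hypothetical kernel names; the winner may differ between platforms,
        # mirroring the platform dependence reported in the abstract.
        print(select_best(["ecp_integral_l0", "ecp_gradient_l1"]))
    ```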

  5. Committed to the Honor Code: An Investment Model Analysis of Academic Integrity

    ERIC Educational Resources Information Center

    Dix, Emily L.; Emery, Lydia F.; Le, Benjamin

    2014-01-01

    Educators worldwide face challenges surrounding academic integrity. The development of honor codes can promote academic integrity, but understanding how and why honor codes affect behavior is critical to their successful implementation. To date, research has not examined how students' "relationship" to an honor code predicts…

  6. Interfacing modules for integrating discipline specific structural mechanics codes

    NASA Technical Reports Server (NTRS)

    Endres, Ned M.

    1989-01-01

    An outline of the organization and capabilities of the Engine Structures Computational Simulator (Simulator) at NASA Lewis Research Center is given. One of the goals of the research at Lewis is to integrate various discipline specific structural mechanics codes into a software system which can be brought to bear effectively on a wide range of engineering problems. This system must possess the qualities of being effective and efficient while still remaining user friendly. The simulator was initially designed for the finite element simulation of gas jet engine components. Currently, the simulator has been restricted to only the analysis of high pressure turbine blades and the accompanying rotor assembly, although the current installation can be expanded for other applications. The simulator presently assists the user throughout its procedures by performing information management tasks, executing external support tasks, organizing analysis modules and executing these modules in the user defined order while maintaining processing continuity.

  7. Integration of the QMSFRG Database into the HZETRN Code

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Shavers, M. R.; Tripathi, R. K.; Wilson, J. W.

    2001-01-01

    Accurate nuclear interaction databases are needed for describing the transport of space radiation in matter including spacecraft structures, atmospheres, and tissues. Transport models support the identification and development of new material concepts for human and electronic part protection. Quantum effects are manifested in nuclear reactions in several ways including interference effects between terms in the multiple scattering series, the many-body nuclear wave functions (e.g., the roles of shell structure and Fermi momentum) and nuclear clustering. The quantum multiple scattering fragmentation model (QMSFRG) is a comprehensive model for generating nuclear interaction databases for galactic cosmic ray (GCR) transport. Other nuclear databases including the NUCFRG model and Monte-Carlo simulation codes such as FLUKA, LAHET, HETC, and GEANT ignore quantum effects. These codes fail to describe many important features of nuclear reactions and are thus inaccurate for the evaluation of materials for radiation protection. Previously we have shown that quantum effects are manifested through constructive interference in forward production spectra, the effects of Fermi momentum on production spectra, cluster nuclei knockout, and the nuclear response function. Quantum effects are especially important for heavy ions with mass numbers less than 20 that dominate radiation transport in human tissues and for the materials that are expected to be superior in space radiation protection. We describe the integration of the QMSFRG model into the HZETRN transport code. Integration milestones include proper treatment of odd-even charge-mass effects in nuclear fragmentation and the momentum distribution of nucleon production from GCR primary heavy ions. We have also modified the two-body amplitudes in the model to include nuclear medium effects. In order to include a comprehensive description of the GCR isotopic composition in materials, we have described the isotopic composition

  8. Code System to Calculate Integral Parameters with Reaction Rates from WIMS Output.

    SciTech Connect

    LESZCZYNSKI, FRANCISCO

    1994-10-25

    Version 00 REACTION calculates different integral parameters related to neutron reactions on reactor lattices, from reaction rates calculated with the WIMSD4 code, and compares them with experimental values.

  9. HCPCS Coding: An Integral Part of Your Reimbursement Strategy

    PubMed Central

    Nusgart, Marcia

    2013-01-01

    The first step to a successful reimbursement strategy is to ensure that your wound care product has the most appropriate Healthcare Common Procedure Coding System (HCPCS) code (or billing) for your product. The correct HCPCS code plays an essential role in patient access to new and existing technologies. When devising a strategy to obtain a HCPCS code for its product, companies must consider a number of factors as follows: (1) Has the product gone through the Food and Drug Administration (FDA) regulatory process or does it need to do so? Will the FDA code designation impact which HCPCS code will be assigned to your product? (2) In what “site of service” do you intend to market your product? Where will your customers use the product? Which coding system (CPT® or HCPCS) applies to your product? (3) Does a HCPCS code for a similar product already exist? Does your product fit under the existing HCPCS code? (4) Does your product need a new HCPCS code? What is the linkage, if any, between coding, payment, and coverage for the product? Researchers and companies need to start early and place the same emphasis on a reimbursement strategy as they do on a regulatory strategy. Your reimbursement strategy staff should be involved early in the process, preferably during product research and development and clinical trial discussions. PMID:24761331

  10. HCPCS Coding: An Integral Part of Your Reimbursement Strategy.

    PubMed

    Nusgart, Marcia

    2013-12-01

    The first step to a successful reimbursement strategy is to ensure that your wound care product has the most appropriate Healthcare Common Procedure Coding System (HCPCS) code (or billing) for your product. The correct HCPCS code plays an essential role in patient access to new and existing technologies. When devising a strategy to obtain a HCPCS code for its product, companies must consider a number of factors as follows: (1) Has the product gone through the Food and Drug Administration (FDA) regulatory process or does it need to do so? Will the FDA code designation impact which HCPCS code will be assigned to your product? (2) In what "site of service" do you intend to market your product? Where will your customers use the product? Which coding system (CPT(®) or HCPCS) applies to your product? (3) Does a HCPCS code for a similar product already exist? Does your product fit under the existing HCPCS code? (4) Does your product need a new HCPCS code? What is the linkage, if any, between coding, payment, and coverage for the product? Researchers and companies need to start early and place the same emphasis on a reimbursement strategy as they do on a regulatory strategy. Your reimbursement strategy staff should be involved early in the process, preferably during product research and development and clinical trial discussions.

  11. Pre-Service Teachers' Perception of Quick Response (QR) Code Integration in Classroom Activities

    ERIC Educational Resources Information Center

    Ali, Nagla; Santos, Ieda M.; Areepattamannil, Shaljan

    2017-01-01

    Quick Response (QR) codes have been discussed in the literature as adding value to teaching and learning. Despite their potential in education, more research is needed to inform practice and advance knowledge in this field. This paper investigated the integration of the QR code in classroom activities and the perceptions of the integration by…

  12. Boltzmann Transport Code Update: Parallelization and Integrated Design Updates

    NASA Technical Reports Server (NTRS)

    Heinbockel, J. H.; Nealy, J. E.; DeAngelis, G.; Feldman, G. A.; Chokshi, S.

    2003-01-01

    The ongoing efforts at developing a web site for radiation analysis are expected to result in an increased usage of the High Charge and Energy Transport Code HZETRN. It would be nice to be able to do the requested calculations quickly and efficiently. Therefore the question arose, "Could the implementation of parallel processing speed up the calculations required?" To answer this question two modifications of the HZETRN computer code were created. The first modification selected the shield material of Al(2219), then polyethylene and then Al(2219). The modified Fortran code was labeled 1SSTRN.F. The second modification considered the shield material of CO2 and Martian regolith. This modified Fortran code was labeled MARSTRN.F.

  13. Neutronic calculation of fast reactors by the EUCLID/V1 integrated code

    NASA Astrophysics Data System (ADS)

    Koltashev, D. A.; Stakhanova, A. A.

    2017-01-01

    This article considers the neutronic calculation of a fast-neutron lead-cooled reactor, BREST-OD-300, by the EUCLID/V1 integrated code. The main goal of the development and application of integrated codes is nuclear power plant safety justification. EUCLID/V1 is an integrated code designed for coupled neutronic, thermomechanical and thermohydraulic fast reactor calculations under normal and abnormal operating conditions. The EUCLID/V1 code is being developed in the Nuclear Safety Institute of the Russian Academy of Sciences. The integrated code has a modular structure and consists of three main modules: the thermohydraulic module HYDRA-IBRAE/LM/V1, the thermomechanical module BERKUT and the neutronic module DN3D. In addition, the integrated code includes databases with fuel, coolant and structural material properties. The neutronic module DN3D provides full-scale simulation of neutronic processes in fast reactors. Heat source distribution, control rod movement, reactivity level changes and other processes can be simulated. The neutron transport equation is solved in the multigroup diffusion approximation. This paper contains some calculations implemented as a part of EUCLID/V1 code validation. Transient simulations of the fast-neutron lead-cooled reactor BREST-OD-300 (fuel assembly floating, decompression of a passive feedback system channel) and cross-validation against MCU-FR code results are presented in this paper. The calculations demonstrate the application of the EUCLID/V1 code to BREST-OD-300 simulation and safety justification.
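
    For reference, the steady-state multigroup diffusion equation that a neutronic module of this kind solves can be written in the standard textbook form (generic notation, not transcribed from the EUCLID/V1 documentation)

    $$-\nabla\cdot D_g(\mathbf r)\nabla\phi_g(\mathbf r) + \Sigma_{r,g}(\mathbf r)\,\phi_g(\mathbf r)
      = \frac{\chi_g}{k_{\mathrm{eff}}}\sum_{g'}\nu\Sigma_{f,g'}(\mathbf r)\,\phi_{g'}(\mathbf r)
      + \sum_{g'\neq g}\Sigma_{s,g'\to g}(\mathbf r)\,\phi_{g'}(\mathbf r),\qquad g = 1,\dots,G,$$

    where $\phi_g$ is the group flux, $D_g$ the diffusion coefficient, $\Sigma_{r,g}$ the removal cross section, and the right-hand side collects the fission and in-scattering sources.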

  14. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    SciTech Connect

    Smith, L.M.; Hochstedler, R.D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
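
    One of the substitutions described above, replacing a linear search with a binary version, can be illustrated with a short sketch. The Python below is hypothetical (the actual changes were made in the ITS Fortran); the energy grid and function names are invented for the example.

    ```python
    # Illustrative sketch (not the ITS Fortran) of the kind of substitution
    # described above: replacing a linear scan of a sorted energy grid with a
    # binary search that returns the same bracketing bin.
    from bisect import bisect_right
    from typing import List


    def find_bin_linear(grid: List[float], energy: float) -> int:
        """Original style: walk the sorted grid until the bracketing bin is found."""
        i = 0
        while i < len(grid) - 2 and grid[i + 1] <= energy:
            i += 1
        return i


    def find_bin_binary(grid: List[float], energy: float) -> int:
        """Accelerated style: locate the same bracketing bin with a binary search."""
        return min(max(bisect_right(grid, energy) - 1, 0), len(grid) - 2)


    if __name__ == "__main__":
        grid = [1e-3 * 1.5 ** k for k in range(40)]   # hypothetical sorted energy grid (MeV)
        for e in (0.0005, 0.02, 5.0):
            assert find_bin_linear(grid, e) == find_bin_binary(grid, e)
    ```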

  15. SPACE code simulation of cold leg small break LOCA in the ATLAS integral test

    SciTech Connect

    Kim, B. J.; Kim, H. T.; Kim, J.; Kim, K. D.

    2012-07-01

    The SPACE code is a system analysis code for pressurized water reactors. This code uses a two-fluid and three-field model. For a few years, intensive validations have been performed to secure the prediction accuracy of models and correlations for two-phase flow and heat transfer. Recently, the code version 1.0 was released. This study examines how well the SPACE code predicts the thermal-hydraulic phenomena of an integral effect test. The target experiment is a cold leg small break LOCA in the ATLAS facility, which has the same two-loop features as APR1400. Predicted parameters were compared with experimental observations. (authors)

  16. A general multiblock Euler code for propulsion integration. Volume 3: User guide for the Euler code

    NASA Technical Reports Server (NTRS)

    Chen, H. C.; Su, T. Y.; Kao, T. J.

    1991-01-01

    This manual explains the procedures for using the general multiblock Euler (GMBE) code developed under NASA contract NAS1-18703. The code was developed for the aerodynamic analysis of geometrically complex configurations in either free air or wind tunnel environments (vol. 1). The complete flow field is divided into a number of topologically simple blocks within each of which surface fitted grids and efficient flow solution algorithms can easily be constructed. The multiblock field grid is generated with the BCON procedure described in volume 2. The GMBE utilizes a finite volume formulation with an explicit time stepping scheme to solve the Euler equations. A multiblock version of the multigrid method was developed to accelerate the convergence of the calculations. This user guide provides information on the GMBE code, including input data preparations with sample input files and a sample Unix script for program execution in the UNICOS environment.
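
    The finite volume formulation with explicit time stepping mentioned above follows the generic cell update (standard form, not transcribed from the GMBE manual)

    $$\mathbf U_i^{\,n+1} = \mathbf U_i^{\,n} - \frac{\Delta t}{V_i}\sum_{f\in\partial\Omega_i}\left(\mathbf F_f\cdot\mathbf n_f\right)A_f,$$

    where $\mathbf U_i$ is the vector of conserved variables averaged over cell $i$, $V_i$ the cell volume, and the sum runs over the numerical fluxes through the faces of that cell; the multiblock multigrid scheme is then used to accelerate the convergence of this iteration.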

  17. Talking about Code: Integrating Pedagogical Code Reviews into Early Computing Courses

    ERIC Educational Resources Information Center

    Hundhausen, Christopher D.; Agrawal, Anukrati; Agarwal, Pawan

    2013-01-01

    Given the increasing importance of soft skills in the computing profession, there is good reason to provide students with more opportunities to learn and practice those skills in undergraduate computing courses. Toward that end, we have developed an active learning approach for computing education called the "Pedagogical Code Review"…

  18. Deciphering the Code for Retroviral Integration Target Site Selection

    PubMed Central

    Santoni, Federico Andrea; Hartley, Oliver; Luban, Jeremy

    2010-01-01

    Upon cell invasion, retroviruses generate a DNA copy of their RNA genome and integrate retroviral cDNA within host chromosomal DNA. Integration occurs throughout the host cell genome, but target site selection is not random. Each subgroup of retrovirus is distinguished from the others by attraction to particular features on chromosomes. Despite extensive efforts to identify host factors that interact with retrovirion components or chromosome features predictive of integration, little is known about how integration sites are selected. We attempted to identify markers predictive of retroviral integration by exploiting Precision-Recall methods for extracting information from highly skewed datasets to derive robust and discriminating measures of association. ChIPSeq datasets for more than 60 factors were compared with 14 retroviral integration datasets. When compared with MLV, PERV or XMRV integration sites, strong association was observed with STAT1, acetylation of H3 and H4 at several positions, and methylation of H2AZ, H3K4, and K9. By combining peaks from ChIPSeq datasets, a supermarker was identified that localized within 2 kb of 75% of MLV proviruses and detected differences in integration preferences among different cell types. The supermarker predicted the likelihood of integration within specific chromosomal regions in a cell-type specific manner, yielding probabilities for integration into proto-oncogene LMO2 identical to experimentally determined values. The supermarker thus identifies chromosomal features highly favored for retroviral integration, provides clues to the mechanism by which retrovirus integration sites are selected, and offers a tool for predicting cell-type specific proto-oncogene activation by retroviruses. PMID:21124862
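
    The kind of association measure described above can be sketched as follows. This Python is a loose illustration with invented peak and integration-site coordinates; only the 2 kb window comes from the abstract, and it is not code from the study.

    ```python
    # Hypothetical sketch of the association measure described above: for one
    # chromatin mark, score how well its peaks "cover" a set of integration
    # sites within a fixed window (recall) and how many of its peaks lie near
    # a site (precision). All coordinates below are illustrative only.
    from bisect import bisect_left
    from typing import List, Tuple


    def near(sorted_positions: List[int], pos: int, window: int) -> bool:
        """True if any element of sorted_positions lies within +/- window of pos."""
        i = bisect_left(sorted_positions, pos - window)
        return i < len(sorted_positions) and sorted_positions[i] <= pos + window


    def precision_recall(peaks: List[int], sites: List[int], window: int = 2000) -> Tuple[float, float]:
        peaks, sites = sorted(peaks), sorted(sites)
        recall = sum(near(peaks, s, window) for s in sites) / len(sites)
        precision = sum(near(sites, p, window) for p in peaks) / len(peaks)
        return precision, recall


    if __name__ == "__main__":
        h3k4me3_peaks = [10_500, 52_000, 90_000, 140_000]   # illustrative coordinates
        mlv_sites = [11_200, 53_500, 300_000]
        print(precision_recall(h3k4me3_peaks, mlv_sites))   # -> (0.5, 0.666...)
    ```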

  19. Development of an integrated thermal-hydraulics capability incorporating RELAP5 and PANTHER neutronics code

    SciTech Connect

    Page, R.; Jones, J.R.

    1997-07-01

    Ensuring that safety analysis needs are met in the future is likely to lead to the development of new codes and the further development of existing codes. It is therefore advantageous to define standards for data interfaces and to develop software interfacing techniques which can readily accommodate changes when they are made. Defining interface standards is beneficial but is necessarily restricted in application if future requirements are not known in detail. Code interfacing methods are of particular relevance with the move towards automatic grid frequency response operation, where the integration of plant dynamic, core follow and fault study calculation tools is considered advantageous. This paper describes the background and features of a new code, TALINK (Transient Analysis code LINKage program), used to provide a flexible interface to link the RELAP5 thermal hydraulics code with the PANTHER neutron kinetics and SIBDYM whole plant dynamic modelling codes used by Nuclear Electric. The complete package enables the codes to be executed in parallel and provides an integrated whole plant thermal-hydraulics and neutron kinetics model. In addition, the paper discusses the capabilities and pedigree of the component codes used to form the integrated transient analysis package and the details of the calculation of a postulated Sizewell 'B' loss of offsite power fault transient.
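
    The integrated execution described above amounts to a lock-step exchange of data between solvers at each time step. The Python below is a schematic illustration with trivial stand-in models; it does not use the TALINK, RELAP5, or PANTHER interfaces, and all constants are invented.

    ```python
    # Schematic of the lock-step data exchange a linkage code performs each
    # time step; the two "solvers" are trivial stand-ins, not real plant codes.
    from dataclasses import dataclass


    @dataclass
    class ThermalHydraulicsStub:
        fuel_temp: float = 560.0          # K, illustrative initial condition

        def advance(self, dt: float, power: float) -> float:
            """Advance the coolant/fuel state one step for a given core power."""
            self.fuel_temp += dt * (power - 1.0) * 50.0      # toy heat balance
            return self.fuel_temp


    @dataclass
    class NeutronKineticsStub:
        power: float = 1.02               # normalized core power, slightly perturbed

        def advance(self, dt: float, fuel_temp: float) -> float:
            """Advance a point-kinetics-like state one step with Doppler-style feedback."""
            self.power *= 1.0 - dt * 1e-4 * (fuel_temp - 560.0)   # toy feedback
            return self.power


    if __name__ == "__main__":
        th, nk = ThermalHydraulicsStub(), NeutronKineticsStub()
        power, temp, dt = nk.power, th.fuel_temp, 0.1
        for _ in range(50):               # explicit, lock-step exchange every dt
            temp = th.advance(dt, power)  # thermal-hydraulics sees latest power
            power = nk.advance(dt, temp)  # kinetics sees latest fuel temperature
        print(f"t = {50 * dt:.1f} s, power = {power:.4f}, fuel T = {temp:.1f} K")
    ```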

  20. Experimental assessment of computer codes used for safety analysis of integral reactors

    SciTech Connect

    Falkov, A.A.; Kuul, V.S.; Samoilov, O.B.

    1995-09-01

    Peculiarities of integral reactor thermohydraulics in accidents are associated with the presence of noncondensable gas in the built-in pressurizer, the absence of a pumped ECCS, the use of a guard vessel for LOCA localisation, and passive RHRS through in-reactor HXs. These features defined the main trends in experimental investigations and in verification efforts for the computer codes applied. The paper briefly reviews the experimental investigations performed on the thermohydraulics of AST-500 and VPBER600-type integral reactors. The characteristics of the UROVEN/MB-3 code for LOCA analysis in integral reactors and the results of its verification are given. An assessment of the applicability of RELAP5/mod3 for accident analysis in integral reactors is presented.

  1. EMdeCODE: a novel algorithm capable of reading words of epigenetic code to predict enhancers and retroviral integration sites and to identify H3R2me1 as a distinctive mark of coding versus non-coding genes

    PubMed Central

    Santoni, Federico Andrea

    2013-01-01

    Existence of some extra-genetic (epigenetic) codes has been postulated since the discovery of the primary genetic code. Evident effects of histone post-translational modifications or DNA methylation over the efficiency and the regulation of DNA processes are supporting this postulation. EMdeCODE is an original algorithm that approximates the genomic distribution of given DNA features (e.g. promoter, enhancer, viral integration) by identifying relevant ChIPSeq profiles of post-translational histone marks or DNA binding proteins and combining them into a supermark. The EMdeCODE kernel is essentially a two-step procedure: (i) an expectation-maximization process calculates the mixture of epigenetic factors that maximize the Sensitivity (recall) of the association with the feature under study; (ii) the approximated density is then recursively trimmed with respect to a control dataset to increase the precision by reducing the number of false positives. EMdeCODE densities significantly improve the prediction of enhancer loci and retroviral integration sites with respect to previous methods. Importantly, it can also be used to extract distinctive factors between two arbitrary conditions. Indeed EMdeCODE identifies unexpected epigenetic profiles specific for coding versus non-coding RNA, pointing towards a new role for H3R2me1 in coding regions. PMID:23234700
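
    A loose sketch of the two-step idea (maximize recall, then trim against a control to recover precision) is given below. It is not the EMdeCODE algorithm itself, and the marks, bins, and thresholds are invented for illustration.

    ```python
    # Loose illustration of the two-step idea summarized above; this is NOT the
    # EMdeCODE algorithm, and all datasets here are hypothetical toy inputs.
    from typing import Dict, Set


    def recall(covered_bins: Set[int], feature_bins: Set[int]) -> float:
        return len(covered_bins & feature_bins) / len(feature_bins)


    def build_supermark(marks: Dict[str, Set[int]], feature: Set[int],
                        control: Set[int]) -> Set[int]:
        # Step 1: greedily add whole ChIPSeq profiles while recall keeps improving.
        combined: Set[int] = set()
        for name, bins in sorted(marks.items(),
                                 key=lambda kv: recall(kv[1], feature), reverse=True):
            if recall(combined | bins, feature) > recall(combined, feature):
                combined |= bins
        # Step 2: trim bins that overlap the control set, to reduce false positives.
        return {b for b in combined if b not in control}


    if __name__ == "__main__":
        marks = {"H3K4me3": {1, 2, 3, 8}, "H3R2me1": {2, 4, 9}, "STAT1": {7, 8}}
        feature = {1, 2, 4}            # bins containing the feature of interest
        control = {8, 9}               # bins from a matched control dataset
        print(sorted(build_supermark(marks, feature, control)))   # -> [1, 2, 3, 4]
    ```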

  2. Transport analysis in toroidal helical plasmas using the integrated code: TASK3D

    NASA Astrophysics Data System (ADS)

    Wakasa, A.; Fukuyama, A.; Murakami, S.; Beidler, C. D.; Maassberg, H.; Yokoyama, M.; Sato, M.

    2009-11-01

    The integrated simulation code for helical plasmas, TASK3D, is being developed on the basis of TASK, an integrated modeling code for tokamak plasmas. In helical systems, neoclassical transport is an important issue in addition to anomalous transport, because of the strong temperature dependence of the heat conductivity and its important role in determining the radial electric field. We have already constructed the neoclassical transport database for LHD, DGN/LHD. The mono-energetic diffusion coefficients are evaluated with the Monte Carlo method by the DCOM code, and the mono-energetic diffusion coefficient database is constructed using a neural network technique. We also apply the GSRAKE code, which solves the ripple-averaged drift kinetic equation, to obtain transport coefficients in the highly collisionless regime. We have newly incorporated the DGN/LHD module into TASK3D and will present several results of transport simulations in typical LHD plasmas.

  3. Final Report. An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group

    SciTech Connect

    Rosenthal, Andrew

    2013-12-30

    The DOE grant, “An Integrated Partnership to Create and Lead the Solar Codes and Standards Working Group,” to New Mexico State University created the Solar America Board for Codes and Standards (Solar ABCs). From 2007 – 2013 with funding from this grant, Solar ABCs identified current issues, established a dialogue among key stakeholders, and catalyzed appropriate activities to support the development of codes and standards that facilitated the installation of high quality, safe photovoltaic systems. Solar ABCs brought the following resources to the PV stakeholder community; Formal coordination in the planning or revision of interrelated codes and standards removing “stove pipes” that have only roofing experts working on roofing codes, PV experts on PV codes, fire enforcement experts working on fire codes, etc.; A conduit through which all interested stakeholders were able to see the steps being taken in the development or modification of codes and standards and participate directly in the processes; A central clearing house for new documents, standards, proposed standards, analytical studies, and recommendations of best practices available to the PV community; A forum of experts that invites and welcomes all interested parties into the process of performing studies, evaluating results, and building consensus on standards and code-related topics that affect all aspects of the market; and A biennial gap analysis to formally survey the PV community to identify needs that are unmet and inhibiting the market and necessary technical developments.

  4. Design and realization of the miniature long-life integrative coded sun sensor

    NASA Astrophysics Data System (ADS)

    Mo, Yanan; Cui, Jian; Zhao, Yuan; Chen, Ran; Liu, Xin

    2013-10-01

    This paper describes the research activity at the Beijing Institute of Control Engineering on the miniature long-life integrative coded sun sensor. The light system of the miniature coded sun sensor is composed of a semi-column silex glass, a silex cube with a coded pattern on the bottom, and an integrative silicon battery with 14 cells. The sun line forms a light spot on the coded plate through the slit in the light system. The sensor determines the orientation of the sun from the position of the light spot. Because the diameter of the coded plate is limited, an accuracy of only 0.5° can be realized with the 8-bit coarse code over a FOV of 124°. To achieve a high accuracy of 0.05°, a subdivision technique must be adopted. The main scheme of the miniature long-life integrative coded sun sensor is to integrate the light system and the signal processing circuits in one mechanical housing, use an FPGA to calculate the angle, generate the control signals for the multiplexer and the ADC, and realize the UART function, use a flexible board to connect the analog and digital boards, use the secondary power of the satellite, and use an RS422 interface to communicate with the central computer. The performance of the miniature long-life integrative coded sun sensor is listed below: FOV 124°x124°, accuracy 0.05° (3σ), resolution 14″, power consumption 0.5 W, update rate 40 Hz, mass 475 g, designed lifetime 15 years. It has been adopted in the new platform of Remote Sensing Satellite of CAST. The first flight will be in 2015.
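
    As a quick consistency check on the quoted figures (arithmetic added here, not taken from the paper): an 8-bit coarse code spanning the 124° FOV gives steps of about

    $$\frac{124^{\circ}}{2^{8}} \approx 0.48^{\circ},$$

    which matches the stated coarse accuracy of 0.5°, while the quoted resolution of 14″ (about 0.004°) implies that the subdivision step interpolates each coarse interval by roughly a factor of $0.48^{\circ}\times 3600 / 14 \approx 120$, i.e. about seven additional effective bits.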

  5. Nuclear Energy Advanced Modeling and Simulation (NEAMS) Waste Integrated Performance and Safety Codes (IPSC) : FY10 development and integration.

    SciTech Connect

    Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.; Dewers, Thomas A.; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Wang, Yifeng; Schultz, Peter Andrew

    2011-02-01

    This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.

  6. An Integrated Program Structure and System of Account Codes for PPBS in Local School Districts.

    ERIC Educational Resources Information Center

    Miller, Donald R.

    This monograph presents a comprehensive but tentative matrix program structure and system of account codes that have been integrated to facilitate the implementation of PPB systems in local school districts. It is based on the results of an extensive analysis of K-12 public school district programs and management practices. In its entirety, the…

  7. A long-term, integrated impact assessment of alternative building energy code scenarios in China

    SciTech Connect

    Yu, Sha; Eom, Jiyong; Evans, Meredydd; Clarke, Leon E.

    2014-04-01

    China is the second largest building energy user in the world, ranking first and third in residential and commercial energy consumption. Since the early 1980s, the Chinese government has developed a variety of building energy codes to improve building energy efficiency and reduce total energy demand. This paper studies the impact of building energy codes on energy use and CO2 emissions by using a detailed building energy model that represents four distinct climate zones, each with three building types, nested in the long-term integrated assessment framework GCAM. An advanced building stock module, coupled with the building energy model, is developed to reflect the characteristics of the future building stock and its interaction with the development of building energy codes in China. This paper also evaluates the impacts of building codes on building energy demand in the presence of an economy-wide carbon policy. We find that building energy codes would reduce Chinese building energy use by 13-22%, depending on the building code scenario, with a similar effect preserved even under the carbon policy. The impact of building energy codes shows regional and sectoral variation due to regionally differentiated responses of heating and cooling services to shell efficiency improvement.

  8. Users Guide to SAMINT: A Code for Nuclear Data Adjustment with SAMMY Based on Integral Experiments

    SciTech Connect

    Sobes, Vladimir; Leal, Luiz C.; Arbanas, Goran

    2014-10-01

    The purpose of this project is to couple differential and integral data evaluation in a continuous-energy framework. More specifically, the goal is to use the Generalized Linear Least Squares methodology employed in TSURFER to update the parameters of a resolved resonance region evaluation directly. Recognizing that the GLLS methodology in TSURFER is identical to the mathematical description of the simple Bayesian updating carried out in SAMMY, the computer code SAMINT was created to help use the mathematical machinery of SAMMY to update resolved resonance parameters based on integral data. Minimal modifications of SAMMY are required when used with SAMINT to make resonance parameter updates based on integral experimental data.
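    For orientation, the generalized linear least-squares (Bayesian) update that both TSURFER and SAMMY implement can be written in a generic notation (ours, not necessarily SAMINT's) as

    \[
    \mathbf{p}' = \mathbf{p}_0 + \mathbf{M}_0\mathbf{S}^{\mathsf T}\bigl(\mathbf{S}\mathbf{M}_0\mathbf{S}^{\mathsf T} + \mathbf{V}\bigr)^{-1}\bigl(\mathbf{d} - \mathbf{t}(\mathbf{p}_0)\bigr),
    \qquad
    \mathbf{M}' = \mathbf{M}_0 - \mathbf{M}_0\mathbf{S}^{\mathsf T}\bigl(\mathbf{S}\mathbf{M}_0\mathbf{S}^{\mathsf T} + \mathbf{V}\bigr)^{-1}\mathbf{S}\mathbf{M}_0,
    \]

    where \(\mathbf{p}_0\) and \(\mathbf{M}_0\) are the prior resonance parameters and their covariance, \(\mathbf{t}(\mathbf{p}_0)\) are the computed integral responses, \(\mathbf{S}\) their sensitivities to the parameters, and \(\mathbf{d}\), \(\mathbf{V}\) the measured integral responses and their covariance.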

  9. Semantic Integration and Age of Acquisition Effects in Code-Blend Comprehension

    PubMed Central

    Emmorey, Karen

    2016-01-01

    Semantic and lexical decision tasks were used to investigate the mechanisms underlying code-blend facilitation: the finding that hearing bimodal bilinguals comprehend signs in American Sign Language (ASL) and spoken English words more quickly when they are presented together simultaneously than when each is presented alone. More robust facilitation effects were observed for semantic decision than for lexical decision, suggesting that lexical integration of signs and words within a code-blend occurs primarily at the semantic level, rather than at the level of form. Early bilinguals exhibited greater facilitation effects than late bilinguals for English (the dominant language) in the semantic decision task, possibly because early bilinguals are better able to process early visual cues from ASL signs and use these to constrain English word recognition. Comprehension facilitation via semantic integration of words and signs is consistent with co-speech gesture research demonstrating facilitative effects of gesture integration on language comprehension. PMID:26657077

  10. An Integration of the Restructured Melcor for the Midas Computer Code

    SciTech Connect

    Sunhee Park; Dong Ha Kim; Ko-Ryu Kim; Song-Won Cho

    2006-07-01

    The need for a localized severe accident analysis code is growing. KAERI is developing a severe accident code called MIDAS, which is based on MELCOR. In order to develop the localized code (MIDAS), which simulates a severe accident in a nuclear power plant, the existing data structure is reconstructed for all the packages in MELCOR, which uses pointer variables for data transfer between the packages. During this process, new features in FORTRAN90 such as dynamic allocation are used for improved data saving and transfer. Hence the readability, maintainability and portability of the MIDAS code have been enhanced. After the package-wise restructuring, the newly converted packages are integrated together. Depending on the data usage in the package, two types of packages can be defined: some use their own data within the package (call them independent packages) and the others share their data with other packages (dependent packages). For the independent packages, the integration process simply links the already converted packages together; that is, the package-wise restructuring does not require further conversion of variables for the integration process. For the dependent packages, extra conversion is necessary to link them together. Because the package-wise restructuring converts only the corresponding package's variables, variables defined in other packages are not touched and remain as they are. These variables must be converted into the new variable types at the same time as the main variables in the corresponding package; the dependent packages are then ready for integration. In order to check whether the integration process is working well, the results from the integrated version are verified against the package-wise restructured results. Steady state runs and station blackout sequences are tested, and the major variables are found to agree. In order to verify the results, the integrated

  11. TOPICAL REVIEW: The CRONOS suite of codes for integrated tokamak modelling

    NASA Astrophysics Data System (ADS)

    Artaud, J. F.; Basiuk, V.; Imbeaux, F.; Schneider, M.; Garcia, J.; Giruzzi, G.; Huynh, P.; Aniel, T.; Albajar, F.; Ané, J. M.; Bécoulet, A.; Bourdelle, C.; Casati, A.; Colas, L.; Decker, J.; Dumont, R.; Eriksson, L. G.; Garbet, X.; Guirlet, R.; Hertout, P.; Hoang, G. T.; Houlberg, W.; Huysmans, G.; Joffrin, E.; Kim, S. H.; Köchl, F.; Lister, J.; Litaudon, X.; Maget, P.; Masset, R.; Pégourié, B.; Peysson, Y.; Thomas, P.; Tsitrone, E.; Turco, F.

    2010-04-01

    CRONOS is a suite of numerical codes for the predictive/interpretative simulation of a full tokamak discharge. It integrates, in a modular structure, a 1D transport solver with general 2D magnetic equilibria, several heat, particle and impurities transport models, as well as heat, particle and momentum sources. This paper gives a first comprehensive description of the CRONOS suite: overall structure of the code, main available models, details on the simulation workflow and numerical implementation. Some examples of applications to the analysis of experimental discharges and the predictions of ITER scenarios are also given.

  12. Integration of a supersonic unsteady aerodynamic code into the NASA FASTEX system

    NASA Technical Reports Server (NTRS)

    Appa, Kari; Smith, Michael J. C.

    1987-01-01

    A supersonic unsteady aerodynamic loads prediction method based on the constant pressure method was integrated into the NASA FASTEX system. The updated FASTEX code can be employed for aeroelastic analyses in subsonic and supersonic flow regimes. A brief description of the supersonic constant pressure panel method, as applied to lifting surfaces and body configurations, is followed by a documentation of updates required to incorporate this method in the FASTEX code. Test cases showing correlations of predicted pressure distributions, flutter solutions, and stability derivatives with available data are reported.

  13. Honor Codes and Other Contextual Influences on Academic Integrity: A Replication and Extension to Modified Honor Code Settings.

    ERIC Educational Resources Information Center

    McCabe, Donald L.; Trevino, Linda Klebe; Butterfield, Kenneth D.

    2002-01-01

    Investigated the influence of modified honor codes, an alternative to traditional codes that is gaining popularity on larger campuses. Also tested the model of student academic dishonesty previously suggested by McCabe and Trevino. Found that modified honor codes are associated with lower levels of student dishonesty and that the McCabe Trevino…

  14. The Integral PWR SIR Transients: Comparisons Between CATHARE and RELAP Codes

    SciTech Connect

    Pignatel, Jean-Francois

    2002-07-01

    Within the framework of the research program on innovative light water reactors, the SERI (Service of Studies on Innovative Reactors) of the French Atomic Energy Commission (CEA) presents a predictive study on the modeling of a low-power integral Pressurized Water Reactor, using the CATHARE thermal-hydraulic code. The concept selected for this study is the SIR reactor project, developed by the AEA-T and ABB consortium. This concept is arguably the most complete to date and the one for which the most information is available in the literature. Many safety calculations made with the RELAP code are also available and represent a highly useful basis for comparison, helping to put the results obtained with CATHARE in perspective. A comparison of the behavior of the two codes is thus presented in this article. This study shows that CATHARE can model this type of new PWR concept in detail. The transients studied cover a large area, ranging from natural circulation to loss of primary coolant accidents; ATWS and a power transient have also been calculated. The comparison between the CATHARE and RELAP results shows very good agreement between the two codes and leads to a very positive conclusion on the pertinence of simulating an integral PWR. Moreover, even though this study is a thorough investigation of the subject, it confirms the potentially safe nature of the SIR reactor. (author)

  15. Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC) : gap analysis for high fidelity and performance assessment code development.

    SciTech Connect

    Lee, Joon H.; Siegel, Malcolm Dean; Arguello, Jose Guadalupe, Jr.; Webb, Stephen Walter; Dewers, Thomas A.; Mariner, Paul E.; Edwards, Harold Carter; Fuller, Timothy J.; Freeze, Geoffrey A.; Jove-Colon, Carlos F.; Wang, Yifeng

    2011-03-01

    This report describes a gap analysis performed in the process of developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with rigorous verification, validation, and software quality requirements. The gap analyses documented in this report were performed during an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC, and during follow-on activities that delved into more detailed assessments of the various codes that were acquired, studied, and tested. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. The gap analysis indicates that significant capabilities may already exist in current THC codes, although no single code is able to fully account for all physical and chemical processes involved in a waste disposal system. Large gaps exist in modeling chemical processes and their couplings with other processes. The coupling of chemical processes with flow, transport and mechanical deformation remains challenging. The data for extreme environments (e.g., for elevated temperature and high ionic strength media) that are

  16. Applicability of GALE-86 Codes to Integral Pressurized Water Reactor designs

    SciTech Connect

    Geelhood, Kenneth J.; Rishel, Jeremy P.

    2012-06-01

    This report describes work that Pacific Northwest National Laboratory is doing to assist the U.S. Nuclear Regulatory Commission (NRC) Office of New Reactors (NRO) staff in their reviews of applications for nuclear power plants using new reactor core designs. These designs include small integral PWRs (IRIS, mPower, and NuScale reactor designs), HTGRs (pebble-bed and prismatic-block modular reactor designs), and SFRs (4S and PRISM reactor designs). Under this specific task, PNNL will assist the NRC staff in reviewing the current versions of the GALE codes, identify features and limitations that would need to be modified to accommodate the technical review of iPWR and mPower® license applications, and recommend specific changes to the code, NUREG-0017, and associated NRC guidance. This contract is necessary to support the licensing of iPWRs, with a near-term focus on the B&W mPower® reactor design. While the focus of this review is on the mPower® reactor design, the review of the code and the scope of recommended changes consider a revision of the GALE codes that would make them universally applicable to other types of integral PWR designs. The results of a detailed comparison between PWR and iPWR designs are reported here. Also included is an investigation of the GALE code and its basis, and a determination as to the applicability of each of the bases to an iPWR design. The issues investigated come from a list provided by NRC staff, the results of comparing the PWR and iPWR designs, the parameters identified as having a large impact on the code outputs in a recent sensitivity study, and the main bases identified in NUREG-0017. This report provides a summary of the gaps in the GALE codes as they relate to iPWR designs and, for each gap, proposes what work could be performed to fill that gap and create a version of GALE that is applicable to integral PWR designs.

  17. Integrated TIGER Series of Coupled Electron/Photon Monte Carlo Transport Codes System.

    SciTech Connect

    VALDEZ, GREG D.

    2012-11-30

    Version: 00 Distribution is restricted to US Government Agencies and Their Contractors Only. The Integrated Tiger Series (ITS) is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. The goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes, and (2) conversion to Fortran 95. The general user friendliness of the software has been enhanced through memory allocation to reduce the need for users to modify and recompile the code.

  18. Leap frog integrator modifications in highly collisional particle-in-cell codes

    NASA Astrophysics Data System (ADS)

    Hanzlikova, N.; Turner, M. M.

    2014-07-01

    The leap-frog integration method is a standard, simple, fast, and accurate way to implement velocity and position integration in particle-in-cell codes. Because the particle-in-cell procedure solves the particle kinetics directly in phase space, important information can be obtained on particle velocity distributions, and consequently on transport and heating processes. This approach is commonly associated with physical situations where collisional effects are weak, but it can also be applied profitably in some highly collisional cases, such as occur in semiconductor devices and gaseous discharges at atmospheric pressure. In this paper, we show that the implementation of the leap-frog integration method in these circumstances can violate some of the assumptions central to the accuracy of the scheme. Indeed, without adaptation, the method gives incorrect results. We show here how the method must be modified to deal correctly with highly collisional cases.
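    For reference, a minimal Python sketch of the standard collisionless leap-frog push that the paper takes as its starting point (generic notation, not tied to any particular PIC code; the field evaluation is stubbed out):

```python
import numpy as np

def leapfrog_push(x, v_half, E_at, qm, dt, steps):
    """Standard collisionless leap-frog particle push.

    x      : particle positions at time t
    v_half : velocities staggered by half a step (at t - dt/2)
    E_at   : callable returning the electric field at given positions
    qm     : charge-to-mass ratio

    The paper's point is that inserting frequent collisions between these
    two updates breaks the time-centering assumed here unless the scheme
    is modified.
    """
    for _ in range(steps):
        v_half = v_half + qm * E_at(x) * dt   # v: t - dt/2 -> t + dt/2
        x = x + v_half * dt                   # x: t -> t + dt
    return x, v_half

# Tiny example: one particle in a uniform unit field.
x, v = leapfrog_push(np.array([0.0]), np.array([0.0]),
                     lambda xs: np.full_like(xs, 1.0), qm=1.0, dt=0.1, steps=10)
print(x, v)
```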

  19. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC).

    SciTech Connect

    Schultz, Peter Andrew

    2011-12-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. Achieving the objective of modeling the performance of a disposal scenario requires describing processes involved in waste form degradation and radionuclide release at the subcontinuum scale, beginning with mechanistic descriptions of chemical reactions and chemical kinetics at the atomic scale, and upscaling into effective, validated constitutive models for input to high-fidelity continuum-scale codes for coupled multiphysics simulations of release and transport. Verification and validation (V&V) is required throughout the system to establish evidence-based metrics for the level of confidence in M&S codes and capabilities, including at the subcontinuum scale and for the constitutive models they inform or generate. This report outlines the nature of the V&V challenge at the subcontinuum scale, an approach to incorporate V&V concepts into subcontinuum scale modeling and simulation (M&S), and a plan to incrementally incorporate effective V&V into subcontinuum scale M&S destined for use in the NEAMS Waste IPSC work flow to meet requirements of quantitative confidence in the constitutive models informed by subcontinuum scale phenomena.

  20. Simulation of Jet Noise with OVERFLOW CFD Code and Kirchhoff Surface Integral

    NASA Technical Reports Server (NTRS)

    Kandula, M.; Caimi, R.; Voska, N. (Technical Monitor)

    2002-01-01

    An acoustic prediction capability for supersonic axisymmetric jets was developed on the basis of the OVERFLOW Navier-Stokes CFD (Computational Fluid Dynamics) code of NASA Langley Research Center. Reynolds-averaged turbulent stresses in the flow field are modeled with the aid of the Spalart-Allmaras one-equation turbulence model. Appropriate acoustic and outflow boundary conditions were implemented to compute the time-dependent acoustic pressure in the nonlinear source field. Based on the specification of the acoustic pressure and its temporal and normal derivatives on the Kirchhoff surface, the near-field and far-field sound pressure levels are computed via the Kirchhoff surface integral, with the Kirchhoff surface chosen to enclose the nonlinear sound source region described by the CFD code. The methods are validated by comparing the predicted sound pressure levels with available data for an axisymmetric turbulent supersonic (Mach 2) perfectly expanded jet.
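    For context, the far-field pressure follows from the surface data via the classical Kirchhoff integral for a stationary control surface, written here in a generic form rather than the exact discretization used in the study:

    \[
    p(\mathbf{x},t)=\frac{1}{4\pi}\oint_{S}\left[\frac{p}{r^{2}}\frac{\partial r}{\partial n}
    -\frac{1}{r}\frac{\partial p}{\partial n}
    +\frac{1}{c\,r}\frac{\partial r}{\partial n}\frac{\partial p}{\partial \tau}\right]_{\tau=t-r/c} dS,
    \]

    where \(r\) is the distance from the surface element to the observer, \(n\) the outward normal, \(c\) the ambient sound speed, and the bracketed quantities are evaluated at the retarded time \(\tau = t - r/c\).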

  1. IPACS (Integrated Probabilistic Assessment of Composite Structures): Code development and applications

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Shiao, Michael C.

    1993-01-01

    A methodology and attendant computer code have been developed and are described to computationally simulate the uncertain behavior of composite structures. The uncertain behavior includes buckling loads, stress concentration factors, displacements, stress/strain etc., which are the consequences of the inherent uncertainties (scatter) in the primitive (independent random) variables (constituent, ply, laminate and structural) that describe the composite structures. The computer code, IPACS (Integrated Probabilistic Assessment of Composite Structures), can handle both composite mechanics and composite structures. Application to probabilistic composite mechanics is illustrated by its uses to evaluate the uncertainties in the major Poisson's ratio and in laminate stiffness and strength. IPACS application to probabilistic structural analysis is illustrated by its use to evaluate the uncertainties in the buckling of a composite plate, in the stress concentration factor in a composite panel and in the vertical displacement and ply stress in a composite aircraft wing segment.

  2. Simulation of Supersonic Jet Noise with the Adaptation of Overflow CFD Code and Kirchhoff Surface Integral

    NASA Technical Reports Server (NTRS)

    Kandula, Max; Caimi, Raoul; Steinrock, T. (Technical Monitor)

    2001-01-01

    An acoustic prediction capability for supersonic axisymmetric jets was developed on the basis of the OVERFLOW Navier-Stokes CFD (Computational Fluid Dynamics) code of NASA Langley Research Center. Reynolds-averaged turbulent stresses in the flow field are modeled with the aid of the Spalart-Allmaras one-equation turbulence model. Appropriate acoustic and outflow boundary conditions were implemented to compute the time-dependent acoustic pressure in the nonlinear source field. Based on the specification of the acoustic pressure and its temporal and normal derivatives on the Kirchhoff surface, the near-field and far-field sound pressure levels are computed via the Kirchhoff surface integral, with the Kirchhoff surface chosen to enclose the nonlinear sound source region described by the CFD code. The methods are validated by comparing the predicted sound pressure levels with available data for an axisymmetric turbulent supersonic (Mach 2) perfectly expanded jet.

  3. Extension of the Integrated Tiger Series (ITS) of electron-photon Monte Carlo codes to 100 GeV

    SciTech Connect

    Miller, S.G.

    1988-08-01

    Version 2.1 of the Integrated Tiger Series (ITS) of electron-photon Monte Carlo codes was modified to extend their ability to model interactions up to 100 GeV. Benchmarks against experimental results conducted at 10 and 15 GeV confirm the accuracy of the extended codes. 12 refs., 2 figs., 2 tabs.

  4. A first-order integral method developed for the VARIANT code

    SciTech Connect

    Smith, M. A.; Lewis, E. E.; Palmiotti, G.; Yang, W. S.

    2006-07-01

    A first order nodal integral method using spherical harmonic interface conditions is formulated and implemented into VARIANT [1-3], a variational nodal transport code developed at Argonne National Laboratory. The spatial domain is split into hybrid finite elements, called nodes, where orthogonal polynomial spatial trial functions are used within each node and spatial Lagrange multipliers are used along the node boundaries. The internal angular approximation is weighted with a complete odd-order spherical harmonics set and numerically integrated using a standard angular quadrature. Along the nodal boundaries, even-order Rumyantsev interface conditions are combined with the spatial Lagrange multipliers to couple the nodes together. The new method is implemented in Cartesian geometry and used to solve a fixed source two-dimensional benchmark problem. (authors)

  5. Fitting Prompt Fission Neutron Spectra Using Kalman Filter Integrated with Empire Code

    NASA Astrophysics Data System (ADS)

    Nobre, G. P. A.; Herman, M.; Hoblit, S.; Palumbo, A.; Capote, R.; Trkov, A.

    2014-04-01

    Prompt fission neutron spectra (PFNS) have proven to have a significant effect on the criticality of selected benchmarks, in some cases as important as cross sections. Therefore, a precise determination of uncertainties in PFNS is desired. Existing PFNS evaluations in nuclear data libraries have so far relied almost exclusively on the Los Alamos model. However, deviations of the evaluated data from available experiments have been noticed at both low and high neutron emission energies. New experimental measurements of PFNS have recently been published, demanding new evaluations. The present work describes the effort of integrating the Kalman and EMPIRE codes in such a way as to allow parameter fitting of PFNS models. The first results are shown for the major actinides for two different PFNS models (Kornilov and Los Alamos). This represents the first step towards re-evaluation of both cross-section and fission spectrum data considering both microscopic and integral experimental data for the major actinides.

  6. An integrated, structure- and energy-based view of the genetic code

    PubMed Central

    Grosjean, Henri; Westhof, Eric

    2016-01-01

    The principles of mRNA decoding are conserved among all extant life forms. We present an integrative view of all the interaction networks between mRNA, tRNA and rRNA: the intrinsic stability of codon–anticodon duplex, the conformation of the anticodon hairpin, the presence of modified nucleotides, the occurrence of non-Watson–Crick pairs in the codon–anticodon helix and the interactions with bases of rRNA at the A-site decoding site. We derive a more information-rich, alternative representation of the genetic code, that is circular with an unsymmetrical distribution of codons leading to a clear segregation between GC-rich 4-codon boxes and AU-rich 2:2-codon and 3:1-codon boxes. All tRNA sequence variations can be visualized, within an internal structural and energy framework, for each organism, and each anticodon of the sense codons. The multiplicity and complexity of nucleotide modifications at positions 34 and 37 of the anticodon loop segregate meaningfully, and correlate well with the necessity to stabilize AU-rich codon–anticodon pairs and to avoid miscoding in split codon boxes. The evolution and expansion of the genetic code is viewed as being originally based on GC content with progressive introduction of A/U together with tRNA modifications. The representation we present should help the engineering of the genetic code to include non-natural amino acids. PMID:27448410

  7. Embedded Systems Hardware Integration and Code Development for Maraia Capsule and E-MIST

    NASA Technical Reports Server (NTRS)

    Carretero, Emmanuel S.

    2015-01-01

    The cost of sending large spacecraft to orbit makes them undesirable for carrying out smaller scientific missions. Small spacecraft are more economical and can be tailored for missions where specific tasks need to be carried out; the Maraia capsule is such a spacecraft. Maraia will allow samples from experiments conducted on the International Space Station to be returned to Earth. The use of balloons to conduct experiments at the edge of space is a practical approach to reducing the large expense of using rockets. E-MIST is a payload designed to fly on a high altitude balloon; it can maintain science experiments in a controlled manner at the edge of space. The work covered here entails the integration of hardware onto each of the mentioned systems and the code associated with that work. In particular, the resistance temperature detector, pressure transducers, cameras, and thrusters for Maraia are discussed, as is the integration of the resistance temperature detectors and motor controllers into E-MIST. Several issues associated with sensor accuracy, code lock-up, and in-flight resets are mentioned, and the solutions and proposed solutions to these issues are explained.

  8. Integrating environmental goals into urban redevelopment schemes: lessons from the Code River, Yogyakarta, Indonesia.

    PubMed

    Setiawan, B B

    2002-01-01

    The settlement along the bank of the Code River in Yogyakarta, Indonesia provides housing for a large mass of the city's poor. Its strategic location and the fact that most of the urban poor do not have access to land attract people to "illegally" settle along the bank of the river. This brings negative consequences for the environment, particularly increasing domestic waste along the river and annual flooding in the rainy season. While the public controversies regarding the existence of the settlement along the Code River remained unresolved, at the end of the 1980s a group of architects, academics and community members proposed the idea of constructing a dike along the river as part of a broader settlement improvement program. From 1991 to 1998, thousands of local people mobilized their resources and were able to construct 6,000 metres of riverside dike along the Code River. The construction of the riverside dike has become an important "stimulant" that generated not only settlement improvement but also better treatment of the river water. As all housing units located along the river now face it, the river itself is considered the "front-yard". Before the dike was constructed, the inhabitants treated the river as the "backyard" and simply threw waste into it. They now genuinely want a cleaner river, since the river is an important part of their settlement. The settlement along the Code River presents a complex range of persistent problems with informal settlements in Indonesia; such problems are related to the issues of how to provide more affordable and adequate housing for the poor while, at the same time, improving the water quality of the river. The project represents a good case showing that, through a mutual partnership among stakeholders, it is possible to integrate environmental goals into urban redevelopment schemes.

  9. Research Integrity and Research Ethics in Professional Codes of Ethics: Survey of Terminology Used by Professional Organizations across Research Disciplines

    PubMed Central

    Komić, Dubravka; Marušić, Stjepan Ljudevit; Marušić, Ana

    2015-01-01

    Professional codes of ethics are social contracts among members of a professional group, which aim to instigate, encourage and nurture ethical behaviour and prevent professional misconduct, including research and publication. Despite the existence of codes of ethics, research misconduct remains a serious problem. A survey of codes of ethics from 795 professional organizations from the Illinois Institute of Technology’s Codes of Ethics Collection showed that 182 of them (23%) used research integrity and research ethics terminology in their codes, with differences across disciplines: while the terminology was common in professional organizations in social sciences (82%), mental health (71%), sciences (61%), other organizations had no statements (construction trades, fraternal social organizations, real estate) or a few of them (management, media, engineering). A subsample of 158 professional organizations we judged to be directly involved in research significantly more often had statements on research integrity/ethics terminology than the whole sample: an average of 10.4% of organizations with a statement (95% CI = 10.4-23.5%) on any of the 27 research integrity/ethics terms compared to 3.3% (95% CI = 2.1-4.6%), respectively (P<0.001). Overall, 62% of all statements addressing research integrity/ethics concepts used prescriptive language in describing the standard of practice. Professional organizations should define research integrity and research ethics issues in their ethics codes and collaborate within and across disciplines to adequately address responsible conduct of research and meet contemporary needs of their communities. PMID:26192805

  10. Research Integrity and Research Ethics in Professional Codes of Ethics: Survey of Terminology Used by Professional Organizations across Research Disciplines.

    PubMed

    Komić, Dubravka; Marušić, Stjepan Ljudevit; Marušić, Ana

    2015-01-01

    Professional codes of ethics are social contracts among members of a professional group, which aim to instigate, encourage and nurture ethical behaviour and prevent professional misconduct, including research and publication. Despite the existence of codes of ethics, research misconduct remains a serious problem. A survey of codes of ethics from 795 professional organizations from the Illinois Institute of Technology's Codes of Ethics Collection showed that 182 of them (23%) used research integrity and research ethics terminology in their codes, with differences across disciplines: while the terminology was common in professional organizations in social sciences (82%), mental health (71%), sciences (61%), other organizations had no statements (construction trades, fraternal social organizations, real estate) or a few of them (management, media, engineering). A subsample of 158 professional organizations we judged to be directly involved in research significantly more often had statements on research integrity/ethics terminology than the whole sample: an average of 10.4% of organizations with a statement (95% CI = 10.4-23.5%) on any of the 27 research integrity/ethics terms compared to 3.3% (95% CI = 2.1-4.6%), respectively (P<0.001). Overall, 62% of all statements addressing research integrity/ethics concepts used prescriptive language in describing the standard of practice. Professional organizations should define research integrity and research ethics issues in their ethics codes and collaborate within and across disciplines to adequately address responsible conduct of research and meet contemporary needs of their communities.

  11. Slow Temporal Integration Enables Robust Neural Coding and Perception of a Cue to Sound Source Location

    PubMed Central

    Tollin, Daniel J.

    2016-01-01

    In mammals, localization of sound sources in azimuth depends on sensitivity to interaural differences in sound timing (ITD) and level (ILD). Paradoxically, while typical ILD-sensitive neurons of the auditory brainstem require millisecond synchrony of excitatory and inhibitory inputs for the encoding of ILDs, human and animal behavioral ILD sensitivity is robust to temporal stimulus degradations (e.g., interaural decorrelation due to reverberation), or, in humans, bilateral clinical device processing. Here we demonstrate that behavioral ILD sensitivity is only modestly degraded with even complete decorrelation of left- and right-ear signals, suggesting the existence of a highly integrative ILD-coding mechanism. Correspondingly, we find that a majority of auditory midbrain neurons in the central nucleus of the inferior colliculus (of chinchilla) effectively encode ILDs despite complete decorrelation of left- and right-ear signals. We show that such responses can be accounted for by relatively long windows of bilateral excitatory-inhibitory interaction, which we explicitly measure using trains of narrowband clicks. Neural and behavioral data are compared with the outputs of a simple model of ILD processing with a single free parameter, the duration of excitatory-inhibitory interaction. Behavioral, neural, and modeling data collectively suggest that ILD sensitivity depends on binaural integration of excitation and inhibition within a ≳3 ms temporal window, significantly longer than observed in lower brainstem neurons. This relatively slow integration potentiates a unique role for the ILD system in spatial hearing that may be of particular importance when informative ITD cues are unavailable. SIGNIFICANCE STATEMENT In mammalian hearing, interaural differences in the timing (ITD) and level (ILD) of impinging sounds carry critical information about source location. However, natural sounds are often decorrelated between the ears by reverberation and background noise

  12. Segregated and integrated coding of reward and punishment in the cingulate cortex.

    PubMed

    Fujiwara, Juri; Tobler, Philippe N; Taira, Masato; Iijima, Toshio; Tsutsui, Ken-Ichiro

    2009-06-01

    Reward and punishment have opposite affective value but are both processed by the cingulate cortex. However, it is unclear whether the positive and negative affective values of monetary reward and punishment are processed by separate or common subregions of the cingulate cortex. We performed a functional magnetic resonance imaging study using a free-choice task and compared cingulate activations for different levels of monetary gain and loss. Gain-specific activation (increasing activation for increasing gain, but no activation change in relation to loss) occurred mainly in the anterior part of the anterior cingulate and in the posterior cingulate cortex. Conversely, loss-specific activation (increasing activation for increasing loss, but no activation change in relation to gain) occurred between these areas, in the middle and posterior part of the anterior cingulate. Integrated coding of gain and loss (increasing activation throughout the full range, from biggest loss to biggest gain) occurred in the dorsal part of the anterior cingulate, at the border with the medial prefrontal cortex. Finally, unspecific activation increases to both gains and losses (increasing activation to increasing gains and increasing losses, possibly reflecting attention) occurred in dorsal and middle regions of the cingulate cortex. Together, these results suggest separate and common coding of monetary reward and punishment in distinct subregions of the cingulate cortex. Further meta-analysis suggested that the presently found reward- and punishment-specific areas overlapped with those processing positive and negative emotions, respectively.

  13. Integration Of SIMS Into A General Purpose IBA Data Analysis Code

    SciTech Connect

    Barradas, N. P.; Alves, E.; Alves, L. C.; Likonen, J.; Hakola, A.; Coad, P.; Widdowson, A.

    2011-06-01

    IBA techniques such as RBS, ERDA, NRA, or PIXE are highly complementary and are often combined to maximize the extracted information. In particular, they have different sensitivities to various elements and probe different depth scales. The same is true for secondary ion mass spectrometry (SIMS), which can have much better detection limits for many species. Quantification of SIMS data normally requires careful calibration of the exact system being studied, and often the results are only semi-quantitative. Nevertheless, when SIMS is used together with other IBA techniques, it is highly desirable to integrate the data analysis. We developed a routine to analyse SIMS data and implemented it in NDF, a standard IBA data analysis code that already supported RBS, ERDA, resonant and non-resonant NRA, and PIXE. Details of this new routine are presented in this work.

  14. Integration of the olfactory code across dendritic claws of single mushroom body neurons

    PubMed Central

    Gruntman, Eyal; Turner, Glenn C.

    2013-01-01

    In the olfactory system, sensory inputs are arranged in different glomerular channels, which respond in combinatorial ensembles to the various chemical features of an odor. Here we investigate where and how this combinatorial code is read out deeper in the brain. We exploit the unique morphology of neurons in the mushroom body (MB), which receive input on large dendritic claws. Imaging odor responses of these dendritic claws shows that input channels with distinct odor tuning converge on individual MB neurons. We determined how these inputs interact to drive the cell to spike threshold using intracellular recordings to examine MB responses to optogenetically controlled input. Our results provide an elegant explanation for the characteristic selectivity of MB neurons: these cells receive different types of input, and require those inputs to be coactive in order to spike. These results establish the MB as an important site of integration in the fly olfactory system. PMID:24141312

  15. Simulation study of HL-2A-like plasma using integrated predictive modeling code

    SciTech Connect

    Poolyarat, N.; Onjun, T.; Promping, J.

    2009-11-15

    Self-consistent simulations of HL-2A-like plasma are carried out using the 1.5D BALDUR integrated predictive modeling code. In these simulations, the core transport is predicted using a combination of the Multi-Mode (MMM95) anomalous core transport model and the NCLASS neoclassical transport model. The evolution of plasma current, temperature and density is carried out; consequently, the plasma current, temperature and density profiles, as well as other plasma parameters, are obtained as predictions in each simulation. It is found that the temperature and density profiles in these simulations are peaked near the plasma center. In addition, the sawtooth period is studied using the Porcelli model, and it is found that the sawtooth period is approximately the same before, during, and after electron cyclotron resonance heating (ECRH) operation. It is also observed that the mixing radius of sawtooth crashes is reduced during ECRH operation.

  16. A spectral approach integrating functional genomic annotations for coding and noncoding variants.

    PubMed

    Ionita-Laza, Iuliana; McCallum, Kenneth; Xu, Bin; Buxbaum, Joseph D

    2016-02-01

    Over the past few years, substantial effort has been put into the functional annotation of variation in human genome sequences. Such annotations can have a critical role in identifying putatively causal variants for a disease or trait among the abundant natural variation that occurs at a locus of interest. The main challenges in using these various annotations include their large numbers and their diversity. Here we develop an unsupervised approach to integrate these different annotations into one measure of functional importance (Eigen) that, unlike most existing methods, is not based on any labeled training data. We show that the resulting meta-score has better discriminatory ability using disease-associated and putatively benign variants from published studies (in both coding and noncoding regions) than the recently proposed CADD score. Across varied scenarios, the Eigen score performs generally better than any single individual annotation, representing a powerful single functional score that can be incorporated in fine-mapping studies.
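    The abstract does not reproduce the algorithm. As a rough, unofficial illustration of an unsupervised, spectral weighting of annotations (a simplified stand-in for, not a reimplementation of, the published Eigen method), the weights can be taken from the leading eigenvector of the annotation correlation matrix:

```python
import numpy as np

def spectral_meta_score(annotations):
    """annotations: (n_variants, n_annotations) matrix of functional scores.
    Weights come from the leading eigenvector of the annotation-annotation
    correlation matrix, so no labeled training data are needed.  This is a
    schematic stand-in only; the published Eigen method uses a more
    elaborate blockwise model."""
    z = (annotations - annotations.mean(axis=0)) / annotations.std(axis=0)
    corr = np.corrcoef(z, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)
    w = eigvecs[:, -1]                      # leading eigenvector
    w = np.abs(w) / np.abs(w).sum()         # normalized, sign-free weights
    return z @ w                            # one meta-score per variant

scores = spectral_meta_score(np.random.rand(100, 5))
print(scores.shape)   # (100,)
```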

  17. Thought Insertion as a Self-Disturbance: An Integration of Predictive Coding and Phenomenological Approaches

    PubMed Central

    Sterzer, Philipp; Mishara, Aaron L.; Voss, Martin; Heinz, Andreas

    2016-01-01

    Current theories in the framework of hierarchical predictive coding propose that positive symptoms of schizophrenia, such as delusions and hallucinations, arise from an alteration in Bayesian inference, the term inference referring to a process by which learned predictions are used to infer probable causes of sensory data. However, for one particularly striking and frequent symptom of schizophrenia, thought insertion, no plausible account has been proposed in terms of the predictive-coding framework. Here we propose that thought insertion is due to an altered experience of thoughts as coming from “nowhere”, as is already indicated by the early 20th century phenomenological accounts by the early Heidelberg School of psychiatry. These accounts identified thought insertion as one of the self-disturbances (from German: “Ichstörungen”) of schizophrenia and used mescaline as a model-psychosis in healthy individuals to explore the possible mechanisms. The early Heidelberg School (Gruhle, Mayer-Gross, Beringer) first named and defined the self-disturbances, and proposed that thought insertion involves a disruption of the inner connectedness of thoughts and experiences, and a “becoming sensory” of those thoughts experienced as inserted. This account offers a novel way to integrate the phenomenology of thought insertion with the predictive coding framework. We argue that the altered experience of thoughts may be caused by a reduced precision of context-dependent predictions, relative to sensory precision. According to the principles of Bayesian inference, this reduced precision leads to increased prediction-error signals evoked by the neural activity that encodes thoughts. Thus, in analogy with the prediction-error related aberrant salience of external events that has been proposed previously, “internal” events such as thoughts (including volitions, emotions and memories) can also be associated with increased prediction-error signaling and are thus imbued

  18. iRegNet3D: three-dimensional integrated regulatory network for the genomic analysis of coding and non-coding disease mutations.

    PubMed

    Liang, Siqi; Tippens, Nathaniel D; Zhou, Yaoda; Mort, Matthew; Stenson, Peter D; Cooper, David N; Yu, Haiyuan

    2017-01-18

    The mechanistic details of most disease-causing mutations remain poorly explored within the context of regulatory networks. We present a high-resolution three-dimensional integrated regulatory network (iRegNet3D) in the form of a web tool, where we resolve the interfaces of all known transcription factor (TF)-TF, TF-DNA and chromatin-chromatin interactions for the analysis of both coding and non-coding disease-associated mutations to obtain mechanistic insights into their functional impact. Using iRegNet3D, we find that disease-associated mutations may perturb the regulatory network through diverse mechanisms including chromatin looping. iRegNet3D promises to be an indispensable tool in large-scale sequencing and disease association studies.

  19. Integrated coding-aware intra-ONU scheduling for passive optical networks with inter-ONU traffic

    NASA Astrophysics Data System (ADS)

    Li, Yan; Dai, Shifang; Wu, Weiwei

    2016-12-01

    Recently, with the rapid growth of traffic among optical network units (ONUs), network coding (NC) has become an appealing technique for improving the performance of passive optical networks (PONs) carrying such inter-ONU traffic. However, in existing NC-based PONs, NC can only be implemented by buffering inter-ONU traffic at the optical line terminal (OLT) to wait for the coding condition to be established, and such passive, open-ended waiting severely limits the benefit of the NC technique. In this paper, we study integrated coding-aware intra-ONU scheduling, in which the scheduling of inter-ONU traffic within each ONU is undertaken by the OLT to actively facilitate the formation of codable inter-ONU traffic based on the global inter-ONU traffic distribution, so that the performance of PONs with inter-ONU traffic can be significantly improved. We first design two report message patterns and an inter-ONU traffic transmission framework as the basis for the integrated coding-aware intra-ONU scheduling. Three specific scheduling strategies are then proposed to adapt to diverse global inter-ONU traffic distributions. The effectiveness of the work is finally evaluated by both theoretical analysis and simulations.
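    As background for the coding condition mentioned above, the schematic below (ours, not the paper's) shows the basic XOR opportunity the OLT tries to create: two inter-ONU packets travelling in opposite directions between the same ONU pair can be combined into a single downstream broadcast, because each destination already holds the packet it originated:

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length payloads (a real scheme would pad the shorter)."""
    return bytes(x ^ y for x, y in zip(a, b))

def try_code(queue):
    """queue: list of (src_onu, dst_onu, payload) buffered at the OLT.
    Return a coded broadcast if two packets form an A->B / B->A pair,
    else None.  Schematic illustration of the coding condition only."""
    for i, (s1, d1, p1) in enumerate(queue):
        for j, (s2, d2, p2) in enumerate(queue):
            if i < j and s1 == d2 and d1 == s2:
                # Each ONU XORs the broadcast with its own copy to recover
                # the packet destined for it.
                return (s1, s2), xor_bytes(p1, p2)
    return None

print(try_code([("A", "B", b"hello"), ("B", "A", b"world")]))
```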

  20. An Integrated RELAP5-3D and Multiphase CFD Code System Utilizing a Semi Implicit Coupling Technique

    SciTech Connect

    D.L. Aumiller; E.T. Tomlinson; W.L. Weaver

    2001-06-21

    An integrated code system consisting of RELAP5-3D and a multiphase CFD program has been created through the use of a generic semi-implicit coupling algorithm. Unlike previous CFD coupling work, this coupling scheme is numerically stable provided the material Courant limit is not violated in RELAP5-3D or at the coupling locations. The basis for the coupling scheme and details regarding the unique features associated with the application of this technique to a four-field CFD program are presented. Finally, the results of a verification problem are presented. The coupled code system is shown to yield accurate and numerically stable results.
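    The material Courant limit referred to above is, in generic notation (ours, not the report's),

    \[
    \Delta t \;\le\; \min_{i}\frac{\Delta x_{i}}{\lvert u_{i}\rvert},
    \]

    i.e. fluid may not be convected across more than one cell (or coupling interface) per time step; semi-implicit schemes of the RELAP5 type remove the much stricter acoustic time-step limit but retain this material one.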

  1. Self-consistent modeling of DEMOs with 1.5D BALDUR integrated predictive modeling code

    NASA Astrophysics Data System (ADS)

    Wisitsorasak, A.; Somjinda, B.; Promping, J.; Onjun, T.

    2017-02-01

    Self-consistent simulations of four DEMO designs proposed by teams from China, Europe, India, and Korea are carried out using the BALDUR integrated predictive modeling code, in which theory-based models are used for both core transport and boundary conditions. In these simulations, a combination of the NCLASS neoclassical transport model and the multimode (MMM95) anomalous transport model is used to compute core transport. The boundary is taken to be at the top of the pedestal, where the pedestal values are described using a pedestal temperature model based on a combination of magnetic and flow shear stabilization, a pedestal width scaling and an infinite-n ballooning pressure gradient model, together with a pedestal density model based on the line-averaged density. Even though an optimistic scenario is considered, the simulation results suggest that, with the exclusion of ELMs, the fusion gain Q obtained for these reactors is pessimistic compared to the original designs, i.e. 52% of the design value for the Chinese design, 63% for the European design, 22% for the Korean design, and 26% for the Indian design. In addition, the predicted bootstrap current fractions are also found to be lower than in the original designs, reaching fractions of the design values of 0.49 (China), 0.66 (Europe), and 0.58 (India). Furthermore, a sensitivity study finds that increasing the auxiliary heating power and the electron line-averaged density above their design values enhances fusion performance. Finally, the inclusion of sawtooth oscillation effects has a positive impact on the plasma and fusion performance in the European, Indian and Korean DEMOs, but degrades performance in the Chinese DEMO.

  2. Deterministic Multiaxial Creep and Creep Rupture Enhancements for CARES/Creep Integrated Design Code

    NASA Technical Reports Server (NTRS)

    Jadaan, Osama M.

    1998-01-01

    High temperature and long duration applications of monolithic ceramics can place their failure mode in the creep rupture regime. A previous model advanced by the authors described a methodology by which the creep rupture life of a loaded component can be predicted. That model was based on the life fraction damage accumulation rule in association with the modified Monkman-Grant creep rupture criterion. However, that model did not take into account the deteriorating state of the material due to creep damage (e.g., cavitation) as time elapsed. In addition, the material creep parameters used in that life prediction methodology were based on uniaxial creep curves displaying primary and secondary creep behavior, with no tertiary regime. The objective of this paper is to present a creep life prediction methodology based on a modified form of the Kachanov-Rabotnov continuum damage mechanics (CDM) theory. In this theory, the uniaxial creep rate is described in terms of stress, temperature, time, and the current state of material damage. This scalar damage state parameter is basically an abstract measure of the current state of material damage due to creep deformation. The damage rate is assumed to vary with stress, temperature, time, and the current state of damage itself. Multiaxial creep and creep rupture formulations of the CDM approach are presented in this paper. Parameter estimation methodologies based on nonlinear regression analysis are also described for both isothermal constant stress states and anisothermal variable stress conditions. This creep life prediction methodology was preliminarily added to the integrated design code CARES/Creep (Ceramics Analysis and Reliability Evaluation of Structures/Creep), which is a postprocessor program to commercially available finite element analysis (FEA) packages. Two examples, showing comparisons between experimental and predicted creep lives of ceramic specimens, are used to demonstrate the viability of this methodology and the
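    In the uniaxial, isothermal case the coupled Kachanov-Rabotnov equations referred to above are commonly written as (generic form; the constants are illustrative, not the values fitted in CARES/Creep)

    \[
    \dot{\varepsilon} = A\left(\frac{\sigma}{1-\omega}\right)^{n},
    \qquad
    \dot{\omega} = B\,\frac{\sigma^{\chi}}{(1-\omega)^{\phi}},
    \qquad 0 \le \omega < 1,
    \]

    where \(\omega\) is the scalar damage state variable (\(\omega = 0\) undamaged, \(\omega \to 1\) at rupture), \(\sigma\) is the applied stress, and \(A\), \(B\), \(n\), \(\chi\), \(\phi\) are temperature-dependent material parameters.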

  3. XML-Based Generator of C++ Code for Integration With GUIs

    NASA Technical Reports Server (NTRS)

    Hua, Hook; Oyafuso, Fabiano; Klimeck, Gerhard

    2003-01-01

    An open source computer program has been developed to satisfy a need for simplified organization of structured input data for scientific simulation programs. Typically, such input data are parsed from a flat American Standard Code for Information Interchange (ASCII) text file into computational data structures. Also typically, when a graphical user interface (GUI) is used, there is a need to completely duplicate the input information while providing it to the user in a more structured form. Heretofore, this duplication of the input information has entailed duplication of software effort and increased susceptibility to software errors because of the concomitant need to maintain two independent input-handling mechanisms. The present program implements a method in which the input data for a simulation program are completely specified in an Extensible Markup Language (XML)-based text file. A key benefit of XML is that it stores the input data in a structured manner; more importantly, XML allows one not only to store the data but also to describe what each data item is. The XML file thus contains information useful for rendering the data in other applications, and the program generates from it the corresponding data structures in the C++ language for use in the simulation program. In this method, all input data are specified in one place only, and it is easy to integrate the data structures into both the simulation program and the GUI. XML-to-C is useful in two ways: (1) as an executable, it generates the corresponding C++ classes, and (2) as a library, it automatically fills the objects with the input data values.
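    As a toy illustration of the general idea (not the actual XML-to-C tool, whose schema and API are not given in the abstract), a few lines of Python that read a hypothetical parameter description from XML and emit a corresponding C++ struct:

```python
import xml.etree.ElementTree as ET

# Invented schema for illustration only.
XML_SPEC = """<params name="Simulation">
  <param name="nx"   type="int"    default="64"/>
  <param name="temp" type="double" default="300.0"/>
</params>"""

def emit_cpp(xml_text: str) -> str:
    """Generate a C++ struct from a simple XML parameter description."""
    root = ET.fromstring(xml_text)
    lines = [f"struct {root.get('name')} {{"]
    for p in root.findall("param"):
        lines.append(f"  {p.get('type')} {p.get('name')} = {p.get('default')};")
    lines.append("};")
    return "\n".join(lines)

print(emit_cpp(XML_SPEC))
```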

  4. Managing Widely Disparate Code Bases Through Automation of Continuous Integration and Deployment

    NASA Astrophysics Data System (ADS)

    McLaughlin, B. D.; Joshi, T.

    2013-12-01

    NASA EOSDIS tools, services, and service endpoints are widely dispersed across different sub-agencies and sub-organizations. Each of these entities has a different set of skills and widely varying codebases. Some produce sophisticated, well-tested, stable and deployable code, while others are struggling to meet stringent requirements with limited resources. This disparity makes the process of partnering with and deploying code onto the Earthdata platform (https://earthdata.nasa.gov) difficult, even at times impossible. The Earthdata Code Collaborative (ECC) is a project repository and code hosting facility that addresses this problem directly through a three-tiered approach: 1. Provide a standardized set of testing and automation tools for all hosted projects. 2. Regularly report on bugs and features as well as testing coverage and success through Web-based tools. 3. Directly pipeline projects from the ECC into the Earthdata production environment. This session will explain the architecture behind the ECC, including the custom software and 3rd party tools used. It will also detail the process by which decisions were and are being made to arrive at a fully-automated suite of tools and tests that allow any code base to quickly improve its quality and become a candidate for Earthdata inclusion. The session is oriented towards developers, managers, and team members involved in the process of developing, testing, deploying, and ensuring the quality of a code base, whether that code base be tens of millions of lines of code or simply hundreds.

  5. Comparison of experimental pulse-height distributions in germanium detectors with integrated-tiger-series-code predictions

    SciTech Connect

    Beutler, D.E.; Halbleib, J.A. ); Knott, D.P. )

    1989-12-01

    This paper reports pulse-height distributions in two different types of Ge detectors measured for a variety of medium-energy x-ray bremsstrahlung spectra. These measurements have been compared to predictions using the integrated tiger series (ITS) Monte Carlo electron/photon transport code. In general, the authors find excellent agreement between experiments and predictions using no free parameters. These results demonstrate that the ITS codes can predict the combined bremsstrahlung production and energy deposition with good precision (within measurement uncertainties). The one region of disagreement observed occurs for low-energy (<50 keV) photons using low-energy bremsstrahlung spectra. In this case the ITS codes appear to underestimate the produced and/or absorbed radiation by almost an order of magnitude.

  6. Impact of copy number variations burden on coding genome in humans using integrated high resolution arrays.

    PubMed

    Veerappa, Avinash M; Lingaiah, Kusuma; Vishweswaraiah, Sangeetha; Murthy, Megha N; Suresh, Raviraj V; Manjegowda, Dinesh S; Ramachandra, Nallur B

    2014-12-16

    Copy number variations (CNVs) alter the transcriptional and translational levels of genes by disrupting the coding structure and this burden of CNVs seems to be a significant contributor to phenotypic variations. Therefore, it was necessary to assess the complexities of CNV burden on the coding genome. A total of 1715 individuals from 12 populations were used for CNV analysis in the present investigation. Analysis was performed using Affymetrix Genome-Wide Human SNP Array 6.0 chip and CytoScan High-Density arrays. CNVs were observed far more frequently in the coding region than in the non-coding region. CNVs were found to be enriched in the regions containing functional genes (83-96%) compared with the regions containing pseudogenes (4-17%). CNVs across the genome of an individual showed multiple hits across many genes, whose proteins interact physically and function under the same pathway. We identified varying numbers of proteins and degrees of interactions within protein complexes of single individual genomes. This study represents the first draft of a population-specific CNV genes map as well as a cross-populational map. The complex relationship of CNVs on genes and their physically interacting partners unravels many complexities involved in phenotype expression. This study identifies four mechanisms contributing to the complexities caused by the presence of multiple CNVs across many genes in the coding part of the genome.

  7. Context-dependent signal integration by the GLI code: the oncogenic load, pathways, modifiers and implications for cancer therapy.

    PubMed

    Aberger, Fritz; Ruiz I Altaba, Ariel

    2014-09-01

    Canonical Hedgehog (HH) signaling leads to the regulation of the GLI code: the sum of all positive and negative functions of all GLI proteins. In humans, the three GLI factors encode context-dependent activities with GLI1 being mostly an activator and GLI3 often a repressor. Modulation of GLI activity occurs at multiple levels, including by co-factors and by direct modification of GLI structure. Surprisingly, the GLI proteins, and thus the GLI code, are also regulated by multiple inputs beyond HH signaling. In normal development and homeostasis these include a multitude of signaling pathways that regulate proto-oncogenes, which boost positive GLI function, as well as tumor suppressors, which restrict positive GLI activity. In cancer, the acquisition of oncogenic mutations and the loss of tumor suppressors - the oncogenic load - regulates the GLI code toward progressively more activating states. The fine and reversible balance of GLI activating GLI(A) and GLI repressing GLI(R) states is lost in cancer. Here, the acquisition of GLI(A) levels above a given threshold is predicted to lead to advanced malignant stages. In this review we highlight the concepts of the GLI code, the oncogenic load, the context-dependency of GLI action, and different modes of signaling integration such as that of HH and EGF. Targeting the GLI code directly or indirectly promises therapeutic benefits beyond the direct blockade of individual pathways.

  8. ITS Version 6 : the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.

    SciTech Connect

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2008-04-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 6, the latest version of ITS, contains (1) improvements to the ITS 5.0 codes, and (2) conversion to Fortran 90. The general user friendliness of the software has been enhanced through memory allocation to reduce the need for users to modify and recompile the code.

  9. Creation of fully vectorized FORTRAN code for integrating the movement of dust grains in interplanetary environments

    NASA Technical Reports Server (NTRS)

    Colquitt, Walter

    1989-01-01

    The main objective is to improve the performance of a specific FORTRAN computer code from the Planetary Sciences Division of NASA/Johnson Space Center when used on a modern vectorizing supercomputer. The code is used to calculate orbits of dust grains that separate from comets and asteroids. This code accounts for influences of the sun and 8 planets (neglecting Pluto), solar wind, and solar light pressure including Poynting-Robertson drag. Calculations allow one to study the motion of these particles as they are influenced by the Earth or one of the other planets. Some of these particles become trapped just beyond the Earth for long periods of time. These integer-period resonances range from 3 orbits of the Earth for every 2 orbits of the particle to ratios as high as 14 to 13.
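
    The JSC code itself is not reproduced here, but the perturbing acceleration it describes has a standard textbook form; the sketch below (Python, with beta denoting the assumed ratio of radiation pressure to solar gravity) evaluates solar gravity plus radiation pressure and Poynting-Robertson drag for a grain state vector.

      # Sketch (not the JSC code): solar gravity plus radiation pressure and
      # Poynting-Robertson drag on a dust grain, in the standard form
      #   a = -GM/r^2 * r_hat + beta*GM/r^2 * [(1 - v_r/c) r_hat - v/c]
      import numpy as np

      GM_SUN = 1.32712440018e20   # m^3 s^-2
      C = 2.99792458e8            # m/s

      def acceleration(r_vec, v_vec, beta):
          r = np.linalg.norm(r_vec)
          r_hat = r_vec / r
          v_r = np.dot(v_vec, r_hat)                       # radial velocity
          grav = -GM_SUN / r**2 * r_hat
          rad = beta * GM_SUN / r**2 * ((1.0 - v_r / C) * r_hat - v_vec / C)
          return grav + rad

      # Example: grain with beta = 0.1 at 1 au on an initially circular orbit
      au = 1.495978707e11
      r0 = np.array([au, 0.0, 0.0])
      v0 = np.array([0.0, np.sqrt(GM_SUN / au), 0.0])
      print(acceleration(r0, v0, beta=0.1))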

  10. Use of an Integrated Discrete Fracture Network Code for Stochastic Stability Analyses of Fractured Rock Masses

    NASA Astrophysics Data System (ADS)

    Merrien-Soukatchoff, V.; Korini, T.; Thoraval, A.

    2012-03-01

    The paper presents the Discrete Fracture Network code RESOBLOK, which couples geometrical block system construction and a quick iterative stability analysis in the same package. The deterministic or stochastic geometry of a fractured rock mass can be represented and interactively displayed in 3D using two different fracture generators: one mainly used for hydraulic purposes and another designed to allow block stability evaluation. RESOBLOK has downstream modules that can quickly compute stability (based on limit equilibrium or energy-based analysis), display geometric information and create links to other discrete software. The advantage of the code is that it couples stochastic geometrical representation and a quick iterative stability analysis to allow risk-analysis with or without reinforcement and, for the worst cases, more accurate analysis using stress-strain analysis computer codes. These different aspects are detailed for embankment and underground works.
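
    RESOBLOK's internal formulation is not reproduced here; as a minimal sketch of the kind of per-block limit-equilibrium check such tools iterate over, the following computes the planar-sliding factor of safety of a single block (all input values hypothetical).

      # Sketch: planar-sliding factor of safety for a single rock block,
      # the classic limit-equilibrium check (not RESOBLOK's implementation).
      import math

      def factor_of_safety(weight_kN, dip_deg, friction_deg, cohesion_kPa=0.0, area_m2=0.0):
          psi = math.radians(dip_deg)         # dip of the sliding plane
          phi = math.radians(friction_deg)    # friction angle of the joint
          resisting = cohesion_kPa * area_m2 + weight_kN * math.cos(psi) * math.tan(phi)
          driving = weight_kN * math.sin(psi)
          return resisting / driving

      # Example: 500 kN block on a 40 degree plane with 35 degree friction
      print(round(factor_of_safety(500.0, 40.0, 35.0), 2))   # < 1 means unstable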

  11. Generation of 238U Covariance Matrices by Using the Integral Data Assimilation Technique of the CONRAD Code

    NASA Astrophysics Data System (ADS)

    Privas, E.; Archier, P.; Bernard, D.; De Saint Jean, C.; Destouche, C.; Leconte, P.; Noguère, G.; Peneliau, Y.; Capote, R.

    2016-02-01

    A new IAEA Coordinated Research Project (CRP) aims to test, validate and improve the IRDF library. Among the isotopes of interest, modelling of the 238U capture and fission cross sections represents a challenging task. A new description of the 238U neutron-induced reactions in the fast energy range is in progress within the framework of an IAEA evaluation consortium. The Nuclear Data group of Cadarache participates in this effort utilizing the 238U spectral indices measurements and Post Irradiated Experiments (PIE) carried out in the fast reactors MASURCA (CEA Cadarache) and PHENIX (CEA Marcoule). Such a collection of experimental results provides reliable integral information on the (n,γ) and (n,f) cross sections. This paper presents the Integral Data Assimilation (IDA) technique of the CONRAD code used to propagate the uncertainties of the integral data on the 238U cross sections of interest for dosimetry applications.
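
    CONRAD's exact implementation is not shown in the abstract; as a hedged illustration, integral data assimilation is commonly formulated as a generalized least-squares (Bayesian) update of the model parameters, sketched below with hypothetical numbers.

      # Sketch: generic generalized-least-squares update of model parameters x
      # (e.g., cross-section model parameters) from integral measurements y with
      # sensitivities G = dy/dx; not the actual CONRAD implementation.
      import numpy as np

      def gls_update(x, Mx, y, My, G, t):
          """x: prior parameters, Mx: prior covariance, y: measurements,
          My: measurement covariance, G: sensitivity matrix, t: calculated values."""
          S = G @ Mx @ G.T + My                 # innovation covariance
          K = Mx @ G.T @ np.linalg.inv(S)       # gain
          x_post = x + K @ (y - t)
          Mx_post = Mx - K @ G @ Mx
          return x_post, Mx_post

      # Toy example with two parameters and one integral observation
      x = np.array([1.0, 2.0]); Mx = np.diag([0.04, 0.09])
      G = np.array([[0.5, 1.0]]); t = G @ x
      y = np.array([2.6]); My = np.array([[0.01]])
      print(gls_update(x, Mx, y, My, G, t))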

  12. Design of time-pulse coded optoelectronic neuronal elements for nonlinear transformation and integration

    NASA Astrophysics Data System (ADS)

    Krasilenko, Vladimir G.; Nikolsky, Alexander I.; Lazarev, Alexander A.; Lazareva, Maria V.

    2008-03-01

    The paper demonstrates the relevance of neurophysiologically motivated neuron arrays with flexibly programmable functions and operations, with the possibility of selecting the required accuracy and type of nonlinear transformation and learning. We consider neuron designs and simulation results for multichannel spatio-temporal algebraic accumulation and integration of optical signals. Advantages for nonlinear transformation and summation-integration are shown. The offered circuits are simple and can have intellectual properties such as learning and adaptation. The integrator-neuron is based on CMOS current mirrors and comparators. Performance figures are: power consumption of 100...500 μW, signal period of 0.1...1 ms, input optical signal power of 0.2...20 μW, time delays of less than 1 μs, 2...10 optical signals, integration time of 10...100 signal periods, and an integration error of about 1%. Various modifications of the neuron-integrators with improved performance and for different applications are considered in the paper.

  13. Semantic Integration and Age of Acquisition Effects in Code-Blend Comprehension

    ERIC Educational Resources Information Center

    Giezen, Marcel R.; Emmorey, Karen

    2016-01-01

    Semantic and lexical decision tasks were used to investigate the mechanisms underlying code-blend facilitation: the finding that hearing bimodal bilinguals comprehend signs in American Sign Language (ASL) and spoken English words more quickly when they are presented together simultaneously than when each is presented alone. More robust…

  14. Integrated genome analysis suggests that most conserved non-coding sequences are regulatory factor binding sites

    PubMed Central

    Hemberg, Martin; Gray, Jesse M.; Cloonan, Nicole; Kuersten, Scott; Grimmond, Sean; Greenberg, Michael E.; Kreiman, Gabriel

    2012-01-01

    More than 98% of a typical vertebrate genome does not code for proteins. Although non-coding regions are sprinkled with short (<200 bp) islands of evolutionarily conserved sequences, the function of most of these unannotated conserved islands remains unknown. One possibility is that unannotated conserved islands could encode non-coding RNAs (ncRNAs); alternatively, unannotated conserved islands could serve as promoter-distal regulatory factor binding sites (RFBSs) like enhancers. Here we assess these possibilities by comparing unannotated conserved islands in the human and mouse genomes to transcribed regions and to RFBSs, relying on a detailed case study of one human and one mouse cell type. We define transcribed regions by applying a novel transcript-calling algorithm to RNA-Seq data obtained from total cellular RNA, and we define RFBSs using ChIP-Seq and DNAse-hypersensitivity assays. We find that unannotated conserved islands are four times more likely to coincide with RFBSs than with unannotated ncRNAs. Thousands of conserved RFBSs can be categorized as insulators based on the presence of CTCF or as enhancers based on the presence of p300/CBP and H3K4me1. While many unannotated conserved RFBSs are transcriptionally active to some extent, the transcripts produced tend to be unspliced, non-polyadenylated and expressed at levels 10 to 100-fold lower than annotated coding or ncRNAs. Extending these findings across multiple cell types and tissues, we propose that most conserved non-coding genomic DNA in vertebrate genomes corresponds to promoter-distal regulatory elements. PMID:22684627

  15. Validating a Monotonically-Integrated Large Eddy Simulation Code for Subsonic Jet Acoustics

    NASA Technical Reports Server (NTRS)

    Ingraham, Daniel; Bridges, James

    2017-01-01

    The results of subsonic jet validation cases for the Naval Research Lab's Jet Engine Noise REduction (JENRE) code are reported. Two set points from the Tanna matrix, set point 3 (Ma = 0.5, unheated) and set point 7 (Ma = 0.9, unheated), are attempted on three different meshes. After a brief discussion of the JENRE code and the meshes constructed for this work, the turbulent statistics for the axial velocity are presented and compared to experimental data, with favorable results. Preliminary simulations for set point 23 (Ma = 0.5, Tj/T∞ = 1.764) on one of the meshes are also described. Finally, the proposed configuration for the far-field noise prediction with JENRE's Ffowcs Williams-Hawkings solver is detailed.

  16. Correlation and synchrony transfer in integrate-and-fire neurons: basic properties and consequences for coding.

    PubMed

    Shea-Brown, Eric; Josić, Kresimir; de la Rocha, Jaime; Doiron, Brent

    2008-03-14

    We study how pairs of neurons transfer correlated input currents into correlated spikes. Over rapid time scales, correlation transfer increases with both spike time variability and rate; the dependence on variability disappears at large time scales. This persists for a nonlinear membrane model and for heterogeneous cell pairs, but strong nonmonotonicities follow from refractory effects. We present consequences for population coding and for the encoding of time-varying stimuli.
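
    As an illustration only (not the paper's analysis), the following sketch simulates two leaky integrate-and-fire neurons driven by partially shared noisy input and reports the spike-count correlation over 100 ms windows; all parameter values are arbitrary.

      # Sketch: two leaky integrate-and-fire neurons with partially shared noisy
      # input; the spike-count correlation over counting windows illustrates
      # correlation transfer (illustrative only, not the paper's analysis).
      import numpy as np

      rng = np.random.default_rng(0)
      dt, T = 1e-4, 20.0                 # s
      tau, v_th, v_reset = 0.02, 1.0, 0.0
      mu, sigma, c = 1.1, 0.5, 0.3       # drive, noise amplitude, input correlation

      n = int(T / dt)
      common = rng.standard_normal(n)
      spikes = np.zeros((2, n), dtype=bool)
      v = np.zeros(2)
      for i in range(n):
          for j in range(2):
              private = rng.standard_normal()
              noise = np.sqrt(c) * common[i] + np.sqrt(1 - c) * private
              v[j] += dt / tau * (mu - v[j]) + sigma * np.sqrt(dt / tau) * noise
              if v[j] >= v_th:
                  spikes[j, i] = True
                  v[j] = v_reset

      # Spike-count correlation in 100 ms windows
      win = int(0.1 / dt)
      counts = spikes[:, : n - n % win].reshape(2, -1, win).sum(axis=2)
      print("output correlation:", np.corrcoef(counts)[0, 1])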

  17. NCAD, a database integrating the intrinsic conformational preferences of non-coded amino acids

    PubMed Central

    Revilla-López, Guillem; Torras, Juan; Curcó, David; Casanovas, Jordi; Calaza, M. Isabel; Zanuy, David; Jiménez, Ana I.; Cativiela, Carlos; Nussinov, Ruth; Grodzinski, Piotr; Alemán, Carlos

    2010-01-01

    Peptides and proteins find an ever-increasing number of applications in the biomedical and materials engineering fields. The use of non-proteinogenic amino acids endowed with diverse physicochemical and structural features opens the possibility to design proteins and peptides with novel properties and functions. Moreover, non-proteinogenic residues are particularly useful to control the three-dimensional arrangement of peptidic chains, which is a crucial issue for most applications. However, information regarding such amino acids –also called non-coded, non-canonical or non-standard– is usually scattered among publications specialized in quite diverse fields as well as in patents. Making all these data useful to the scientific community requires new tools and a framework for their assembly and coherent organization. We have successfully compiled, organized and built a database (NCAD, Non-Coded Amino acids Database) containing information about the intrinsic conformational preferences of non-proteinogenic residues determined by quantum mechanical calculations, as well as bibliographic information about their synthesis, physical and spectroscopic characterization, conformational propensities established experimentally, and applications. The architecture of the database is presented in this work together with the first family of non-coded residues included, namely, α-tetrasubstituted α-amino acids. Furthermore, the NCAD usefulness is demonstrated through a test-case application example. PMID:20455555

  18. Orbitofrontal Cortex Signals Expected Outcomes with Predictive Codes When Stable Contingencies Promote the Integration of Reward History.

    PubMed

    Riceberg, Justin S; Shapiro, Matthew L

    2017-02-22

    Memory can inform goal-directed behavior by linking current opportunities to past outcomes. The orbitofrontal cortex (OFC) may guide value-based responses by integrating the history of stimulus-reward associations into expected outcomes, representations of predicted hedonic value and quality. Alternatively, the OFC may rapidly compute flexible "online" reward predictions by associating stimuli with the latest outcome. OFC neurons develop predictive codes when rats learn to associate arbitrary stimuli with outcomes, but the extent to which predictive coding depends on most recent events and the integrated history of rewards is unclear. To investigate how reward history modulates OFC activity, we recorded OFC ensembles as rats performed spatial discriminations that differed only in the number of rewarded trials between goal reversals. The firing rate of single OFC neurons distinguished identical behaviors guided by different goals. When >20 rewarded trials separated goal switches, OFC ensembles developed stable and anticorrelated population vectors that predicted overall choice accuracy and the goal selected in single trials. When <10 rewarded trials separated goal switches, OFC population vectors decorrelated rapidly after each switch, but did not develop anticorrelated firing patterns or predict choice accuracy. The results show that, whereas OFC signals respond rapidly to contingency changes, they predict choices only when reward history is relatively stable, suggesting that consecutive rewarded episodes are needed for OFC computations that integrate reward history into expected outcomes.SIGNIFICANCE STATEMENT Adapting to changing contingencies and making decisions engages the orbitofrontal cortex (OFC). Previous work shows that OFC function can either improve or impair learning depending on reward stability, suggesting that OFC guides behavior optimally when contingencies apply consistently. The mechanisms that link reward history to OFC computations remain

  19. Assessment of Spacecraft Systems Integration Using the Electric Propulsion Interactions Code (EPIC)

    NASA Technical Reports Server (NTRS)

    Mikellides, Ioannis G.; Kuharski, Robert A.; Mandell, Myron J.; Gardner, Barbara M.; Kauffman, William J. (Technical Monitor)

    2002-01-01

    SAIC is currently developing the Electric Propulsion Interactions Code 'EPIC', an interactive computer tool that allows the construction of a 3-D spacecraft model, and the assessment of interactions between its subsystems and the plume from an electric thruster. EPIC unites different computer tools to address the complexity associated with the interaction processes. This paper describes the overall architecture and capability of EPIC including the physics and algorithms that comprise its various components. Results from selected modeling efforts of different spacecraft-thruster systems are also presented.

  20. Are interaural time and level differences represented by independent or integrated codes in the human auditory cortex?

    PubMed

    Edmonds, Barrie A; Krumbholz, Katrin

    2014-02-01

    Sound localization is important for orienting and focusing attention and for segregating sounds from different sources in the environment. In humans, horizontal sound localization mainly relies on interaural differences in sound arrival time and sound level. Despite their perceptual importance, the neural processing of interaural time and level differences (ITDs and ILDs) remains poorly understood. Animal studies suggest that, in the brainstem, ITDs and ILDs are processed independently by different specialized circuits. The aim of the current study was to investigate whether, at higher processing levels, they remain independent or are integrated into a common code of sound laterality. For that, we measured late auditory cortical potentials in response to changes in sound lateralization elicited by perceptually matched changes in ITD and/or ILD. The responses to the ITD and ILD changes exhibited significant morphological differences. At the same time, however, they originated from overlapping areas of the cortex and showed clear evidence for functional coupling. These results suggest that the auditory cortex contains an integrated code of sound laterality, but also retains independent information about ITD and ILD cues. This cue-related information might be used to assess how consistent the cues are, and thus, how likely they would have arisen from the same source.

  1. LORD: a phenotype-genotype semantically integrated biomedical data tool to support rare disease diagnosis coding in health information systems.

    PubMed

    Choquet, Remy; Maaroufi, Meriem; Fonjallaz, Yannick; de Carrara, Albane; Vandenbussche, Pierre-Yves; Dhombres, Ferdinand; Landais, Paul

    Characterizing a rare disease diagnosis for a given patient is often done through experts' networks. It is a complex task that could evolve over time depending on the natural history of the disease and the evolution of the scientific knowledge. Most rare diseases have genetic causes and recent improvements of sequencing techniques contribute to the discovery of many new diseases every year. Diagnosis coding in the rare disease field requires data from multiple knowledge bases to be aggregated in order to offer the clinician a global information space from possible diagnosis to clinical signs (phenotypes) and known genetic mutations (genotype). Nowadays, the major barrier to the coding activity is the lack of consolidation of such information scattered in different thesauri such as Orphanet, OMIM or HPO. The Linking Open data for Rare Diseases (LORD) web portal we developed stands as the first attempt to fill this gap by offering an integrated view of 8,400 rare diseases linked to more than 14,500 signs and 3,270 genes. The application provides a browsing feature to navigate through the relationships between diseases, signs and genes, and some Application Programming Interfaces to help its integration into health information systems in routine use.

  2. LORD: a phenotype-genotype semantically integrated biomedical data tool to support rare disease diagnosis coding in health information systems

    PubMed Central

    Choquet, Remy; Maaroufi, Meriem; Fonjallaz, Yannick; de Carrara, Albane; Vandenbussche, Pierre-Yves; Dhombres, Ferdinand; Landais, Paul

    2015-01-01

    Characterizing a rare disease diagnosis for a given patient is often done through experts' networks. It is a complex task that could evolve over time depending on the natural history of the disease and the evolution of the scientific knowledge. Most rare diseases have genetic causes and recent improvements of sequencing techniques contribute to the discovery of many new diseases every year. Diagnosis coding in the rare disease field requires data from multiple knowledge bases to be aggregated in order to offer the clinician a global information space from possible diagnosis to clinical signs (phenotypes) and known genetic mutations (genotype). Nowadays, the major barrier to the coding activity is the lack of consolidation of such information scattered in different thesauri such as Orphanet, OMIM or HPO. The Linking Open data for Rare Diseases (LORD) web portal we developed stands as the first attempt to fill this gap by offering an integrated view of 8,400 rare diseases linked to more than 14,500 signs and 3,270 genes. The application provides a browsing feature to navigate through the relationships between diseases, signs and genes, and some Application Programming Interfaces to help its integration into health information systems in routine use. PMID:26958175

  3. Integration of Multifidelity Multidisciplinary Computer Codes for Design and Analysis of Supersonic Aircraft

    NASA Technical Reports Server (NTRS)

    Geiselhart, Karl A.; Ozoroski, Lori P.; Fenbert, James W.; Shields, Elwood W.; Li, Wu

    2011-01-01

    This paper documents the development of a conceptual level integrated process for design and analysis of efficient and environmentally acceptable supersonic aircraft. To overcome the technical challenges to achieve this goal, a conceptual design capability which provides users with the ability to examine the integrated solution between all disciplines and facilitates the application of multidiscipline design, analysis, and optimization on a scale greater than previously achieved, is needed. The described capability is both an interactive design environment as well as a high powered optimization system with a unique blend of low, mixed and high-fidelity engineering tools combined together in the software integration framework, ModelCenter. The various modules are described and capabilities of the system are demonstrated. The current limitations and proposed future enhancements are also discussed.

  4. Modeling development of natural multi-sensory integration using neural self-organisation and probabilistic population codes

    NASA Astrophysics Data System (ADS)

    Bauer, Johannes; Dávila-Chacón, Jorge; Wermter, Stefan

    2015-10-01

    Humans and other animals have been shown to perform near-optimally in multi-sensory integration tasks. Probabilistic population codes (PPCs) have been proposed as a mechanism by which optimal integration can be accomplished. Previous approaches have focussed on how neural networks might produce PPCs from sensory input or perform calculations using them, like combining multiple PPCs. Less attention has been given to the question of how the necessary organisation of neurons can arise and how the required knowledge about the input statistics can be learned. In this paper, we propose a model of learning multi-sensory integration based on an unsupervised learning algorithm in which an artificial neural network learns the noise characteristics of each of its sources of input. Our algorithm borrows from the self-organising map the ability to learn latent-variable models of the input and extends it to learning to produce a PPC approximating a probability density function over the latent variable behind its (noisy) input. The neurons in our network are only required to perform simple calculations and we make few assumptions about input noise properties and tuning functions. We report on a neurorobotic experiment in which we apply our algorithm to multi-sensory integration in a humanoid robot to demonstrate its effectiveness and compare it to human multi-sensory integration on the behavioural level. We also show in simulations that our algorithm performs near-optimally under certain plausible conditions, and that it reproduces important aspects of natural multi-sensory integration on the neural level.
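
    The normative rule that such PPC-based integration approximates is the standard inverse-variance weighting of Gaussian cues; the sketch below shows that rule only, not the authors' self-organising learning algorithm.

      # Sketch: the normative rule that probabilistic-population-code integration
      # approximates -- fusing two noisy Gaussian cues by inverse-variance
      # weighting (not the paper's self-organising learning algorithm).
      def fuse(mu1, var1, mu2, var2):
          w1 = (1 / var1) / (1 / var1 + 1 / var2)
          mu = w1 * mu1 + (1 - w1) * mu2
          var = 1 / (1 / var1 + 1 / var2)       # fused precision is the sum of precisions
          return mu, var

      # Example: auditory estimate 10 deg (variance 4), visual estimate 6 deg (variance 1)
      print(fuse(10.0, 4.0, 6.0, 1.0))          # -> (6.8, 0.8)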

  5. Hybrid information privacy system: integration of chaotic neural network and RSA coding

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Willey, Jeff; Lee, Ting N.; Szu, Harold H.

    2005-03-01

    Electronic mail is adopted worldwide; most of it is easily hacked. In this paper, we propose a free, fast and convenient hybrid privacy system to protect email communication. The privacy system is implemented by combining the RSA algorithm with a specific chaotic neural network encryption process. The receiver can decrypt received email as long as it can reproduce the specified chaotic neural network series, the so-called spatial-temporal keys. The chaotic typing and initial seed value of the chaotic neural network series, encrypted by the RSA algorithm, can reproduce the spatial-temporal keys. The encrypted chaotic typing and initial seed value are hidden in a watermark mixed nonlinearly with the message media, wrapped with convolutional error correction codes for wireless 3rd generation cellular phones. The message media can be an arbitrary image. The pattern noise has to be considered during transmission and it could affect/change the spatial-temporal keys. Since any change/modification of the chaotic typing or initial seed value of the chaotic neural network series is not acceptable, the RSA codec system must be robust and fault-tolerant over the wireless channel. The robustness and fault-tolerance properties of chaotic neural networks (CNN) were proved by a field theory of Associative Memory by Szu in 1997. The 1-D chaos-generating nodes from the logistic map with arbitrary negative slope a = p/q, generating the N-shaped sigmoid, were first given by Szu in 1992. In this paper, we simulated the robustness and fault-tolerance properties of CNN under additive noise and pattern noise. We also implement a private version of RSA coding and the chaos encryption process on messages.
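
    The authors' chaotic-neural-network scheme is not reproduced here; the toy sketch below only illustrates the general hybrid idea of a chaotic (logistic-map) keystream whose seed is protected by RSA, using deliberately tiny textbook primes that are not secure.

      # Sketch of the hybrid idea only (not the authors' CNN scheme): a
      # logistic-map keystream XORed with the message, with the map's seed
      # protected by toy RSA.  Tiny primes -- illustration only, not secure.
      def logistic_keystream(seed: float, r: float, nbytes: int) -> bytes:
          x, out = seed, []
          for _ in range(nbytes):
              x = r * x * (1.0 - x)             # logistic map iteration
              out.append(int(x * 256) % 256)
          return bytes(out)

      def xor(data: bytes, key: bytes) -> bytes:
          return bytes(a ^ b for a, b in zip(data, key))

      # Toy RSA: n = p*q, e public, d private (d*e = 1 mod phi)
      p, q, e = 61, 53, 17
      n, phi = p * q, (p - 1) * (q - 1)
      d = pow(e, -1, phi)

      seed_int = 1234                           # encodes the chaotic seed 0.1234
      enc_seed = pow(seed_int, e, n)            # sent alongside the ciphertext
      msg = b"hello"
      cipher = xor(msg, logistic_keystream(seed_int / 10000, 3.99, len(msg)))

      # Receiver recovers the seed with the private key, regenerates the keystream
      rec_seed = pow(enc_seed, d, n)
      print(xor(cipher, logistic_keystream(rec_seed / 10000, 3.99, len(msg))))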

  6. Integral and Separate Effects Tests for Thermal Hydraulics Code Validation for Liquid-Salt Cooled Nuclear Reactors

    SciTech Connect

    Peterson, Per

    2012-10-30

    The objective of the 3-year project was to collect integral effects test (IET) data to validate the RELAP5-3D code and other thermal hydraulics codes for use in predicting the transient thermal hydraulics response of liquid salt cooled reactor systems, including integral transient response for forced and natural circulation operation. The reference system for the project is a modular, 900-MWth Pebble Bed Advanced High Temperature Reactor (PB-AHTR), a specific type of Fluoride salt-cooled High temperature Reactor (FHR). Two experimental facilities were developed for thermal-hydraulic integral effects tests (IETs) and separate effects tests (SETs). The facilities use simulant fluids for the liquid fluoride salts, with very little distortion to the heat transfer and fluid dynamics behavior. The CIET Test Bay facility was designed, built, and operated. IET data for steady state and transient natural circulation was collected. SET data for convective heat transfer in pebble beds and straight channel geometries was collected. The facility continues to be operational and will be used for future experiments, and for component development. The CIET 2 facility is larger in scope, and its construction and operation has a longer timeline than the duration of this grant. The design for the CIET 2 facility has drawn heavily on the experience and data collected on the CIET Test Bay, and it was completed in parallel with operation of the CIET Test Bay. CIET 2 will demonstrate start-up and shut-down transients and control logic, in addition to LOFC and LOHS transients, and buoyant shut down rod operation during transients. Design of the CIET 2 Facility is complete, and engineering drawings have been submitted to an external vendor for outsourced quality controlled construction. CIET 2 construction and operation continue under another NEUP grant. IET data from both CIET facilities is to be used for validation of system codes used for FHR modeling, such as RELAP5-3D. A set of

  7. Integration of the DRAGON5/DONJON5 codes in the SALOME platform for performing multi-physics calculations in nuclear engineering

    NASA Astrophysics Data System (ADS)

    Hébert, Alain

    2014-06-01

    We are presenting the computer science techniques involved in the integration of codes DRAGON5 and DONJON5 in the SALOME platform. This integration brings new capabilities in designing multi-physics computational schemes, with the possibility to couple our reactor physics codes with thermal-hydraulics or thermo-mechanics codes from other organizations. A demonstration is presented where two code components are coupled using the YACS module of SALOME, based on the CORBA protocol. The first component is a full-core 3D steady-state neutronic calculation in a PWR performed using DONJON5. The second component implements a set of 1D thermal-hydraulics calculations, each performed over a single assembly.

  8. Uncertainty evaluation of nuclear reaction model parameters using integral and microscopic measurements. Covariances evaluation with CONRAD code

    NASA Astrophysics Data System (ADS)

    de Saint Jean, C.; Habert, B.; Archier, P.; Noguere, G.; Bernard, D.; Tommasi, J.; Blaise, P.

    2010-10-01

    In the [eV;MeV] energy range, modelling of neutron-induced reactions is based on nuclear reaction models having parameters. Estimation of covariances on cross sections or on nuclear reaction model parameters is a recurrent puzzle in nuclear data evaluation. Nuclear reactor physicists have called for major breakthroughs to assess proper uncertainties to be used in applications. In this paper, mathematical methods developed in the CONRAD code[2] will be presented to explain the treatment of all types of uncertainties, including experimental ones (statistical and systematic), and to propagate them to nuclear reaction model parameters or cross sections. The marginalization procedure will thus be presented using analytical or Monte-Carlo solutions. Furthermore, one major drawback found by reactor physicists is the fact that integral or analytical experiments (reactor mock-up or simple integral experiment, e.g. ICSBEP, …) were not taken into account sufficiently soon in the evaluation process to remove discrepancies. In this paper, we will describe a mathematical framework to properly take this kind of information into account.

  9. Genomic integration of the full-length dystrophin coding sequence in Duchenne muscular dystrophy induced pluripotent stem cells.

    PubMed

    Farruggio, Alfonso P; Bhakta, Mital S; du Bois, Haley; Ma, Julia; P Calos, Michele

    2017-04-01

    Plasmid vectors that express the full-length human dystrophin coding sequence in human cells were developed. Dystrophin, the protein mutated in Duchenne muscular dystrophy, is extraordinarily large, providing challenges for cloning and plasmid production in Escherichia coli. The authors expressed dystrophin from the strong, widely expressed CAG promoter, along with co-transcribed luciferase and mCherry marker genes useful for tracking plasmid expression. Introns were added at the 3' and 5' ends of the dystrophin sequence to prevent translation in E. coli, resulting in improved plasmid yield. Stability and yield were further improved by employing a lower-copy number plasmid origin of replication. The dystrophin plasmids also carried an attB site recognized by phage phiC31 integrase, enabling the plasmids to be integrated into the human genome at preferred locations by phiC31 integrase. The authors demonstrated single-copy integration of plasmid DNA into the genome and production of human dystrophin in the human 293 cell line, as well as in induced pluripotent stem cells derived from a patient with Duchenne muscular dystrophy. Plasmid-mediated dystrophin expression was also demonstrated in mouse muscle. The dystrophin expression plasmids described here will be useful in cell and gene therapy studies aimed at ameliorating Duchenne muscular dystrophy.

  10. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    SciTech Connect

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2006-07-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with general

  11. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Code Reference Manual

    SciTech Connect

    C. L. Smith; K. J. Kvarfordt; S. T. Wood

    2008-08-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer. SAPHIRE is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). The INL's primary role in this project is that of software developer. However, the INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users comprised of a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. SAPHIRE can be used to model a complex system’s response to initiating events, quantify associated damage outcome frequencies, and identify important contributors to this damage (Level 1 PRA) and to analyze containment performance during a severe accident and quantify radioactive releases (Level 2 PRA). It can be used for a PRA evaluating a variety of operating conditions, for example, for a nuclear reactor at full power, low power, or at shutdown conditions. Furthermore, SAPHIRE can be used to analyze both internal and external initiating events and has special features for transforming models built for internal event analysis to models for external event analysis. It can also be used in a limited manner to quantify risk in terms of release consequences to both the public and the environment (Level 3 PRA). SAPHIRE includes a separate module called the Graphical Evaluation Module (GEM). GEM provides a highly specialized user interface with SAPHIRE that automates SAPHIRE process steps for evaluating operational events at commercial nuclear power plants. Using GEM, an analyst can estimate the risk associated with operational events in a very efficient and expeditious manner. This reference guide will introduce the SAPHIRE Version 7.0 software. A brief discussion of the purpose and history of the software is included along with general

  12. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  13. Automated model integration at source code level: An approach for implementing models into the NASA Land Information System

    NASA Astrophysics Data System (ADS)

    Wang, S.; Peters-Lidard, C. D.; Mocko, D. M.; Kumar, S.; Nearing, G. S.; Arsenault, K. R.; Geiger, J. V.

    2014-12-01

    Model integration bridges the data flow between modeling frameworks and models. However, models usually do not fit directly into a particular modeling environment, if not designed for it. An example includes implementing different types of models into the NASA Land Information System (LIS), a software framework for land-surface modeling and data assimilation. Model implementation requires scientific knowledge and software expertise and may take a developer months to learn LIS and model software structure. Debugging and testing of the model implementation is also time-consuming due to not fully understanding LIS or the model. This time spent is costly for research and operational projects. To address this issue, an approach has been developed to automate model integration into LIS. With this in mind, a general model interface was designed to retrieve forcing inputs, parameters, and state variables needed by the model and to provide state variables and outputs back to LIS. Every model can be wrapped to comply with the interface, usually with a FORTRAN 90 subroutine. Development efforts need only knowledge of the model and basic programming skills. With such wrappers, the logic is the same for implementing all models. Code templates defined for this general model interface could be re-used with any specific model. Therefore, the model implementation can be done automatically. An automated model implementation toolkit was developed with Microsoft Excel and its built-in VBA language. It allows model specifications in three worksheets and contains FORTRAN 90 code templates in VBA programs. According to the model specification, the toolkit generates data structures and procedures within FORTRAN modules and subroutines, which transfer data between LIS and the model wrapper. Model implementation is standardized, and about 80 - 90% of the development load is reduced. In this presentation, the automated model implementation approach is described along with LIS programming
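
    The actual toolkit is built with Excel/VBA; as a language-neutral illustration of the template idea, the sketch below (Python, with a hypothetical model specification and wrapper naming) generates a FORTRAN wrapper skeleton from a model spec.

      # Sketch of the code-template idea (the real toolkit uses Excel/VBA):
      # generate a Fortran wrapper subroutine from a hypothetical model spec.
      SPEC = {
          "model": "simplelsm",
          "forcings": ["tair", "rainf"],
          "states": ["soil_moisture", "soil_temp"],
      }

      TEMPLATE = """subroutine {model}_wrapper(n, {args})
        implicit none
        integer, intent(in) :: n
      {decls}
        call {model}_run(n, {args})
      end subroutine {model}_wrapper"""

      def generate(spec):
          names = spec["forcings"] + spec["states"]
          decls = "\n".join(f"  real, dimension(n) :: {v}" for v in names)
          return TEMPLATE.format(model=spec["model"], args=", ".join(names), decls=decls)

      print(generate(SPEC))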

  14. Challenge problem and milestones for : Nuclear Energy Advanced Modeling and Simulation (NEAMS) waste Integrated Performance and Safety Codes (IPSC).

    SciTech Connect

    Freeze, Geoffrey A.; Wang, Yifeng; Howard, Robert; McNeish, Jerry A.; Schultz, Peter Andrew; Arguello, Jose Guadalupe, Jr.

    2010-09-01

    This report describes the specification of a challenge problem and associated challenge milestones for the Waste Integrated Performance and Safety Codes (IPSC) supporting the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The NEAMS challenge problems are designed to demonstrate proof of concept and progress towards IPSC goals. The goal of the Waste IPSC is to develop an integrated suite of modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. To demonstrate proof of concept and progress towards these goals and requirements, a Waste IPSC challenge problem is specified that includes coupled thermal-hydrologic-chemical-mechanical (THCM) processes that describe (1) the degradation of a borosilicate glass waste form and the corresponding mobilization of radionuclides (i.e., the processes that produce the radionuclide source term), (2) the associated near-field physical and chemical environment for waste emplacement within a salt formation, and (3) radionuclide transport in the near field (i.e., through the engineered components - waste form, waste package, and backfill - and the immediately adjacent salt). The initial details of a set of challenge milestones that collectively comprise the full challenge problem are also specified.

  15. ASKI: A modular toolbox for scattering-integral-based seismic full waveform inversion and sensitivity analysis utilizing external forward codes

    NASA Astrophysics Data System (ADS)

    Schumacher, Florian; Friederich, Wolfgang

    Due to increasing computational resources, the development of new numerically demanding methods and software for imaging Earth's interior remains of high interest in Earth sciences. Here, we give a description from a user's and programmer's perspective of the highly modular, flexible and extendable software package ASKI-Analysis of Sensitivity and Kernel Inversion-recently developed for iterative scattering-integral-based seismic full waveform inversion. In ASKI, the three fundamental steps of solving the seismic forward problem, computing waveform sensitivity kernels and deriving a model update are solved by independent software programs that interact via file output/input only. Furthermore, the spatial discretizations of the model space used for solving the seismic forward problem and for deriving model updates, respectively, are kept completely independent. For this reason, ASKI does not contain a specific forward solver but instead provides a general interface to established community wave propagation codes. Moreover, the third fundamental step of deriving a model update can be repeated at relatively low costs applying different kinds of model regularization or re-selecting/weighting the inverted dataset without need to re-solve the forward problem or re-compute the kernels. Additionally, ASKI offers the user sensitivity and resolution analysis tools based on the full sensitivity matrix and allows to compose customized workflows in a consistent computational environment. ASKI is written in modern Fortran and Python, it is well documented and freely available under terms of the GNU General Public License (http://www.rub.de/aski).

  16. ITS version 5.0 : the integrated TIGER series of coupled electron/photon Monte Carlo transport codes.

    SciTech Connect

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2004-06-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, and (3) parallel implementations of all ITS codes. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  17. Design and Simulation of Material-Integrated Distributed Sensor Processing with a Code-Based Agent Platform and Mobile Multi-Agent Systems

    PubMed Central

    Bosse, Stefan

    2015-01-01

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks in simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strong heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550

  18. Design and simulation of material-integrated distributed sensor processing with a code-based agent platform and mobile multi-agent systems.

    PubMed

    Bosse, Stefan

    2015-02-16

    Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks in simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material-integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to a small-sized agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri-nets. The agent platform can be implemented in software, too, offering compatibility at the operational and code level, supporting agent processing in strong heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques.

  19. A 2×2 imaging MIMO system based on LED Visible Light Communications employing space balanced coding and integrated PIN array reception

    NASA Astrophysics Data System (ADS)

    Li, Jiehui; Xu, Yinfan; Shi, Jianyang; Wang, Yuanquan; Ji, Xinming; Ou, Haiyan; Chi, Nan

    2016-05-01

    In this paper, we proposed a 2×2 imaging Multi-Input Multi-Output (MIMO)-Visible Light Communication (VLC) system by employing Space Balanced Coding (SBC) based on two RGB LEDs and integrated PIN array reception. We experimentally demonstrated 1.4-Gbit/s VLC transmission at a distance of 2.5 m. The proposed imaging system not only overcomes the limitation of bandwidth existing in LEDs, but also can reject the second-order nonlinearity distortion. It turned out to be very promising to use integrated antennas in the VLC system in the future.

  20. Self-Powered Forward Error-Correcting Biosensor Based on Integration of Paper-Based Microfluidics and Self-Assembled Quick Response Codes.

    PubMed

    Yuan, Mingquan; Liu, Keng-Ku; Singamaneni, Srikanth; Chakrabartty, Shantanu

    2016-10-01

    This paper extends our previous work on silver-enhancement based self-assembling structures for designing reliable, self-powered biosensors with forward error correcting (FEC) capability. At the core of the proposed approach is the integration of paper-based microfluidics with quick response (QR) codes that can be optically scanned using a smart-phone. The scanned information is first decoded to obtain the location of a web-server which further processes the self-assembled QR image to determine the concentration of target analytes. The integration substrate for the proposed FEC biosensor is polyethylene and the patterning of the QR code on the substrate has been achieved using a combination of low-cost ink-jet printing and a regular ballpoint dispensing pen. A paper-based microfluidics channel has been integrated underneath the substrate for acquiring, mixing and flowing the sample to areas on the substrate where different parts of the code can self-assemble in presence of immobilized gold nanorods. In this paper we demonstrate the proof-of-concept detection using prototypes of QR encoded FEC biosensors.

  1. Self-Powered Forward Error-Correcting Biosensor Based on Integration of Paper-Based Microfluidics and Self-Assembled Quick Response Codes.

    PubMed

    Yuan, Mingquan; Liu, Keng-Ku; Singamaneni, Srikanth; Chakrabartty, Shantanu

    2016-09-06

    This paper extends our previous work on silver-enhancement based self-assembling structures for designing reliable, self-powered biosensors with forward error correcting (FEC) capability. At the core of the proposed approach is the integration of paper-based microfluidics with quick response (QR) codes that can be optically scanned using a smart-phone. The scanned information is first decoded to obtain the location of a web-server which further processes the self-assembled QR image to determine the concentration of target analytes. The integration substrate for the proposed FEC biosensor is polyethylene and the patterning of the QR code on the substrate has been achieved using a combination of low-cost ink-jet printing and a regular ballpoint dispensing pen. A paper-based microfluidics channel has been integrated underneath the substrate for acquiring, mixing and flowing the sample to areas on the substrate where different parts of the code can self-assemble in presence of immobilized gold nanorods. In this paper we demonstrate the proof-of-concept detection using prototypes of QR encoded FEC biosensors.

  2. Abiding by codes of ethics and codes of conduct imposed on members of learned and professional geoscience institutions and - a tiresome formality or a win-win for scientific and professional integrity and protection of the public?

    NASA Astrophysics Data System (ADS)

    Allington, Ruth; Fernandez, Isabel

    2015-04-01

    In 2012, the International Union of Geological Sciences (IUGS) formed the Task Group on Global Geoscience Professionalism ("TG-GGP") to bring together the expanding network of organizations around the world whose primary purpose is self-regulation of geoscience practice. An important part of TG-GGP's mission is to foster a shared understanding of aspects of professionalism relevant to individual scientists and applied practitioners working in one or more sectors of the wider geoscience profession (e.g. research, teaching, industry, geoscience communication and government service). These may be summarised as competence, ethical practice, and professional, technical and scientific accountability. Legal regimes for the oversight of registered or licensed professionals differ around the world and in many jurisdictions there is no registration or licensure with the force of law. However, principles of peer-based self-regulation universally apply. This makes professional geoscience organisations ideal settings within which geoscientists can debate and agree what society should expect of us in the range of roles we fulfil. They can provide the structures needed to best determine what expectations, in the public interest, are appropriate for us collectively to impose on each other. They can also provide the structures for the development of associated procedures necessary to identify and discipline those who do not live up to the expected standards of behaviour established by consensus between peers. Codes of Ethics (sometimes referred to as Codes of Conduct), to which all members of all major professional and/or scientific geoscience organizations are bound (whether or not they are registered or hold professional qualifications awarded by those organisations), incorporate such traditional tenets as: safeguarding the health and safety of the public, scientific integrity, and fairness. Codes also increasingly include obligations concerning welfare of the environment and

  3. Model-Based Least Squares Reconstruction of Coded Source Neutron Radiographs: Integrating the ORNL HFIR CG1D Source Model

    SciTech Connect

    Santos-Villalobos, Hector J; Gregor, Jens; Bingham, Philip R

    2014-01-01

    At present, neutron sources cannot be fabricated small and powerful enough to achieve high-resolution radiography while maintaining adequate flux. One solution is to employ computational imaging techniques such as a Magnified Coded Source Imaging (CSI) system. A coded mask is placed between the neutron source and the object. The system resolution is increased by reducing the size of the mask holes, and the flux is increased by increasing the size of the coded mask and/or the number of holes. One limitation of such a system is that the resolution of current state-of-the-art scintillator-based detectors caps out at around 50 µm. To overcome this challenge, the coded mask and object are magnified by making the distance from the coded mask to the object much smaller than the distance from the object to the detector. In previous work, we have shown via synthetic experiments that our least squares method outperforms other methods in image quality and reconstruction precision because of the modeling of the CSI system components. However, the validation experiments were limited to simplistic neutron sources. In this work, we aim to model the flux distribution of a real neutron source and incorporate such a model in our least squares computational system. We provide a full description of the methodology used to characterize the neutron source and validate the method with synthetic experiments.
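
    The model-based least-squares idea, folding the coded mask, magnification and source model into a linear forward operator A and solving min_f ||A f - g||^2, can be illustrated generically. The toy 1-D operator below is a plain circular-convolution stand-in, not the authors' system matrix or the ORNL HFIR CG1D source model.

      # Toy 1-D illustration of model-based least-squares reconstruction for a coded
      # source: measurements g = A f + noise, where A encodes the (stand-in) mask.
      import numpy as np
      from scipy.sparse.linalg import lsqr

      rng = np.random.default_rng(0)
      n = 128
      mask = rng.integers(0, 2, size=15).astype(float)     # stand-in coded-mask pattern

      # Forward operator: circular correlation of the object with the mask pattern.
      A = np.zeros((n, n))
      for i in range(n):
          for k, m in enumerate(mask):
              A[i, (i + k) % n] = m

      f_true = np.zeros(n)
      f_true[40:60] = 1.0                                  # simple test object
      g = A @ f_true + 0.01 * rng.standard_normal(n)       # noisy coded measurement

      f_rec = lsqr(A, g, damp=0.05)[0]                     # damped least-squares solve
      rel_err = np.linalg.norm(f_rec - f_true) / np.linalg.norm(f_true)
      print("relative reconstruction error:", round(float(rel_err), 3))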

  4. IN-MACA-MCC: Integrated Multiple Attractor Cellular Automata with Modified Clonal Classifier for Human Protein Coding and Promoter Prediction.

    PubMed

    Pokkuluri, Kiran Sree; Inampudi, Ramesh Babu; Nedunuri, S S S N Usha Devi

    2014-01-01

    Protein coding and promoter region predictions are important challenges in bioinformatics (Attwood and Teresa, 2000). The identification of these regions plays a crucial role in understanding genes. Many novel computational and mathematical methods have been introduced, and existing methods continue to be refined, for predicting the two regions separately; still, there is scope for improvement. We propose a classifier built with MACA (multiple attractor cellular automata) and MCC (modified clonal classifier) to predict both regions with a single classifier. The proposed classifier is trained and tested with the Fickett and Tung (1992) datasets for protein coding region prediction for DNA sequences of lengths 54, 108, and 162, and with MMCRI datasets for DNA sequences of lengths 252 and 354. It is also trained and tested with promoter sequences from the DBTSS (Yamashita et al., 2006) dataset and non-promoters from the EID (Saxonov et al., 2000) and UTRdb (Pesole et al., 2002) datasets. The proposed model can predict both regions with an average accuracy of 90.5% for promoter and 89.6% for protein coding region predictions. The specificity and sensitivity values of promoter and protein coding region predictions are 0.89 and 0.92, respectively.

  5. Polar Codes

    DTIC Science & Technology

    2014-12-01

    This report evaluates polar codes against other forward error correction methods: a turbo code, a low-density parity check (LDPC) code, a Reed–Solomon code, and three convolutional codes. LDPC FEC codes are among the most common in civilian systems, and the Navy is planning to use LDPC for some future systems.
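
    For orientation, the defining operation of a polar code is the recursive transform built from the 2 x 2 kernel [[1, 0], [1, 1]]. The sketch below encodes one small block; the frozen-bit positions are arbitrary placeholders, whereas a real design (including the codes evaluated in this report) selects them from channel polarization.

      # Minimal polar encoder: x = u * F^(kron n) over GF(2), with F = [[1, 0], [1, 1]].
      # Frozen-bit positions are placeholders; a real design picks them from channel
      # polarization (e.g. Bhattacharyya parameters), as in Arikan's construction.
      import numpy as np

      def polar_transform(u):
          """Apply the polar transform (Kronecker power of F) to a 0/1 vector."""
          x = np.array(u, dtype=int) % 2
          step = 1
          while step < len(x):
              for i in range(0, len(x), 2 * step):
                  x[i:i + step] ^= x[i + step:i + 2 * step]   # butterfly stage
              step *= 2
          return x

      def polar_encode(info_bits, frozen_positions, block_len=8):
          u = np.zeros(block_len, dtype=int)
          data_positions = [i for i in range(block_len) if i not in frozen_positions]
          u[data_positions] = info_bits
          return polar_transform(u)

      # Example: block length 8, four (arbitrarily chosen) frozen positions, four data bits.
      print(polar_encode([1, 0, 1, 1], frozen_positions={0, 1, 2, 4}))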

  6. Genome-wide conserved non-coding microsatellite (CNMS) marker-based integrative genetical genomics for quantitative dissection of seed weight in chickpea.

    PubMed

    Bajaj, Deepak; Saxena, Maneesha S; Kujur, Alice; Das, Shouvik; Badoni, Saurabh; Tripathi, Shailesh; Upadhyaya, Hari D; Gowda, C L L; Sharma, Shivali; Singh, Sube; Tyagi, Akhilesh K; Parida, Swarup K

    2015-03-01

    Phylogenetic footprinting identified 666 genome-wide paralogous and orthologous CNMS (conserved non-coding microsatellite) markers from 5'-untranslated and regulatory regions (URRs) of 603 protein-coding chickpea genes. The (CT)n and (GA)n CNMS carrying CTRMCAMV35S and GAGA8BKN3 regulatory elements, respectively, are abundant in the chickpea genome. The mapped genic CNMS markers with robust amplification efficiencies (94.7%) detected higher intraspecific polymorphic potential (37.6%) among genotypes, implying their immense utility in chickpea breeding and genetic analyses. Seventeen differentially expressed CNMS marker-associated genes showing strong preferential and seed tissue/developmental stage-specific expression in contrasting genotypes were selected to narrow down the gene targets underlying seed weight quantitative trait loci (QTLs)/eQTLs (expression QTLs) through integrative genetical genomics. The integration of transcript profiling with seed weight QTL/eQTL mapping, molecular haplotyping, and association analyses identified potential molecular tags (GAGA8BKN3 and RAV1AAT regulatory elements and alleles/haplotypes) in the LOB-domain-containing protein- and KANADI protein-encoding transcription factor genes controlling the cis-regulated expression for seed weight in the chickpea. This emphasizes the potential of CNMS marker-based integrative genetical genomics for the quantitative genetic dissection of complex seed weight in chickpea.

  7. A framework for identifying genotypic information from clinical records: exploiting integrated ontology structures to transfer annotations between ICD codes and Gene Ontologies.

    PubMed

    Hashemikhabir, Seyedsasan; Xia, Ran; Xiang, Yang; Janga, Sarath

    2015-09-18

    Although some methods are proposed for automatic ontology generation, none of them address the issue of integrating large-scale heterogeneous biomedical ontologies. We propose a novel approach for integrating various types of ontologies efficiently and apply it to integrate International Classification of Diseases, Ninth Revision, Clinical Modification (ICD9CM) and Gene Ontologies (GO). This approach is one of the early attempts to quantify the associations among clinical terms (e.g. ICD9 codes) based on their corresponding genomic relationships. We reconstructed a merged tree for a partial set of GO and ICD9 codes and measured the performance of this tree in terms of associations' relevance by comparing them with two well-known disease-gene datasets (i.e. MalaCards and Disease Ontology). Furthermore, we compared the genomic-based ICD9 associations to temporal relationships between them from electronic health records. Our analysis shows promising associations supported by both comparisons suggesting a high reliability. We also manually analyzed several significant associations and found promising support from literature.
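
    The central quantity in this kind of integration, an association score between clinical codes derived from their genomic relationships, can be illustrated with a toy gene-set overlap measure. The gene sets below are invented placeholders, and a plain Jaccard index stands in for the authors' merged ICD9/GO tree method.

      # Toy sketch: score the "genomic" association between two clinical codes by the
      # overlap of gene sets annotated to them. Gene sets here are illustrative
      # placeholders; the paper builds a merged ICD9/GO tree rather than a Jaccard index.
      def jaccard(a, b):
          a, b = set(a), set(b)
          return len(a & b) / len(a | b) if (a | b) else 0.0

      # Hypothetical gene annotations for two ICD9-CM codes (illustration only).
      icd9_genes = {
          "250.0": {"INS", "IRS1", "TCF7L2", "PPARG"},
          "278.0": {"LEP", "MC4R", "PPARG", "FTO"},
      }

      score = jaccard(icd9_genes["250.0"], icd9_genes["278.0"])
      print(f"gene-based association between 250.0 and 278.0: {score:.2f}")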

  8. iTOUGH2-IFC: An Integrated Flow Code in Support of Nagra's Probabilistic Safety Assessment:--User's Guide and Model Description

    SciTech Connect

    Finsterle, Stefan A.

    2009-01-02

    This document describes the development and use of the Integrated Flow Code (IFC), a numerical code and related model to be used for the simulation of time-dependent, two-phase flow in the near field and geosphere of a gas-generating nuclear waste repository system located in an initially fully water-saturated claystone (Opalinus Clay) in Switzerland. The development of the code and model was supported by the Swiss National Cooperative for the Disposal of Radioactive Waste (Nagra), Wettingen, Switzerland. Gas generation (mainly H₂, but also CH₄ and CO₂) may affect repository performance by (1) compromising the engineered barriers through excessive pressure build-up, (2) displacing potentially contaminated pore water, (3) releasing radioactive gases (e.g., those containing ¹⁴C and ³H), (4) changing hydrogeologic properties of the engineered barrier system and the host rock, and (5) altering the groundwater flow field and thus radionuclide migration paths. The IFC aims at providing water and gas flow fields as the basis for the subsequent radionuclide transport simulations, which are performed by the radionuclide transport code (RTC). The IFC, RTC and a waste-dissolution and near-field transport model (STMAN) are part of the Integrated Radionuclide Release Code (IRRC), which integrates all safety-relevant features, events, and processes (FEPs). The IRRC is embedded into a Probabilistic Safety Assessment (PSA) computational tool that (1) evaluates alternative conceptual models, scenarios, and disruptive events, and (2) performs Monte-Carlo sampling to account for parametric uncertainties. The preliminary probabilistic safety assessment concept and the role of the IFC are visualized in Figure 1. The IFC was developed based on Nagra's PSA concept. Specifically, as many phenomena as possible are to be directly simulated using a (simplified) process model, which is at the core of the IRRC model. Uncertainty evaluation (scenario uncertainty

  9. Summary Report for ASC L2 Milestone #4782: Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes

    SciTech Connect

    Neely, J. R.; Hornung, R.; Black, A.; Robinson, P.

    2014-09-29

    This document serves as a detailed companion to the powerpoint slides presented as part of the ASC L2 milestone review for Integrated Codes milestone #4782 titled “Assess Newly Emerging Programming and Memory Models for Advanced Architectures on Integrated Codes”, due on 9/30/2014, and presented for formal program review on 9/12/2014. The program review committee is represented by Mike Zika (A Program Project Lead for Kull), Brian Pudliner (B Program Project Lead for Ares), Scott Futral (DEG Group Lead in LC), and Mike Glass (Sierra Project Lead at Sandia). This document, along with the presentation materials, and a letter of completion signed by the review committee will act as proof of completion for this milestone.

  10. Clinical coding. Code breakers.

    PubMed

    Mathieson, Steve

    2005-02-24

    --The advent of payment by results has seen the role of the clinical coder pushed to the fore in England. --Examinations for a clinical coding qualification began in 1999. In 2004, approximately 200 people took the qualification. --Trusts are attracting people to the role by offering training from scratch or through modern apprenticeships.

  11. Evaluation of the heat transfer module (FAHT) of Failure Analysis Nonlinear Thermal And Structural Integrated Code (FANTASTIC)

    NASA Technical Reports Server (NTRS)

    Keyhani, Majid

    1989-01-01

    The heat transfer module of the FANTASTIC Code (FAHT) is studied and evaluated to the extent possible during the ten-week duration of this project. A brief background of previous studies is given, and the governing equations as modeled in FAHT are discussed. FAHT's capabilities and limitations based on these equations and its coding methodology are explained in detail. It is established that, with an improper choice of element size and time step, FAHT's temperature-field prediction at some nodes will fall below the initial condition. The source of this unrealistic temperature prediction is identified, and a procedure is proposed for avoiding this phenomenon. It is further shown that the proposed procedure will converge to an accurate prediction upon mesh refinement. Unfortunately, due to lack of time, FAHT's ability to accurately account for pyrolysis and surface ablation has not been verified. Therefore, at the present time it can be stated with confidence that FAHT can accurately predict the temperature field for a transient, multi-dimensional, orthotropic material with directionally dependent, variable properties and nonlinear boundary conditions. Such a prediction will provide an upper limit for the temperature field in an ablating, decomposing nozzle liner. The pore pressure field, however, will not be known.
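
    The "below the initial condition" artifact attributed here to an improper choice of element size and time step is a generic property of consistent-mass finite-element heat conduction: if the time step is too small for the mesh, the first backward-Euler step next to a suddenly heated boundary undershoots. The 1-D sketch below reproduces the effect generically; it is not FAHT and uses none of its models.

      # Generic 1-D demonstration (not FAHT) of the artifact discussed above: with a
      # consistent mass matrix and a time step that is too small for the element size,
      # backward-Euler FE heat conduction predicts interior temperatures below the
      # initial condition next to a suddenly heated boundary.
      import numpy as np

      def first_step_min_temperature(n_elem=20, dt_factor=0.01):
          h = 1.0 / n_elem
          dt = dt_factor * h * h            # time step relative to h^2 (unit diffusivity)
          n = n_elem - 1                    # interior nodes; T = 1 at left wall, T = 0 at right
          M = np.zeros((n, n)); K = np.zeros((n, n))
          for i in range(n):
              M[i, i] = 4 * h / 6
              K[i, i] = 2 / h
              if i + 1 < n:
                  M[i, i + 1] = M[i + 1, i] = h / 6
                  K[i, i + 1] = K[i + 1, i] = -1 / h
          A = M + dt * K
          b = np.zeros(n)
          b[0] = -(h / 6 - dt / h)          # hot Dirichlet node (T = 1) moved to the right-hand side
          T = np.linalg.solve(A, b)         # initial interior temperature was 0 everywhere
          return T.min()

      print("too-small time step :", first_step_min_temperature(dt_factor=0.01))  # negative: undershoot
      print("adequate time step  :", first_step_min_temperature(dt_factor=1.0))   # stays >= 0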

  12. Towards a Local Integration of Theories: Codes and Praxeologies in the Case of Computer-Based Instruction

    ERIC Educational Resources Information Center

    Gellert, Uwe; Barbe, Joaquim; Espinoza, Lorena

    2013-01-01

    We report on the development of a "language of description" that facilitates an integrated analysis of classroom video data in terms of the quality of the teaching-learning process and the students' access to valued forms of mathematical knowledge. Our research setting is the introduction of software for teachers for improving the mathematical…

  13. Linking deregulation of non-coding RNA to the core pathophysiology of Alzheimer's disease: an integrative review.

    PubMed

    Millan, Mark J

    2017-03-17

    The human genome encodes a vast repertoire of non-protein-coding RNAs (ncRNAs), some specific to the brain. MicroRNAs, which interfere with the translation of target mRNAs, are of particular interest since their deregulation has been implicated in neurodegenerative disorders like Alzheimer's disease (AD). However, it remains challenging to link the complex body of observations on miRNAs and AD into a coherent framework. Using extensive graphical support, this article discusses how a diverse panoply of miRNAs convergently and divergently impact (and are impacted by) core pathophysiological processes underlying AD: neuroinflammation and oxidative stress; aberrant generation of β-amyloid-42 (Aβ42); anomalies in the production, cleavage and post-translational marking of Tau; impaired clearance of Aβ42 and Tau; perturbation of axonal organisation; disruption of synaptic plasticity; endoplasmic reticulum stress and the unfolded protein response; mitochondrial dysfunction; aberrant induction of cell cycle re-entry; and apoptotic loss of neurons. Intriguingly, some classes of miRNA provoke these cellular anomalies, whereas others act in a counter-regulatory, protective mode. Moreover, changes in levels of certain species of miRNA are a consequence of the above-mentioned anomalies. In addition to miRNAs, circular RNAs, piwiRNAs, long non-coding RNAs and other types of ncRNA are being increasingly implicated in AD. Overall, a complex mesh of deregulated and multi-tasking ncRNAs reciprocally interacts with pathophysiological mechanisms underlying AD. Alterations in ncRNAs can be detected in CSF and the circulation as well as the brain, and are showing promise as biomarkers, with the ultimate goal of clinical exploitation as targets for novel modes of symptomatic and course-altering therapy.

  14. Integrative analysis reveals clinical phenotypes and oncogenic potentials of long non-coding RNAs across 15 cancer types

    PubMed Central

    Piccolo, Stephen R.; Zhang, Xiao-Qin; Li, Jun-Hao; Zhou, Hui; Yang, Jian-Hua; Qu, Liang-Hu

    2016-01-01

    Long non-coding RNAs (lncRNAs) have been shown to contribute to tumorigenesis. However, surprisingly little is known about the comprehensive clinical and genomic characterization of lncRNAs across human cancer. In this study, we conducted comprehensive analyses of the expression profiles, clinical outcomes, and somatic copy number alteration (SCNA) profiles of lncRNAs in ~7000 clinical samples from 15 different cancer types. We identified significantly differentially expressed lncRNAs between tumor and normal tissues from each cancer. Notably, we characterized 47 lncRNAs that were extensively dysregulated in at least 10 cancer types, suggesting a conserved function in cancer development. We also analyzed the associations between lncRNA expression and patient survival, and identified sets of lncRNAs that possessed significant prognostic value in specific cancer types. Our combined analysis of SCNA data and expression data uncovered 116 dysregulated lncRNAs that are strikingly genomically altered across 15 cancer types, indicating their oncogenic potential. Our study may lay the groundwork for future functional studies of lncRNAs and help facilitate the discovery of novel clinical biomarkers. PMID:27147563

  15. Development and application of a ray-tracing code integrating with 3D equilibrium mapping in LHD ECH experiments

    NASA Astrophysics Data System (ADS)

    Tsujimura, T., Ii; Kubo, S.; Takahashi, H.; Makino, R.; Seki, R.; Yoshimura, Y.; Igami, H.; Shimozuma, T.; Ida, K.; Suzuki, C.; Emoto, M.; Yokoyama, M.; Kobayashi, T.; Moon, C.; Nagaoka, K.; Osakabe, M.; Kobayashi, S.; Ito, S.; Mizuno, Y.; Okada, K.; Ejiri, A.; Mutoh, T.

    2015-11-01

    The central electron temperature has successfully reached up to 7.5 keV in large helical device (LHD) plasmas with a central high-ion temperature of 5 keV and a central electron density of 1.3 × 10¹⁹ m⁻³. This result was obtained by heating with a newly installed 154 GHz gyrotron and also by optimising the injection geometry in electron cyclotron heating (ECH). The optimisation was carried out by using the ray-tracing code ‘LHDGauss’, which was upgraded to include the rapid post-processing three-dimensional (3D) equilibrium mapping obtained from experiments. For ray-tracing calculations, LHDGauss can automatically read the relevant data registered in the LHD database after a discharge, such as ECH injection settings (e.g. Gaussian beam parameters, target positions, polarisation and ECH power) and Thomson scattering diagnostic data along with the 3D equilibrium mapping data. The equilibrium maps of the electron density and temperature profiles are then extrapolated into the region outside the last closed flux surface. Mode purity, or the ratio between the ordinary mode and the extraordinary mode, is obtained by calculating the 1D full-wave equation along the direction of the rays from the antenna to the absorption target point. Using the virtual magnetic flux surfaces, the effects of the modelled density profiles and the magnetic shear at the peripheral region with a given polarisation are taken into account. Power deposition profiles calculated for each Thomson scattering measurement timing are registered in the LHD database. The adjustment of the injection settings for the desired deposition profile from the feedback provided on a shot-by-shot basis resulted in an effective experimental procedure.

  16. Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) verification and validation plan. version 1.

    SciTech Connect

    Bartlett, Roscoe Ainsworth; Arguello, Jose Guadalupe, Jr.; Urbina, Angel; Bouchard, Julie F.; Edwards, Harold Carter; Freeze, Geoffrey A.; Knupp, Patrick Michael; Wang, Yifeng; Schultz, Peter Andrew; Howard, Robert; McCornack, Marjorie Turner

    2011-01-01

    The objective of the U.S. Department of Energy Office of Nuclear Energy Advanced Modeling and Simulation Waste Integrated Performance and Safety Codes (NEAMS Waste IPSC) is to provide an integrated suite of computational modeling and simulation (M&S) capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive-waste storage facility or disposal repository. To meet this objective, NEAMS Waste IPSC M&S capabilities will be applied to challenging spatial domains, temporal domains, multiphysics couplings, and multiscale couplings. A strategic verification and validation (V&V) goal is to establish evidence-based metrics for the level of confidence in M&S codes and capabilities. Because it is economically impractical to apply the maximum V&V rigor to each and every M&S capability, M&S capabilities will be ranked for their impact on the performance assessments of various components of the repository systems. Those M&S capabilities with greater impact will require a greater level of confidence and a correspondingly greater investment in V&V. This report includes five major components: (1) a background summary of the NEAMS Waste IPSC to emphasize M&S challenges; (2) the conceptual foundation for verification, validation, and confidence assessment of NEAMS Waste IPSC M&S capabilities; (3) specifications for the planned verification, validation, and confidence-assessment practices; (4) specifications for the planned evidence information management system; and (5) a path forward for the incremental implementation of this V&V plan.

  17. A statistical framework to predict functional non-coding regions in the human genome through integrated analysis of annotation data.

    PubMed

    Lu, Qiongshi; Hu, Yiming; Sun, Jiehuan; Cheng, Yuwei; Cheung, Kei-Hoi; Zhao, Hongyu

    2015-05-27

    Identifying functional regions in the human genome is a major goal in human genetics. Great efforts have been made to functionally annotate the human genome either through computational predictions, such as genomic conservation, or high-throughput experiments, such as the ENCODE project. These efforts have resulted in a rich collection of functional annotation data of diverse types that need to be jointly analyzed for integrated interpretation and annotation. Here we present GenoCanyon, a whole-genome annotation method that performs unsupervised statistical learning using 22 computational and experimental annotations thereby inferring the functional potential of each position in the human genome. With GenoCanyon, we are able to predict many of the known functional regions. The ability of predicting functional regions as well as its generalizable statistical framework makes GenoCanyon a unique and powerful tool for whole-genome annotation. The GenoCanyon web server is available at http://genocanyon.med.yale.edu.
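
    The core statistical move described here (unsupervised learning over many annotation tracks, yielding a per-position functional potential) can be sketched with a simple two-component mixture model. The sketch below uses simulated features and a generic Gaussian mixture purely to illustrate the idea; it is not GenoCanyon's actual model, which is defined over 22 specific computational and experimental annotations.

      # Sketch of the general idea only (simulated data, generic Gaussian mixture):
      # learn two latent classes ("functional" vs. "background") from annotation
      # features without labels, then report a per-position posterior probability.
      # This is NOT GenoCanyon's statistical model; it only illustrates the approach.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(1)
      n_pos, n_annot = 5000, 6
      truly_functional = rng.random(n_pos) < 0.2                # hidden ground truth
      X = rng.normal(0.0, 1.0, size=(n_pos, n_annot))           # background annotation values
      X[truly_functional] += 1.5                                # functional sites score higher

      gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
      functional_comp = int(np.argmax(gmm.means_.mean(axis=1)))   # component with higher means
      score = gmm.predict_proba(X)[:, functional_comp]            # per-position functional potential

      print("mean score at truly functional sites:", round(score[truly_functional].mean(), 3))
      print("mean score at background sites      :", round(score[~truly_functional].mean(), 3))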

  18. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization,3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts 3. Development of a code generator for performance prediction 4. Automated partitioning 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  19. Space and Terrestrial Power System Integration Optimization Code BRMAPS for Gas Turbine Space Power Plants With Nuclear Reactor Heat Sources

    NASA Technical Reports Server (NTRS)

    Juhasz, Albert J.

    2007-01-01

    In view of the difficult times the US and global economies are experiencing today, funds for the development of advanced fission reactors nuclear power systems for space propulsion and planetary surface applications are currently not available. However, according to the Energy Policy Act of 2005 the U.S. needs to invest in developing fission reactor technology for ground based terrestrial power plants. Such plants would make a significant contribution toward drastic reduction of worldwide greenhouse gas emissions and associated global warming. To accomplish this goal the Next Generation Nuclear Plant Project (NGNP) has been established by DOE under the Generation IV Nuclear Systems Initiative. Idaho National Laboratory (INL) was designated as the lead in the development of VHTR (Very High Temperature Reactor) and HTGR (High Temperature Gas Reactor) technology to be integrated with MMW (multi-megawatt) helium gas turbine driven electric power AC generators. However, the advantages of transmitting power in high voltage DC form over large distances are also explored in the seminar lecture series. As an attractive alternate heat source the Liquid Fluoride Reactor (LFR), pioneered at ORNL (Oak Ridge National Laboratory) in the mid 1960's, would offer much higher energy yields than current nuclear plants by using an inherently safe energy conversion scheme based on the Thorium --> U233 fuel cycle and a fission process with a negative temperature coefficient of reactivity. The power plants are to be sized to meet electric power demand during peak periods and also for providing thermal energy for hydrogen (H2) production during "off peak" periods. This approach will both supply electric power by using environmentally clean nuclear heat which does not generate green house gases, and also provide a clean fuel H2 for the future, when, due to increased global demand and the decline in discovering new deposits, our supply of liquid fossil fuels will have been used up. This is

  20. High Efficiency Integrated Space Conditioning, Water Heating and Air Distribution System for HUD-Code Manufactured Housing

    SciTech Connect

    Henry DeLima; Joe Akin; Joseph Pietsch

    2008-09-14

    Recognizing the need for new space conditioning and water heating systems for manufactured housing, DeLima Associates assembled a team to develop a space conditioning system that would enhance comfort conditions while also reducing energy usage at the systems level. The product, Comboflair® was defined as a result of a needs analysis of project sponsors and industry stakeholders. An integrated system would be developed that would combine a packaged airconditioning system with a small-duct, high-velocity air distribution system. In its basic configuration, the source for space heating would be a gas water heater. The complete system would be installed at the manufactured home factory and would require no site installation work at the homesite as is now required with conventional split-system air conditioners. Several prototypes were fabricated and tested before a field test unit was completed in October 2005. The Comboflair® system, complete with ductwork, was installed in a 1,984 square feet, double-wide manufactured home built by Palm Harbor Homes in Austin, TX. After the home was transported and installed at a Palm Harbor dealer lot in Austin, TX, a data acquisition system was installed for remote data collection. Over 60 parameters were continuously monitored and measurements were transmitted to a remote site every 15 minutes for performance analysis. The Comboflair® system was field tested from February 2006 until April 2007. The cooling system performed in accordance with the design specifications. The heating system initially could not provide the needed capacity at peak heating conditions until the water heater was replaced with a higher capacity standard water heater. All system comfort goals were then met. As a result of field testing, we have identified improvements to be made to specific components for incorporation into production models. The Comboflair® system will be manufactured by Unico, Inc. at their new production facility in St. Louis

  1. Ethical coding.

    PubMed

    Resnik, Barry I

    2009-01-01

    It is ethical, legal, and proper for a dermatologist to maximize income through proper coding of patient encounters and procedures. The overzealous physician can misinterpret reimbursement requirements or receive bad advice from other physicians and cross the line from aggressive coding to coding fraud. Several of the more common problem areas are discussed.

  2. Sharing code.

    PubMed

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing.

  3. Integrated Analysis of the Roles of Long Noncoding RNA and Coding RNA Expression in Sheep (Ovis aries) Skin during Initiation of Secondary Hair Follicle

    PubMed Central

    Liu, Jianbin; Guo, Jian; Feng, Ruilin; Niu, Chune; Sun, Xiaoping; Yang, Bohui

    2016-01-01

    Initiation of the hair follicle (HF) is the first and most important stage of HF morphogenesis. However, the precise molecular mechanism of HF initiation remains elusive. Previous studies have paid more attention to the function of genes, while the roles of non-coding RNAs (such as long noncoding RNAs and microRNAs) have not been described. Therefore, the roles of long noncoding RNA (lncRNA) and coding RNA in sheep skin during the initiation of secondary HF were integrated and analyzed using strand-specific RNA sequencing (ssRNA-seq). A total of 192 significantly differentially expressed genes were detected between stage 0 and stage 1 of HF morphogenesis during HF initiation, including 67 up-regulated and 125 down-regulated genes. Among the Wnt, Shh, Notch and BMP signaling pathways, only Wnt2 and FGF20 were significantly differentially expressed. Further expression profiling of lncRNAs identified 884 novel lncRNAs in sheep skin, of which 15 were significantly differentially expressed (6 up-regulated and 9 down-regulated). Among the differentially expressed genes and lncRNAs, the lncRNA XLOC002437 and its potential target gene COL6A6 were both significantly down-regulated in stage 1. Furthermore, RNAhybrid analysis suggested that XLOC005698 may act as a competing endogenous RNA "sponge" for oar-miR-3955-5p activity. Gene Ontology and KEGG pathway analyses indicated that the peroxisome proliferator-activated receptor (PPAR) pathway was significantly enriched (corrected P-value < 0.05), indicating that the PPAR pathway is likely to play significant roles during the initiation of secondary HF. The results suggest that the key differentially expressed genes and lncRNAs may be considered potential candidate genes for further study of the molecular mechanisms of HF initiation, and may also have value for understanding human hair disorders. PMID:27276011

  4. Embedded foveation image coding.

    PubMed

    Wang, Z; Bovik, A C

    2001-01-01

    The human visual system (HVS) is highly space-variant in sampling, coding, processing, and understanding. The spatial resolution of the HVS is highest around the point of fixation (foveation point) and decreases rapidly with increasing eccentricity. By taking advantage of this fact, it is possible to remove considerable high-frequency information redundancy from the peripheral regions and still reconstruct a perceptually good quality image. Great success has been obtained previously by a class of embedded wavelet image coding algorithms, such as the embedded zerotree wavelet (EZW) and the set partitioning in hierarchical trees (SPIHT) algorithms. Embedded wavelet coding not only provides very good compression performance, but also has the property that the bitstream can be truncated at any point and still be decoded to recreate a reasonably good quality image. In this paper, we propose an embedded foveation image coding (EFIC) algorithm, which orders the encoded bitstream to optimize foveated visual quality at arbitrary bit-rates. A foveation-based image quality metric, namely, foveated wavelet image quality index (FWQI), plays an important role in the EFIC system. We also developed a modified SPIHT algorithm to improve the coding efficiency. Experiments show that EFIC integrates foveation filtering with foveated image coding and demonstrates very good coding performance and scalability in terms of foveated image quality measurement.
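
    The space-variant resolution idea behind foveated coding can be illustrated with a simple eccentricity-dependent blur: pixels near the fixation point keep full resolution, while peripheral pixels are smoothed before (or during) encoding. The sketch below is a generic foveation filter under that assumption; it is not the EFIC algorithm or the FWQI metric from the paper.

      # Generic foveation-filter sketch (not EFIC/FWQI): blend a sharp image with a
      # blurred copy according to distance from the fixation point, mimicking the
      # eccentricity-dependent resolution of the human visual system.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def foveate(image, fix_row, fix_col, max_sigma=8.0, falloff=3.0):
          rows, cols = image.shape
          r, c = np.mgrid[0:rows, 0:cols]
          ecc = np.hypot(r - fix_row, c - fix_col) / np.hypot(rows, cols)  # 0 at fixation
          blurred = gaussian_filter(image.astype(float), sigma=max_sigma)
          weight = np.clip(ecc * falloff, 0.0, 1.0)   # crude resolution fall-off with eccentricity
          return (1.0 - weight) * image + weight * blurred

      # Example on a synthetic image, fixation at the centre.
      img = np.random.default_rng(0).random((256, 256))
      print(foveate(img, 128, 128).shape)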

  5. Integration

    ERIC Educational Resources Information Center

    Kalyn, Brenda

    2006-01-01

    Integrated learning is an exciting adventure for both teachers and students. It is not uncommon to observe the integration of academic subjects such as math, science, and language arts. However, educators need to recognize that movement experiences in physical education also can be linked to academic curricula and, may even lead the…

  6. Sharing code

    PubMed Central

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing. PMID:25165519

  7. ITS version 5.0: the Integrated TIGER Series of coupled electron/photon Monte Carlo transport codes with CAD geometry.

    SciTech Connect

    Franke, Brian Claude; Kensek, Ronald Patrick; Laub, Thomas William

    2005-09-01

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  8. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal being corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. Digital transmission, on the other hand, is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of the digital link becomes essentially independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service-provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the
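
    As a concrete instance of the waveform-coding branch of speech coding surveyed here, µ-law companding (the classic logarithmic compression used in 64 kbit/s telephony PCM) can be written in a few lines. This is a textbook illustration added for context, not an algorithm taken from the report.

      # Textbook mu-law companding sketch: logarithmically compress sample amplitudes,
      # quantize uniformly, then expand at the far end (classic waveform speech coding).
      import numpy as np

      MU = 255.0

      def mu_law_encode(x, n_bits=8):
          """x in [-1, 1] -> integer codes after logarithmic companding."""
          y = np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)
          levels = 2 ** n_bits - 1
          return np.round((y + 1.0) / 2.0 * levels).astype(int)

      def mu_law_decode(codes, n_bits=8):
          levels = 2 ** n_bits - 1
          y = codes / levels * 2.0 - 1.0
          return np.sign(y) * np.expm1(np.abs(y) * np.log1p(MU)) / MU

      # Round-trip a short synthetic "speech" burst and report the distortion.
      t = np.linspace(0.0, 0.02, 160)                       # 20 ms at 8 kHz
      speech = 0.5 * np.sin(2 * np.pi * 300 * t) * np.exp(-20 * t)
      rec = mu_law_decode(mu_law_encode(speech))
      print("max reconstruction error:", float(np.max(np.abs(rec - speech))))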

  9. Using Coding Apps to Support Literacy Instruction and Develop Coding Literacy

    ERIC Educational Resources Information Center

    Hutchison, Amy; Nadolny, Larysa; Estapa, Anne

    2016-01-01

    In this article the authors present the concept of Coding Literacy and describe the ways in which coding apps can support the development of Coding Literacy and disciplinary and digital literacy skills. Through detailed examples, we describe how coding apps can be integrated into literacy instruction to support learning of the Common Core English…

  10. Astrophysics Source Code Library Enhancements

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.

    2015-09-01

    The Astrophysics Source Code Library (ASCL) is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.

  11. Comparison of measured responses in two spectrally-sensitive x-ray detectors to predictions obtained using the ITS (Integrated Tiger Series) radiation transport code

    SciTech Connect

    Carlson, G.A.; Beutler, D.E.; Seager, K.D.; Knott, D.P.

    1988-01-01

    Responses of a Ge detector and a filtered TLD array detector have been measured at a steady-state bremsstrahlung source (the Pelletron), at endpoint energies from 150 to 900 keV. Predictions of detector response using Monte Carlo ITS codes are found to be in excellent agreement with measured response for both detectors. These results extend the range of validity of the ITS codes. With calibration provided by these experiments and by ITS predictions, dose-depth data from the TLD arrays can be used to estimate flash x-ray source endpoint energies.

  12. Generic reactive transport codes as flexible tools to integrate soil organic matter degradation models with water, transport and geochemistry in soils

    NASA Astrophysics Data System (ADS)

    Jacques, Diederik; Gérard, Fréderic; Mayer, Uli; Simunek, Jirka; Leterme, Bertrand

    2016-04-01

    A large number of organic matter degradation, CO2 transport and dissolved organic matter models have been developed during the last decades. However, organic matter degradation models are in many cases strictly hard-coded in terms of organic pools, degradation kinetics and dependency on environmental variables. The scientific input of the model user is typically limited to the adjustment of input parameters. In addition, coupling with geochemical soil processes, including aqueous speciation, pH-dependent sorption and colloid-facilitated transport, is not incorporated in many of these models, strongly limiting the scope of their application. Furthermore, the most comprehensive organic matter degradation models are combined with simplified representations of flow and transport processes in the soil system. We illustrate the capability of generic reactive transport codes to overcome these shortcomings. The formulations of reactive transport codes include a physics-based continuum representation of flow and transport processes, while biogeochemical reactions can be described as equilibrium processes constrained by thermodynamic principles and/or kinetic reaction networks. The flexibility of these types of codes allows for straightforward extension of reaction networks, permits the inclusion of new model components (e.g. organic matter pools, rate equations, parameter dependency on environmental conditions) and in this way facilitates an application-tailored implementation of organic matter degradation models and related processes. A numerical benchmark involving two reactive transport codes (HPx and MIN3P) demonstrates how the process-based simulation of transient variably saturated water flow (Richards equation), solute transport (advection-dispersion equation), heat transfer and diffusion in the gas phase can be combined with a flexible implementation of a soil organic matter degradation model. The benchmark includes the production of leachable organic matter
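
    The kind of flexibility argued for above (user-defined organic-matter pools with their own kinetics and environmental dependencies, instead of hard-coded ones) can be illustrated with a small, freely extensible reaction network. The pools, rate constants and temperature factor below are illustrative assumptions; this is a conceptual sketch, not an HPx or MIN3P input.

      # Conceptual sketch of a user-definable soil-organic-matter reaction network:
      # two pools decay with first-order kinetics, rates are modulated by a Q10
      # temperature factor, and CO2 accumulates as a product. Values are illustrative.
      import numpy as np
      from scipy.integrate import solve_ivp

      def temperature_factor(t_celsius, q10=2.0, t_ref=20.0):
          return q10 ** ((t_celsius - t_ref) / 10.0)

      def som_network(t, y, k_fast=0.05, k_slow=0.002, transfer=0.3, t_celsius=15.0):
          fast, slow, co2 = y
          f = temperature_factor(t_celsius)
          d_fast = -k_fast * f * fast
          d_slow = transfer * k_fast * f * fast - k_slow * f * slow   # part of the fast pool stabilises
          d_co2 = (1.0 - transfer) * k_fast * f * fast + k_slow * f * slow
          return [d_fast, d_slow, d_co2]

      sol = solve_ivp(som_network, (0.0, 365.0), [10.0, 50.0, 0.0])   # one year, daily rate units
      print("pools after one year [fast, slow, CO2]:", np.round(sol.y[:, -1], 2))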

  13. Code Green.

    ERIC Educational Resources Information Center

    McMinn, John

    2002-01-01

    Assesses the integrated approach to green design in the new Computer Science Building at Toronto's York University. The building design fulfills the university's demand to combine an energy efficient design with sustainability. Floor and site plans are included. (GR)

  14. National Combustion Code: Parallel Performance

    NASA Technical Reports Server (NTRS)

    Babrauckas, Theresa

    2001-01-01

    This report discusses the National Combustion Code (NCC). The NCC is an integrated system of codes for the design and analysis of combustion systems. The advanced features of the NCC meet designers' requirements for model accuracy and turn-around time. The fundamental features at the inception of the NCC were parallel processing and unstructured mesh. The design and performance of the NCC are discussed.

  15. A general multiblock Euler code for propulsion integration. Volume 2: User guide for BCON, pre-processor for grid generation and GMBE

    NASA Technical Reports Server (NTRS)

    Su, T. Y.; Appleby, R. A.; Chen, H. C.

    1991-01-01

    The BCON is a menu-driven graphics interface program whose input consists of strings or arrays of points generated from a computer aided design (CAD) tool or any other surface geometry source. The user needs to design the block topology and prepare the surface geometry definition and surface grids separately. The BCON generates input files that contain the block definitions and the block relationships required for generating a multiblock volume grid with the EAGLE grid generation package. The BCON also generates the block boundary conditions file which is used along with the block relationship file as input for the general multiblock Euler (GMBE) code (GMBE, volumes 1 and 3).

  16. Integration of a code for aeroelastic design of conventional and composite wings into ACSYNT, an aircraft synthesis program. [wing aeroelastic design (WADES)

    NASA Technical Reports Server (NTRS)

    Mullen, J., Jr.

    1976-01-01

    A comparison of program estimates of wing weight, material distribution, structural loads and elastic deformations with actual Northrop F-5A/B data is presented. Correlation coefficients obtained using data from a number of existing aircraft were computed for use in vehicle synthesis to estimate wing weights. The modifications necessary to adapt the WADES code for use in the ACSYNT program are described. Basic program flow and overlay structure are outlined. An example of the convergence of the procedure in estimating wing weights during the synthesis of a vehicle to satisfy F-5 mission requirements is given. A description of inputs required for use of the WADES program is included.

  17. Study Drugs and Academic Integrity: The Role of Beliefs about an Academic Honor Code in the Prediction of Nonmedical Prescription Drug Use for Academic Enhancement

    ERIC Educational Resources Information Center

    Reisinger, Kelsy B.; Rutledge, Patricia C.; Conklin, Sarah M.

    2016-01-01

    The role of beliefs about academic integrity in college students' decisions to use nonmedical prescription drugs (NMPDs) in academic settings was examined. In Spring 2012 the authors obtained survey data from 645 participants at a small, undergraduate, private liberal arts institution in the Northeastern United States. A broadcast e-mail message…

  18. Video coding with dynamic background

    NASA Astrophysics Data System (ADS)

    Paul, Manoranjan; Lin, Weisi; Lau, Chiew Tong; Lee, Bu-Sung

    2013-12-01

    Motion estimation (ME) and motion compensation (MC) using variable block size, sub-pixel search, and multiple reference frames (MRFs) are the major reasons for the improved coding performance of the H.264 video coding standard over other contemporary coding standards. The concept of MRFs is suitable for repetitive motion, uncovered background, non-integer pixel displacement, lighting change, etc. The requirement for index codes of the reference frames, the computational time of ME & MC, and the memory buffer for coded frames limit the number of reference frames used in practical applications. In typical video sequences, the previous frame is used as the reference frame in 68-92% of cases. In this article, we propose a new video coding method using a reference frame [i.e., the most common frame in scene (McFIS)] generated by dynamic background modeling. McFIS is more effective in terms of rate-distortion and computational time performance compared to MRF techniques. It also has an inherent capability of scene change detection (SCD) for adaptive group-of-pictures (GOP) size determination. As a result, we integrate SCD (for GOP determination) with reference frame generation. The experimental results show that the proposed coding scheme outperforms H.264 video coding with five reference frames and two relevant state-of-the-art algorithms by 0.5-2.0 dB with less computational time.
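
    The McFIS idea (a single synthesized reference frame built from a background model, instead of a stack of previously decoded frames) can be conveyed with the simplest possible background model: a running average over frames. The sketch below is that crude stand-in; the paper's dynamic background modelling is considerably more elaborate.

      # Minimal sketch: build one background reference frame as a running average of
      # past frames. A crude stand-in for the dynamic background modelling (McFIS)
      # described above, which uses a more sophisticated per-pixel model.
      import numpy as np

      def build_background(frames, alpha=0.05):
          """frames: iterable of equally sized 2-D arrays; returns the modelled background."""
          background = None
          for frame in frames:
              f = frame.astype(float)
              background = f if background is None else (1 - alpha) * background + alpha * f
          return background

      # Synthetic example: a static scene plus a small moving block and sensor noise.
      rng = np.random.default_rng(0)
      scene = rng.integers(0, 256, size=(64, 64)).astype(float)
      frames = []
      for k in range(100):
          frame = scene + rng.normal(0.0, 2.0, scene.shape)
          frame[10:18, (k % 50):(k % 50) + 8] = 255.0          # moving foreground object
          frames.append(frame)

      bg = build_background(frames)
      print("mean |background - true scene|:", round(float(np.mean(np.abs(bg - scene))), 2))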

  19. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  20. MELCOR computer code manuals

    SciTech Connect

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  1. Integrating the intrinsic conformational preferences of non-coded α-amino acids modified at the peptide bond into the NCAD database

    PubMed Central

    Revilla-López, Guillem; Rodríguez-Ropero, Francisco; Curcó, David; Torras, Juan; Calaza, M. Isabel; Zanuy, David; Jiménez, Ana I.; Cativiela, Carlos; Nussinov, Ruth; Alemán, Carlos

    2011-01-01

    Recently, we reported a database (NCAD, Non-Coded Amino acids Database; http://recerca.upc.edu/imem/index.htm) that was built to compile information about the intrinsic conformational preferences of non-proteinogenic residues determined by quantum mechanical calculations, as well as bibliographic information about their synthesis, physical and spectroscopic characterization, the experimentally-established conformational propensities, and applications (J. Phys. Chem. B 2010, 114, 7413). The database initially contained the information available for α-tetrasubstituted α-amino acids. In this work, we extend NCAD to three families of compounds, which can be used to engineer peptides and proteins incorporating modifications at the –NHCO– peptide bond. Such families are: N-substituted α-amino acids, thio-α-amino acids, and diamines and diacids used to build retropeptides. The conformational preferences of these compounds have been analyzed and described based on the information captured in the database. In addition, we provide an example of the utility of the database and of the compounds it compiles in protein and peptide engineering. Specifically, the symmetry of a sequence engineered to stabilize the 310-helix with respect to the α-helix has been broken without perturbing significantly the secondary structure through targeted replacements using the information contained in the database. PMID:21491493

  2. The moving mesh code SHADOWFAX

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, B.; De Rijcke, S.

    2016-07-01

    We introduce the moving mesh code SHADOWFAX, which can be used to evolve a mixture of gas, subject to the laws of hydrodynamics and gravity, and any collisionless fluid only subject to gravity, such as cold dark matter or stars. The code is written in C++ and its source code is made available to the scientific community under the GNU Affero General Public Licence. We outline the algorithm and the design of our implementation, and demonstrate its validity through the results of a set of basic test problems, which are also part of the public version. We also compare SHADOWFAX with a number of other publicly available codes using different hydrodynamical integration schemes, illustrating the advantages and disadvantages of the moving mesh technique.

  3. ETR/ITER systems code

    SciTech Connect

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L.

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.

  4. Implementation of a Variable Stepsize Variable Formula Method in the Time-Integration Part of a Code for Treatment of Long-Range Transport of Air Pollutants

    NASA Astrophysics Data System (ADS)

    Zlatev, Zahari; Berkowicz, Ruwim; Prahm, Lars P.

    1984-08-01

    A mathematical model consisting of two partial differential equations is used to study the long-range transport of sulphur dioxide and sulphate over Europe. The discretization of the first-order space derivatives (the advection terms) is carried out by a pseudospectral (Fourier) algorithm. A special technique is applied in the discretization of the second-order space derivatives (the diffusion terms). Two large systems of ordinary differential equations are solved at each time-step. It is shown that these systems can be treated efficiently by a variable stepsize variable formula method (VSVFM) based on the use of predictor-corrector schemes. The stepsize selection strategy and the formula selection strategy are discussed in detail. An attempt to carry out both accuracy control and stability control is made at each time-step. The great efficiency of the VSVFM implemented in our software, as well as the reliability of the results, is illustrated by numerical experiments in which real meteorological data (for 1979) at the grid-points of a space domain covering the whole of Europe were used. The main ideas implemented in the time-integration part might be applied in many other situations where the systems of ordinary differential equations arising after the space discretization are only moderately stiff (so that the stability requirements are dominant over the accuracy requirements on a large part of the time-interval, but the use of implicit time-integration algorithms that require solving systems of algebraic equations at each time-step is not justified). As an illustration, it should be mentioned that such an application has been carried out in connection with models describing the long-range transport of nitrogen pollutants over Europe.
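
    The machinery described above (a predictor-corrector pair whose disagreement drives both the step-size and the formula selection) can be shown in miniature with an explicit Euler predictor, a trapezoidal corrector and a simple step-size controller. This sketch only illustrates the principle of a variable-stepsize method; it is not the authors' production VSVFM.

      # Miniature variable-stepsize predictor-corrector: Euler predictor, trapezoidal
      # corrector, with the predictor-corrector gap used as the local error estimate
      # that accepts/rejects steps and rescales the step size.
      import numpy as np

      def integrate(f, y0, t0, t_end, tol=1e-5, h=1e-2):
          t, y = t0, np.asarray(y0, dtype=float)
          while t < t_end:
              h = min(h, t_end - t)
              pred = y + h * f(t, y)                               # predictor (explicit Euler)
              corr = y + 0.5 * h * (f(t, y) + f(t + h, pred))      # corrector (trapezoidal rule)
              err = float(np.max(np.abs(corr - pred)))             # local error estimate
              if err <= tol or h < 1e-12:
                  t, y = t + h, corr                               # accept the step
              h *= min(2.0, max(0.2, 0.9 * (tol / max(err, 1e-16)) ** 0.5))  # adapt step size
          return y

      # Example: moderately stiff linear problem y' = -50 (y - cos t), y(0) = 0.
      print("y(1) =", integrate(lambda t, y: -50.0 * (y - np.cos(t)), [0.0], 0.0, 1.0))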

  5. Diagnostic Coding for Epilepsy.

    PubMed

    Williams, Korwyn; Nuwer, Marc R; Buchhalter, Jeffrey R

    2016-02-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue.

  6. Model Children's Code.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  7. Phylogeny of genetic codes and punctuation codes within genetic codes.

    PubMed

    Seligmann, Hervé

    2015-03-01

    Punctuation codons (starts, stops) delimit genes, reflect translation apparatus properties. Most codon reassignments involve punctuation. Here two complementary approaches classify natural genetic codes: (A) properties of amino acids assigned to codons (classical phylogeny), coding stops as X (A1, antitermination/suppressor tRNAs insert unknown residues), or as gaps (A2, no translation, classical stop); and (B) considering only punctuation status (start, stop and other codons coded as -1, 0 and 1 (B1); 0, -1 and 1 (B2, reflects ribosomal translational dynamics); and 1, -1, and 0 (B3, starts/stops as opposites)). All methods separate most mitochondrial codes from most nuclear codes; Gracilibacteria consistently cluster with metazoan mitochondria; mitochondria co-hosted with chloroplasts cluster with nuclear codes. Method A1 clusters the euplotid nuclear code with metazoan mitochondria; A2 separates euplotids from mitochondria. Firmicute bacteria Mycoplasma/Spiroplasma and Protozoan (and lower metazoan) mitochondria share codon-amino acid assignments. A1 clusters them with mitochondria, they cluster with the standard genetic code under A2: constraints on amino acid ambiguity versus punctuation-signaling produced the mitochondrial versus bacterial versions of this genetic code. Punctuation analysis B2 converges best with classical phylogenetic analyses, stressing the need for a unified theory of genetic code punctuation accounting for ribosomal constraints.

  8. GalPot: Galaxy potential code

    NASA Astrophysics Data System (ADS)

    McMillan, Paul J.

    2016-11-01

    GalPot finds the gravitational potential associated with axisymmetric density profiles. The package includes code that performs transformations between commonly used coordinate systems for both positions and velocities (the class OmniCoords), and that integrates orbits in the potentials. GalPot is a stand-alone version of Walter Dehnen's GalaxyPotential C++ code taken from the falcON code in the NEMO Stellar Dynamics Toolbox (ascl:1010.051).
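
    To give a feel for what integrating an orbit in an axisymmetric potential involves, the generic Python sketch below integrates a test orbit in a Miyamoto-Nagai disc with SciPy. It does not use GalPot or reproduce its API; the potential parameters and initial conditions are arbitrary assumptions chosen only for illustration.

        import numpy as np
        from scipy.integrate import solve_ivp

        def miyamoto_nagai_accel(pos, GM=1.0, a=3.0, b=0.3):
            # Acceleration from a Miyamoto-Nagai disc, a standard axisymmetric profile.
            x, y, z = pos
            zb = np.sqrt(z * z + b * b)
            denom = (x * x + y * y + (a + zb) ** 2) ** 1.5
            return -GM * np.array([x, y, z * (a + zb) / zb]) / denom

        def rhs(t, w):
            # w = (x, y, z, vx, vy, vz)
            return np.concatenate([w[3:], miyamoto_nagai_accel(w[:3])])

        w0 = [8.0, 0.0, 0.1, 0.0, 0.3, 0.0]      # near-circular orbit in the disc plane
        sol = solve_ivp(rhs, (0.0, 500.0), w0, rtol=1e-9, max_step=1.0)
        radius = np.hypot(sol.y[0], sol.y[1])
        print(f"radial excursion: {radius.min():.2f} to {radius.max():.2f}")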

  9. Compact, Flexible Telemetry-Coding Circuits

    NASA Technical Reports Server (NTRS)

    Katz, Richard B.; Tooley, Matthew; Settles, Beverly

    1993-01-01

    Circuits encoding binary telemetry data designed to synthesize any number of selectable codes. Designed for use aboard spacecraft, with features also making them attractive for terrestrial applications: Simple and compact relative to prior coding circuits, built with commercial integrated circuits, and incorporate protective redundancy. Output distortions minimized, and spurious attenuated and/or abbreviated output pulses eliminated.

  10. CFD Code Development for Combustor Flows

    NASA Technical Reports Server (NTRS)

    Norris, Andrew

    2003-01-01

    During the lifetime of this grant, work has been performed in the areas of model development, code development, code validation and code application. For model development, this has included the PDF combustion module, chemical kinetics based on thermodynamics, neural network storage of chemical kinetics, ILDM chemical kinetics and assumed PDF work. Many of these models were then implemented in the code, and in addition many improvements were made to the code, including the addition of new chemistry integrators, property evaluation schemes, new chemistry models and turbulence-chemistry interaction methodology. Validation of all new models and code improvements was also performed, and the code was applied to the ZCET program and to the NPSS GEW combustor program. Several important items remain under development, including the NOx post processing, assumed PDF model development and chemical kinetic development. It is expected that this work will continue under the new grant.

  11. Spaceflight Validation of Hzetrn Code

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Shinn, J. L.; Singleterry, R. C.; Badavi, F. F.; Badhwar, G. D.; Reitz, G.; Beaujean, R.; Cucinotta, F. A.

    1999-01-01

    HZETRN is being developed as a fast deterministic radiation transport code applicable to neutrons, protons, and multiply charged ions in the space environment. It was recently applied to 50 hours of IMP8 data measured during the August 4, 1972 solar event to map the hourly exposures within the human body under several shield configurations. This calculation required only 18 hours on a VAX 4000 machine. A similar calculation using the Monte Carlo method would have required two years of dedicated computer time. The code has been benchmarked against well documented and tested Monte Carlo proton transport codes with good success. The code will allow important trade studies to be made with relative ease due to the computational speed and will be useful in assessing design alternatives in an integrated system software environment. Since there are no well tested Monte Carlo codes for HZE particles, we have been engaged in flight validation of the HZETRN results. To date we have made comparison with TEPC, CR-39, charge particle telescopes, and Bonner spheres. This broad range of detectors allows us to test a number of functions related to differing physical processes which add to the complicated radiation fields within a spacecraft or the human body, which functions can be calculated by the HZETRN code system. In the present report we will review these results.

  12. Rotating-Pump Design Code

    NASA Technical Reports Server (NTRS)

    Walker, James F.; Chen, Shu-Cheng; Scheer, Dean D.

    2006-01-01

    Pump Design (PUMPDES) is a computer program for designing a rotating pump for liquid hydrogen, liquid oxygen, liquid nitrogen, water, methane, or ethane. Using realistic properties of these fluids provided by another program called GASPAK, this code performs a station-by-station, mean-line analysis along the pump flow path, obtaining thermodynamic properties of the pumped fluid at each station and evaluating hydraulic losses along the flow path. The variables at each station are obtained under constraints that are consistent with the underlying physical principles. The code evaluates the performance of each stage and the overall pump. In addition, by judiciously choosing the givens and the unknowns, the code can perform a geometric inverse design function: that is, it can compute a pump geometry that yields the closest approximation of a given design point. The code contains two major parts: one for an axial-rotor/inducer and one for a multistage centrifugal pump. The inducer and the centrifugal pump are functionally integrated. The code can be used in designing and/or evaluating the inducer/centrifugal-pump combination or the centrifugal pump alone. The code is written in standard Fortran 77.

  13. Electromagnetic particle simulation codes

    NASA Technical Reports Server (NTRS)

    Pritchett, P. L.

    1985-01-01

    Electromagnetic particle simulations solve the full set of Maxwell's equations. They thus include the effects of self-consistent electric and magnetic fields, magnetic induction, and electromagnetic radiation. The algorithms for an electromagnetic code which works directly with the electric and magnetic fields are described. The fields and current are separated into transverse and longitudinal components. The transverse E and B fields are integrated in time using a leapfrog scheme applied to the Fourier components. The particle pushing is performed via the relativistic Lorentz force equation for the particle momentum. As an example, simulation results are presented for the electron cyclotron maser instability which illustrate the importance of relativistic effects on the wave-particle resonance condition and on wave dispersion.
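
    The relativistic particle push mentioned in the abstract can be sketched compactly. The Python fragment below is an illustrative Boris-style update of the relativistic momentum under the Lorentz force, one standard way to advance particles in electromagnetic codes; it is not taken from the code described above, and the normalized units and field values are assumptions.

        import numpy as np

        def relativistic_push(p, E, B, q=-1.0, m=1.0, c=1.0, dt=0.01):
            # Half electric kick, magnetic rotation, half electric kick (Boris scheme).
            p_minus = p + 0.5 * q * dt * E
            gamma = np.sqrt(1.0 + np.dot(p_minus, p_minus) / (m * c) ** 2)
            t_vec = 0.5 * q * dt * B / (gamma * m)
            s_vec = 2.0 * t_vec / (1.0 + np.dot(t_vec, t_vec))
            p_prime = p_minus + np.cross(p_minus, t_vec)
            p_plus = p_minus + np.cross(p_prime, s_vec)
            return p_plus + 0.5 * q * dt * E

        # Example: an electron gyrating in a uniform magnetic field
        p = np.array([0.5, 0.0, 0.0])
        B = np.array([0.0, 0.0, 1.0])
        E = np.zeros(3)
        for _ in range(1000):
            p = relativistic_push(p, E, B)
        print("momentum magnitude after 1000 steps:", np.linalg.norm(p))  # stays 0.5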

  14. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate codes' (ARA). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, thus belief propagation can be used for iterative decoding of ARA codes on a graph. The encoder for this class can be viewed as a precoded Repeat Accumulate (RA) code or as a precoded Irregular Repeat Accumulate (IRA) code, where simply an accumulator is chosen as a precoder. Thus ARA codes have a simple and very fast encoder structure when representing LDPC codes. Based on density evolution for LDPC codes through some examples for ARA codes, we show that for maximum variable node degree 5 a minimum bit SNR as low as 0.08 dB from channel capacity for rate 1/2 can be achieved as the block size goes to infinity. Thus based on fixed low maximum variable node degree, its threshold outperforms not only the RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators, any desired high-rate codes close to code rate 1 can be obtained with thresholds that stay close to the channel capacity thresholds uniformly. Iterative decoding simulation results are provided. The ARA codes also have projected graph or protograph representation that allows for high speed decoder implementation.
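
    The encoder structure described above (an accumulator used as a precoder ahead of a repeat-interleave-accumulate chain) can be shown with a toy sketch. The Python fragment below is schematic only: it uses a random interleaver and no puncturing, whereas practical ARA and protograph designs rely on carefully constructed interleavers and puncturing patterns.

        import numpy as np

        def accumulate(bits):
            # Rate-1 accumulator, i.e. a running XOR (1/(1+D) differential encoder).
            return np.bitwise_xor.accumulate(np.asarray(bits, dtype=np.uint8))

        def ara_encode(info_bits, repeat=3, rng=np.random.default_rng(0)):
            # Precode with an accumulator, repeat, interleave, accumulate again.
            precoded = accumulate(info_bits)
            repeated = np.repeat(precoded, repeat)
            interleaved = repeated[rng.permutation(repeated.size)]
            return accumulate(interleaved)

        codeword = ara_encode([1, 0, 1, 1, 0, 0, 1, 0])
        print(codeword)   # toy rate-1/3 codeword of length 24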

  15. Concatenated Coding Using Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Thompson, Michael W.

    1997-01-01

    In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted toward developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and RS coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code we see that TCM based concatenated coding results in roughly 10-50% bandwidth expansion compared to 70-150% expansion for similar concatenated schemes which use convolutional codes. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.

  16. Coset Codes Viewed as Terminated Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1996-01-01

    In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C(sub 1), of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C(sub 1), while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

  17. Interval coding. II. Dendrite-dependent mechanisms.

    PubMed

    Doiron, Brent; Oswald, Anne-Marie M; Maler, Leonard

    2007-04-01

    The rich temporal structure of neural spike trains provides multiple dimensions to code dynamic stimuli. Popular examples are spike trains from sensory cells where bursts and isolated spikes can serve distinct coding roles. In contrast to analyses of neural coding, the cellular mechanics of burst mechanisms are typically elucidated from the neural response to static input. Bridging the mechanics of bursting with coding of dynamic stimuli is an important step in establishing theories of neural coding. Electrosensory lateral line lobe (ELL) pyramidal neurons respond to static inputs with a complex dendrite-dependent burst mechanism. Here we show that in response to dynamic broadband stimuli, these bursts lack some of the electrophysiological characteristics observed in response to static inputs. A simple leaky integrate-and-fire (LIF)-style model with a dendrite-dependent depolarizing afterpotential (DAP) is sufficient to match both the output statistics and coding performance of experimental spike trains. We use this model to investigate a simplification of interval coding where the burst interspike interval (ISI) codes for the scale of a canonical upstroke rather than a multidimensional stimulus feature. Using this stimulus reduction, we compute a quantization of the burst ISIs and the upstroke scale to show that the mutual information rate of the interval code is maximized at a moderate DAP amplitude. The combination of a reduced description of ELL pyramidal cell bursting and a simplification of the interval code increases the generality of ELL burst codes to other sensory modalities.
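
    A minimal sketch of an LIF-plus-DAP model of the kind referred to above is given below in Python. All parameter names and values are illustrative assumptions rather than quantities fitted to ELL pyramidal cell data, and the model is far simpler than the one analyzed in the paper: each spike simply schedules a delayed, exponentially decaying depolarizing current.

        import numpy as np

        def lif_dap(stim, dt=0.1, tau_m=10.0, v_th=1.0, v_reset=0.0,
                    dap_amp=0.5, dap_delay=1.0, dap_tau=2.0):
            # Leaky integrate-and-fire neuron with a delayed depolarizing afterpotential.
            n = len(stim)
            v = np.zeros(n)
            dap = np.zeros(n)                         # scheduled DAP current
            spike_times = []
            for i in range(1, n):
                v[i] = v[i - 1] + dt * (-v[i - 1] / tau_m + stim[i] + dap[i])
                if v[i] >= v_th:
                    spike_times.append(i * dt)
                    v[i] = v_reset
                    start = i + int(dap_delay / dt)   # DAP begins after a short delay
                    if start < n:
                        t_rel = np.arange(n - start) * dt
                        dap[start:] += dap_amp * np.exp(-t_rel / dap_tau)
            return v, np.array(spike_times)

        rng = np.random.default_rng(1)
        stim = 0.12 + 0.05 * rng.standard_normal(5000)   # noisy broadband drive
        v, spikes = lif_dap(stim)
        isis = np.diff(spikes)
        print(f"{spikes.size} spikes, mean ISI {isis.mean():.1f} ms")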

  18. A draft model aggregated code of ethics for bioethicists.

    PubMed

    Baker, Robert

    2005-01-01

    Bioethicists function in an environment in which their peers--healthcare executives, lawyers, nurses, physicians--assert the integrity of their fields through codes of professional ethics. Is it time for bioethics to assert its integrity by developing a code of ethics? Answering in the affirmative, this paper lays out a case by reviewing the historical nature and function of professional codes of ethics. Arguing that professional codes are aggregative enterprises growing in response to a field's historical experiences, it asserts that bioethics now needs to assert its integrity and independence and has already developed a body of formal statements that could be aggregated to create a comprehensive code of ethics for bioethics. A Draft Model Aggregated Code of Ethics for Bioethicists is offered in the hope that analysis and criticism of this draft code will promote further discussion of the nature and content of a code of ethics for bioethicists.

  19. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  20. FLOWTRAN-TF code benchmarking

    SciTech Connect

    Flach, G.P.

    1990-12-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss Of Coolant Accident (LOCA). A description of the code is given by Flach et al. (1990). This report provides benchmarking results for the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit (Smith et al., 1990a; 1990b). Individual constitutive relations are benchmarked in Sections 2 through 5 while in Sections 6 and 7 integral code benchmarking results are presented. An overall assessment of FLOWTRAN-TF for its intended use in computing the ECS power limit completes the document.

  1. Bar Codes for Libraries.

    ERIC Educational Resources Information Center

    Rahn, Erwin

    1984-01-01

    Discusses the evolution of standards for bar codes (series of printed lines and spaces that represent numbers, symbols, and/or letters of alphabet) and describes the two types most frequently adopted by libraries--Code-A-Bar and CODE 39. Format of the codes is illustrated. Six references and definitions of terminology are appended. (EJS)

  2. Manually operated coded switch

    DOEpatents

    Barnette, Jon H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  3. QR Codes 101

    ERIC Educational Resources Information Center

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  4. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA codes). Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.

  5. Predictive coding of multisensory timing

    PubMed Central

    Shi, Zhuanghua; Burr, David

    2016-01-01

    The sense of time is foundational for perception and action, yet it frequently departs significantly from physical time. In the paper we review recent progress on temporal contextual effects, multisensory temporal integration, temporal recalibration, and related computational models. We suggest that subjective time arises from minimizing prediction errors and adaptive recalibration, which can be unified in the framework of predictive coding, a framework rooted in Helmholtz’s ‘perception as inference’. PMID:27695705

  6. Driver Code for Adaptive Optics

    NASA Technical Reports Server (NTRS)

    Rao, Shanti

    2007-01-01

    A special-purpose computer code for a deformable-mirror adaptive-optics control system transmits pixel-registered control from (1) a personal computer running software that generates the control data to (2) a circuit board with 128 digital-to-analog converters (DACs) that generate voltages to drive the deformable-mirror actuators. This program reads control-voltage codes from a text file, then sends them, via the computer's parallel port, to a circuit board with four AD5535 (or equivalent) chips. Whereas a similar prior computer program was capable of transmitting data to only one chip at a time, this program can send data to four chips simultaneously. This program is in the form of C-language code that can be compiled and linked into an adaptive-optics software system. The program as supplied includes source code for integration into the adaptive-optics software, documentation, and a component that provides a demonstration of loading DAC codes from a text file. On a standard Windows desktop computer, the software can update 128 channels in 10 ms. On Real-Time Linux with a digital I/O card, the software can update 1024 channels (8 boards in parallel) every 8 ms.

  7. Efficient entropy coding for scalable video coding

    NASA Astrophysics Data System (ADS)

    Choi, Woong Il; Yang, Jungyoup; Jeon, Byeungwoo

    2005-10-01

    The standardization for the scalable extension of H.264 has called for additional functionality based on H.264 standard to support the combined spatio-temporal and SNR scalability. For the entropy coding of H.264 scalable extension, Context-based Adaptive Binary Arithmetic Coding (CABAC) scheme is considered so far. In this paper, we present a new context modeling scheme by using inter layer correlation between the syntax elements. As a result, it improves coding efficiency of entropy coding in H.264 scalable extension. In simulation results of applying the proposed scheme to encoding the syntax element mb_type, it is shown that improvement in coding efficiency of the proposed method is up to 16% in terms of bit saving due to estimation of more adequate probability model.

  8. Integrated Coding and Waveform Design Study.

    DTIC Science & Technology

    1980-08-01

    [Figure residue; only the legend is recoverable: decoded bit-error-rate performance curves comparing no inner code with a convolutional outer decoder (R = 0.5), a binary inner decoder with a convolutional outer decoder (R = 0.25), and a channel-measurement inner decoder with a convolutional outer decoder.]

  9. Roanoke College Student Conduct Code 1990-91.

    ERIC Educational Resources Information Center

    Roanoke Coll., VA.

    This Roanoke College (Virginia) 1990-91 conduct code manual is intended for distribution to students. A reproduction of the Academic Integrity and Student Conduct Code Form which all students must sign leads off the document. A section detailing the student conduct code explains the delegation of authority within the institution and describes the…

  10. Fast transform decoding of nonsystematic Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Truong, T. K.; Cheung, K.-M.; Reed, I. S.; Shiozaki, A.

    1989-01-01

    A Reed-Solomon (RS) code is considered to be a special case of a redundant residue polynomial (RRP) code, and a fast transform decoding algorithm to correct both errors and erasures is presented. This decoding scheme is an improvement of the decoding algorithm for the RRP code suggested by Shiozaki and Nishida, and can be realized readily on very large scale integration chips.

  11. Honesty and Honor Codes.

    ERIC Educational Resources Information Center

    McCabe, Donald; Trevino, Linda Klebe

    2002-01-01

    Explores the rise in student cheating and evidence that students cheat less often at schools with an honor code. Discusses effective use of such codes and creation of a peer culture that condemns dishonesty. (EV)

  12. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  13. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-01-01

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  14. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  15. DIANE multiparticle transport code

    NASA Astrophysics Data System (ADS)

    Caillaud, M.; Lemaire, S.; Ménard, S.; Rathouit, P.; Ribes, J. C.; Riz, D.

    2014-06-01

    DIANE is the general Monte Carlo code developed at CEA-DAM. DIANE is a 3D multiparticle multigroup code. DIANE includes automated biasing techniques and is optimized for massive parallel calculations.

  16. EMF wire code research

    SciTech Connect

    Jones, T.

    1993-11-01

    This paper examines the results of previous wire code research to determine the relationship among wire codes, electromagnetic fields, and childhood cancer. The paper suggests that, in the original Savitz study, biases toward producing a false positive association between high wire codes and childhood cancer were created by the selection procedure.

  17. Universal Noiseless Coding Subroutines

    NASA Technical Reports Server (NTRS)

    Schlutsmeyer, A. P.; Rice, R. F.

    1986-01-01

    Software package consists of FORTRAN subroutines that perform universal noiseless coding and decoding of integer and binary data strings. Purpose of this type of coding to achieve data compression in sense that coded data represents original data perfectly (noiselessly) while taking fewer bits to do so. Routines universal because they apply to virtually any "real-world" data source.
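
    The split-sample idea behind Rice-style universal noiseless coding is easy to demonstrate. The Python sketch below encodes non-negative integers as a unary quotient followed by k low-order bits (Golomb-Rice coding); it illustrates the principle only and is not the FORTRAN package described above.

        def rice_encode(values, k):
            # Each value becomes a unary quotient, a '0' terminator, and k remainder bits.
            out = []
            for v in values:
                q, r = v >> k, v & ((1 << k) - 1)
                out.append("1" * q + "0" + (format(r, f"0{k}b") if k else ""))
            return "".join(out)

        def rice_decode(bits, k, count):
            values, i = [], 0
            for _ in range(count):
                q = 0
                while bits[i] == "1":
                    q, i = q + 1, i + 1
                i += 1                                   # skip the terminating '0'
                r = int(bits[i:i + k], 2) if k else 0
                i += k
                values.append((q << k) | r)
            return values

        data = [3, 0, 7, 2, 12, 1]
        code = rice_encode(data, k=2)
        assert rice_decode(code, k=2, count=len(data)) == data
        print(code, f"({len(code)} bits versus {8 * len(data)} bits as plain bytes)")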

  18. Mapping Local Codes to Read Codes.

    PubMed

    Bonney, Wilfred; Galloway, James; Hall, Christopher; Ghattas, Mikhail; Tramma, Leandro; Nind, Thomas; Donnelly, Louise; Jefferson, Emily; Doney, Alexander

    2017-01-01

    Background & Objectives: Legacy laboratory test codes make it difficult to use clinical datasets for meaningful translational research, where populations are followed for disease risk and outcomes over many years. The Health Informatics Centre (HIC) at the University of Dundee hosts continuous biochemistry data from the clinical laboratories in Tayside and Fife dating back as far as 1987. However, the HIC-managed biochemistry dataset is coupled with incoherent sample types and unstandardised legacy local test codes, which increases the complexity of using the dataset for reasonable population health outcomes. The objective of this study was to map the legacy local test codes to the Scottish 5-byte Version 2 Read Codes using biochemistry data extracted from the repository of the Scottish Care Information (SCI) Store.

  19. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  20. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  1. DLLExternalCode

    SciTech Connect

    Greg Flach, Frank Smith

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
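
    The linking pattern just described, collect the inputs, write an input file, run the external application, and read its outputs back, can be sketched in a few lines. The fragment below is a hypothetical Python illustration of that pattern, not the GoldSim DLL itself; the executable name, file names, and file formats are placeholders.

        import subprocess
        from pathlib import Path

        def run_external_code(inputs, exe="external_model.exe",
                              infile="model.in", outfile="model.out"):
            # Write the inputs, run the external code, and read its outputs back.
            Path(infile).write_text("\n".join(f"{k} = {v}" for k, v in inputs.items()))
            subprocess.run([exe, infile], check=True)
            outputs = {}
            for line in Path(outfile).read_text().splitlines():
                key, _, value = line.partition("=")
                outputs[key.strip()] = float(value)
            return outputs

        # Usage, assuming the hypothetical executable exists on the PATH:
        # results = run_external_code({"flow_rate": 1.5, "porosity": 0.3})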

  2. Defeating the coding monsters.

    PubMed

    Colt, Ross

    2007-02-01

    Accuracy in coding is rapidly becoming a required skill for military health care providers. Clinic staffing, equipment purchase decisions, and even reimbursement will soon be based on the coding data that we provide. Learning the complicated myriad of rules to code accurately can seem overwhelming. However, the majority of clinic visits in a typical outpatient clinic generally fall into two major evaluation and management codes, 99213 and 99214. If health care providers can learn the rules required to code a 99214 visit, then this will provide a 90% solution that can enable them to accurately code the majority of their clinic visits. This article demonstrates a step-by-step method to code a 99214 visit, by viewing each of the three requirements as a monster to be defeated.

  3. MHDust: A 3-fluid dusty plasma code

    NASA Astrophysics Data System (ADS)

    Lazerson, Samuel

    MHDust is a next generation 3-fluid magnetized dusty plasma code, treating the inertial dynamics of both the dust and ion components. The code is written in ANSI C; the numerical method employs Leap-Frog and Dufort-Frankel integration schemes. Features include: nonlinear collisional terms, quasi-neutrality or continuity based electron densities, and dynamical dust charge number. Tests of wave-mode propagation (Acoustic and Electromagnetic) allow a comparison to linear wave mode theory. Additional nonlinear phenomena are presented including magnetic reconnection and shear-flow instabilities. Relevant parameters for the space environment are considered, allowing a comparison to be made with previous dusty plasma codes (DENISIS). The utility of the code is expanded through the possibility of small dust mass. This allows MHDust to be used as a 2-ion plasma code. MHDust considerably expands the range of numerical investigations into nonlinear phenomena in the field of astrophysical dusty plasmas.
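
    As an illustration of one of the schemes named above, the sketch below applies the Dufort-Frankel update to the one-dimensional diffusion equation. It is a generic textbook form written in Python, not code taken from MHDust, and the grid, diffusivity, and boundary treatment are arbitrary assumptions.

        import numpy as np

        def dufort_frankel(u0, D, dx, dt, steps):
            # Three-level explicit Dufort-Frankel update for u_t = D u_xx.
            a = 2.0 * D * dt / dx**2
            u_old, u = u0.copy(), u0.copy()       # bootstrap with two equal levels
            for _ in range(steps):
                u_new = u.copy()                  # endpoints stay fixed
                u_new[1:-1] = ((1.0 - a) * u_old[1:-1]
                               + a * (u[2:] + u[:-2])) / (1.0 + a)
                u_old, u = u, u_new
            return u

        x = np.linspace(0.0, 1.0, 101)
        u0 = np.exp(-200.0 * (x - 0.5) ** 2)      # narrow Gaussian pulse
        u = dufort_frankel(u0, D=1.0, dx=x[1] - x[0], dt=5.0e-5, steps=400)
        print(f"peak height after diffusion: {u.max():.3f}")   # decays from 1.0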

  4. CATHARE code development and assessment methodologies

    SciTech Connect

    Micaelli, J.C.; Barre, F.; Bestion, D.

    1995-12-31

    The CATHARE thermal-hydraulic code has been developed jointly by Commissariat a l'Energie Atomique (CEA), Electricite de France (EdF), and Framatome for safety analysis. Since the beginning of the project (September 1979), development and assessment activities have followed a methodology supported by two series of experimental tests: separate effects tests and integral effects tests. The purpose of this paper is to describe this methodology, the code assessment status, and the evolution to take into account two new components of this program: the modeling of three-dimensional phenomena and the requirements of code uncertainty evaluation.

  5. Incorporating Manual and Autonomous Code Generation

    NASA Technical Reports Server (NTRS)

    McComas, David

    1998-01-01

    Code can be generated manually or using code-generating software tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with autonomic code generation for attitude control flight software. Recent improvements in space flight computers are allowing software engineers to spend more time engineering the applications software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an autonomic code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) Defining the manual to autonomic code interface; (2) Applying object-oriented design to the manual flight code; (3) Implementing the object-oriented design in C.

  6. GeoPhysical Analysis Code

    SciTech Connect

    2011-05-21

    GPAC is a code that integrates open source libraries for element formulations, linear algebra, and I/O with two main LLNL-written components: (i) a set of standard finite element physics solvers for resolving Darcy fluid flow, explicit mechanics, implicit mechanics, and fluid-mediated fracturing, including resolution of contact both implicitly and explicitly, and (ii) an MPI-based parallelization implementation for use on generic HPC distributed memory architectures. The resultant code can be used alone for linearly elastic problems and problems involving hydraulic fracturing, where the mesh topology is dynamically changed. The key application domain is for low-rate stimulation and fracture control in subsurface reservoirs (e.g., enhanced geothermal sites and unconventional shale gas stimulation). GPAC also has interfaces to call external libraries for, e.g., material models and equations of state; however, LLNL-developed EOS and material models will not be part of the current release.

  7. Uncertainty and Sensitivity Analyses Plan

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  8. Uncertainty and Sensitivity Analyses Plan. Draft for Peer Review: Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Simpson, J.C.; Ramsdell, J.V. Jr.

    1993-04-01

    Hanford Environmental Dose Reconstruction (HEDR) Project staff are developing mathematical models to be used to estimate the radiation dose that individuals may have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. An uncertainty and sensitivity analyses plan is essential to understand and interpret the predictions from these mathematical models. This is especially true in the case of the HEDR models where the values of many parameters are unknown. This plan gives a thorough documentation of the uncertainty and hierarchical sensitivity analysis methods recommended for use on all HEDR mathematical models. The documentation includes both technical definitions and examples. In addition, an extensive demonstration of the uncertainty and sensitivity analysis process is provided using actual results from the Hanford Environmental Dose Reconstruction Integrated Codes (HEDRIC). This demonstration shows how the approaches used in the recommended plan can be adapted for all dose predictions in the HEDR Project.

  9. Estimation of 1945 to 1957 food consumption

    SciTech Connect

    Anderson, D.M.; Bates, D.J.; Marsh, T.L.

    1993-03-01

    This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945--1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.

  10. Estimation of 1945 to 1957 food consumption. Hanford Environmental Dose Reconstruction Project

    SciTech Connect

    Anderson, D.M.; Bates, D.J.; Marsh, T.L.

    1993-07-01

    This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945--1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.

  11. Estimation of 1945 to 1957 food consumption. Hanford Environmental Dose Reconstruction Project: Draft

    SciTech Connect

    Anderson, D.M.; Bates, D.J.; Marsh, T.L.

    1993-03-01

    This report details the methods used and the results of the study on the estimated historic levels of food consumption by individuals in the Hanford Environmental Dose Reconstruction (HEDR) study area from 1945--1957. This period includes the time of highest releases from Hanford and is the period for which data are being collected in the Hanford Thyroid Disease Study. These estimates provide the food-consumption inputs for the HEDR database of individual diets. This database will be an input file in the Hanford Environmental Dose Reconstruction Integrated Code (HEDRIC) computer model that will be used to calculate the radiation dose. The report focuses on fresh milk, eggs, lettuce, and spinach. These foods were chosen because they have been found to be significant contributors to radiation dose based on the Technical Steering Panel dose decision level.

  12. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  13. More box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    A new investigation shows that, starting from the BCH (21,15;3) code represented as a 7 x 3 matrix and adding a row and column to add even parity, one obtains an 8 x 4 matrix (32,15;8) code. An additional dimension is obtained by specifying odd parity on the rows and even parity on the columns, i.e., adjoining to the 8 x 4 matrix the matrix that is zero except for the fourth column (of all ones). Furthermore, any seven rows and three columns will form the BCH (21,15;3) code. This box code has the same weight structure as the quadratic residue and BCH codes of the same dimensions. Whether there exists an algebraic isomorphism to either code is as yet unknown.
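
    The parity-extension step in this construction is easy to see in code. The Python sketch below appends an even-parity column and an even-parity row to a 7 x 3 bit matrix to form the 8 x 4 box; for illustration an arbitrary random bit array stands in for a BCH (21,15;3) codeword, which is not generated here.

        import numpy as np

        def add_parity_row_and_column(m):
            # Append an even-parity bit to every row, then an even-parity row.
            m = np.asarray(m, dtype=np.uint8)
            with_col = np.hstack([m, m.sum(axis=1, keepdims=True) % 2])
            return np.vstack([with_col, with_col.sum(axis=0) % 2])

        demo = np.random.default_rng(0).integers(0, 2, size=(7, 3))
        box = add_parity_row_and_column(demo)
        assert (box.sum(axis=0) % 2 == 0).all() and (box.sum(axis=1) % 2 == 0).all()
        print(box)   # every row and every column of the 8 x 4 array has even parity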

  14. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  15. Development of probabilistic multimedia multipathway computer codes.

    SciTech Connect

    Yu, C.; LePoire, D.; Gnanapragasam, E.; Arnish, J.; Kamboj, S.; Biwer, B. M.; Cheng, J.-J.; Zielen, A. J.; Chen, S. Y.; Mo, T.; Abu-Eid, R.; Thaggard, M.; Sallo, A., III.; Peterson, H., Jr.; Williams, W. A.; Environmental Assessment; NRC; EM

    2002-01-01

    The deterministic multimedia dose/risk assessment codes RESRAD and RESRAD-BUILD have been widely used for many years for evaluation of sites contaminated with residual radioactive materials. The RESRAD code applies to the cleanup of sites (soils) and the RESRAD-BUILD code applies to the cleanup of buildings and structures. This work describes the procedure used to enhance the deterministic RESRAD and RESRAD-BUILD codes for probabilistic dose analysis. A six-step procedure was used in developing default parameter distributions and the probabilistic analysis modules. These six steps include (1) listing and categorizing parameters; (2) ranking parameters; (3) developing parameter distributions; (4) testing parameter distributions for probabilistic analysis; (5) developing probabilistic software modules; and (6) testing probabilistic modules and integrated codes. The procedures used can be applied to the development of other multimedia probabilistic codes. The probabilistic versions of RESRAD and RESRAD-BUILD codes provide tools for studying the uncertainty in dose assessment caused by uncertain input parameters. The parameter distribution data collected in this work can also be applied to other multimedia assessment tasks and multimedia computer codes.

  16. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife to knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. KTK labyrinth seal code handles straight or stepped seals. And DYSEAL provides dynamics for the seal geometry.

  17. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Order, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  18. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  19. Topological subsystem codes

    SciTech Connect

    Bombin, H.

    2010-03-15

    We introduce a family of two-dimensional (2D) topological subsystem quantum error-correcting codes. The gauge group is generated by two-local Pauli operators, so that two-local measurements are enough to recover the error syndrome. We study the computational power of code deformation in these codes and show that boundaries cannot be introduced in the usual way. In addition, we give a general mapping connecting suitable classical statistical mechanical models to optimal error correction in subsystem stabilizer codes that suffer from depolarizing noise.

  20. FAA Smoke Transport Code

    SciTech Connect

    Domino, Stefan; Luketa-Hanlin, Anay; Gallegos, Carlos

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  1. Transonic airfoil codes

    NASA Technical Reports Server (NTRS)

    Garabedian, P. R.

    1979-01-01

    Computer codes for the design and analysis of transonic airfoils are considered. The design code relies on the method of complex characteristics in the hodograph plane to construct shockless airfoils. The analysis code uses artificial viscosity to calculate flows with weak shock waves at off-design conditions. Comparisons with experiments show that an excellent simulation of two dimensional wind tunnel tests is obtained. The codes have been widely adopted by the aircraft industry as a tool for the development of supercritical wing technology.

  2. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    PubMed Central

    Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), coding tree contributes to excellent compression performance. However, coding tree brings extremely high computational complexity. Innovative works for improving coding tree to further reduce encoding time are presented in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Finally, a CU coding tree probability update is proposed, aiming to address probabilistic model distortion problems caused by CC. Experimental results show that the proposed low complexity CU coding tree mechanism significantly reduces encoding time by 27% for lossy coding and 42% for visually lossless coding and lossless coding. The proposed low complexity CU coding tree mechanism is devoted to improving coding performance under various application conditions. PMID:26999741

  3. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding.

    PubMed

    Gao, Yuan; Liu, Pengyu; Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), coding tree contributes to excellent compression performance. However, coding tree brings extremely high computational complexity. Innovative works for improving coding tree to further reduce encoding time are presented in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Finally, a CU coding tree probability update is proposed, aiming to address probabilistic model distortion problems caused by CC. Experimental results show that the proposed low complexity CU coding tree mechanism significantly reduces encoding time by 27% for lossy coding and 42% for visually lossless coding and lossless coding. The proposed low complexity CU coding tree mechanism is devoted to improving coding performance under various application conditions.

  4. Dress Codes for Teachers?

    ERIC Educational Resources Information Center

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  5. Lichenase and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-.beta.-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  6. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  7. Coding Acoustic Metasurfaces.

    PubMed

    Xie, Boyang; Tang, Kun; Cheng, Hua; Liu, Zhengyou; Chen, Shuqi; Tian, Jianguo

    2017-02-01

    Coding acoustic metasurfaces can combine simple logical bits to acquire sophisticated functions in wave control. The acoustic logical bits can achieve a phase difference of exactly π and a perfect match of the amplitudes for the transmitted waves. By programming the coding sequences, acoustic metasurfaces with various functions, including peculiar antenna patterns and wave focusing, have been demonstrated.

  8. Computerized mega code recording.

    PubMed

    Burt, T W; Bock, H C

    1988-04-01

    A system has been developed to facilitate recording of advanced cardiac life support mega code testing scenarios. By scanning a paper "keyboard" using a bar code wand attached to a portable microcomputer, the person assigned to record the scenario can easily generate an accurate, complete, timed, and typewritten record of the given situations and the obtained responses.

  9. Pseudonoise code tracking loop

    NASA Technical Reports Server (NTRS)

    Laflame, D. T. (Inventor)

    1980-01-01

    A delay-locked loop is presented for tracking a pseudonoise (PN) reference code in an incoming communication signal. The loop is less sensitive to gain imbalances, which can otherwise introduce timing errors in the PN reference code formed by the loop.

  10. Evolving genetic code

    PubMed Central

    OHAMA, Takeshi; INAGAKI, Yuji; BESSHO, Yoshitaka; OSAWA, Syozo

    2008-01-01

    In 1985, we reported that a bacterium, Mycoplasma capricolum, used a deviant genetic code: UGA, a “universal” stop codon, was read as tryptophan. This finding, together with the deviant nuclear genetic codes found in quite a few organisms and in a number of mitochondria, shows that the genetic code is not universal and is in a state of evolution. To account for the changes in codon meanings, we proposed the codon capture theory, which states that all code changes are non-disruptive, occurring without accompanying changes to the amino acid sequences of proteins. Supporting evidence for the theory is presented in this review. A possible evolutionary process from the ancient to the present-day genetic code is also discussed. PMID:18941287
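
    Because the abstract turns on a single codon reassignment, a small worked example may help. The sketch below is not from the paper: the mRNA string and the reduced codon table are hypothetical, and it only illustrates how reading UGA as tryptophan instead of a stop changes the translated peptide.

```python
# Illustrative sketch (not from the paper): translating a short mRNA with the
# standard code versus a deviant code in which UGA is read as tryptophan (Trp),
# as reported for Mycoplasma capricolum. The example sequence is hypothetical.

STANDARD_CODE = {
    "AUG": "Met", "UGG": "Trp", "UGA": "STOP",
    "UUU": "Phe", "GGC": "Gly", "UAA": "STOP",
}

# Deviant code: copy the standard table and reassign UGA from STOP to Trp.
DEVIANT_CODE = dict(STANDARD_CODE, UGA="Trp")

def translate(mrna, code):
    """Translate codon by codon until a stop codon (or end of sequence)."""
    peptide = []
    for i in range(0, len(mrna) - 2, 3):
        aa = code[mrna[i:i + 3]]
        if aa == "STOP":
            break
        peptide.append(aa)
    return "-".join(peptide)

mrna = "AUGUUUUGAGGCUAA"  # hypothetical message containing an internal UGA
print(translate(mrna, STANDARD_CODE))  # Met-Phe          (UGA terminates)
print(translate(mrna, DEVIANT_CODE))   # Met-Phe-Trp-Gly  (UGA read through)
```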

  11. Combustion chamber analysis code

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-01-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  12. Recent developments in the Los Alamos radiation transport code system

    SciTech Connect

    Forster, R.A.; Parsons, K.

    1997-06-01

    A brief progress report on updates to the Los Alamos Radiation Transport Code System (LARTCS) for solving criticality and fixed-source problems is provided. LARTCS integrates the Diffusion Accelerated Neutral Transport (DANT) discrete ordinates codes with the Monte Carlo N-Particle (MCNP) code. The LARTCS code is being developed with a graphical user interface for problem setup and analysis. Progress in the DANT system for criticality applications includes a two-dimensional module that can be linked to a mesh-generation code and a faster iteration scheme. Updates to MCNP Version 4A allow statistical checks of calculated Monte Carlo results.

  13. Verification of the Calore thermal analysis code.

    SciTech Connect

    Dowding, Kevin J.; Blackwell, Bennie Francis

    2004-07-01

    Calore is the ASC code developed to model steady and transient thermal diffusion with chemistry and dynamic enclosure radiation. An integral part of the software development process is code verification, which addresses the question 'Are we correctly solving the model equations'? This process aids the developers in that it identifies potential software bugs and gives the thermal analyst confidence that a properly prepared input will produce satisfactory output. Grid refinement studies have been performed on problems for which we have analytical solutions. In this talk, the code verification process is overviewed and recent results are presented. Recent verification studies have focused on transient nonlinear heat conduction and verifying algorithms associated with (tied) contact and adaptive mesh refinement. In addition, an approach to measure the coverage of the verification test suite relative to intended code applications is discussed.

  14. Benchmarking the Multidimensional Stellar Implicit Code MUSIC

    NASA Astrophysics Data System (ADS)

    Goffrey, T.; Pratt, J.; Viallet, M.; Baraffe, I.; Popov, M. V.; Walder, R.; Folini, D.; Geroux, C.; Constantino, T.

    2017-03-01

    We present the results of a numerical benchmark study for the MUltidimensional Stellar Implicit Code (MUSIC) based on widely applicable two- and three-dimensional compressible hydrodynamics problems relevant to stellar interiors. MUSIC is an implicit large eddy simulation code that uses implicit time integration, implemented as a Jacobian-free Newton Krylov method. A physics based preconditioning technique which can be adjusted to target varying physics is used to improve the performance of the solver. The problems used for this benchmark study include the Rayleigh-Taylor and Kelvin-Helmholtz instabilities, and the decay of the Taylor-Green vortex. Additionally we show a test of hydrostatic equilibrium, in a stellar environment which is dominated by radiative effects. In this setting the flexibility of the preconditioning technique is demonstrated. This work aims to bridge the gap between the hydrodynamic test problems typically used during development of numerical methods and the complex flows of stellar interiors. A series of multidimensional tests were performed and analysed. Each of these test cases was analysed with a simple, scalar diagnostic, with the aim of enabling direct code comparisons. As the tests performed do not have analytic solutions, we verify MUSIC by comparing it to established codes including ATHENA and the PENCIL code. MUSIC is able to both reproduce behaviour from established and widely-used codes as well as results expected from theoretical predictions. This benchmarking study concludes a series of papers describing the development of the MUSIC code and provides confidence in future applications.

  15. Pyramid image codes

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.

  16. Report number codes

    SciTech Connect

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  17. Parallel CARLOS-3D code development

    SciTech Connect

    Putnam, J.M.; Kotulski, J.D.

    1996-02-01

    CARLOS-3D is a three-dimensional scattering code which was developed under the sponsorship of the Electromagnetic Code Consortium, and is currently used by over 80 aerospace companies and government agencies. The code has been extensively validated and runs on both serial workstations and parallel supercomputers such as the Intel Paragon. CARLOS-3D is a three-dimensional surface integral equation scattering code based on a Galerkin method of moments formulation employing Rao-Wilton-Glisson roof-top basis functions for triangular faceted surfaces. Fully arbitrary 3D geometries composed of multiple conducting and homogeneous bulk dielectric materials can be modeled. This presentation describes some of the extensions to the CARLOS-3D code, and how the operator structure of the code facilitated these improvements. Body of revolution (BOR) and two-dimensional geometries were incorporated by simply including new input routines, and the appropriate Galerkin matrix operator routines. Some additional modifications were required in the combined field integral equation matrix generation routine due to the symmetric nature of the BOR and 2D operators. Quadrilateral patched surfaces with linear roof-top basis functions were also implemented in the same manner. Quadrilateral facets and triangular facets can be used in combination to more efficiently model geometries with both large smooth surfaces and surfaces with fine detail such as gaps and cracks. Since the parallel implementation in CARLOS-3D is at a high level, these changes were independent of the computer platform being used. This approach minimizes code maintenance, while providing capabilities with little additional effort. Results are presented showing the performance and accuracy of the code for some large scattering problems. Comparisons between triangular faceted and quadrilateral faceted geometry representations will be shown for some complex scatterers.

  18. Code Disentanglement: Initial Plan

    SciTech Connect

    Wohlbier, John Greaton; Kelley, Timothy M.; Rockefeller, Gabriel M.; Calef, Matthew Thomas

    2015-01-27

    The first step to making more ambitious changes in the EAP code base is to disentangle the code into a set of independent, levelized packages. We define a package as a collection of code, most often across a set of files, that provides a defined set of functionality; a package a) can be built and tested as an entity and b) fits within an overall levelization design. Each package contributes one or more libraries, or an application that uses the other libraries. A package set is levelized if the relationships between packages form a directed, acyclic graph and each package uses only packages at lower levels of the diagram (in Fortran this relationship is often describable by the use relationship between modules). Independent packages permit independent, and therefore parallel, development. The packages form separable units for the purposes of development and testing. This is a proven path for enabling finer-grained changes to a complex code.
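
    The levelization criterion described above (dependencies form a directed acyclic graph, and each package uses only lower-level packages) can be checked mechanically. The sketch below, with hypothetical package names, assigns each package a level or reports a cycle; it illustrates the concept only and is not code from the EAP project.

```python
# Sketch of a levelization check: package dependencies are treated as a
# directed graph, a cycle means the set cannot be levelized, and otherwise each
# package's level is one more than the highest level of anything it uses.
# The package names and dependencies below are hypothetical.

def levelize(deps):
    """deps maps package -> set of packages it uses. Returns {package: level}."""
    levels, in_progress = {}, set()

    def level_of(pkg):
        if pkg in levels:
            return levels[pkg]
        if pkg in in_progress:
            raise ValueError(f"dependency cycle through {pkg!r}; not levelizable")
        in_progress.add(pkg)
        lvl = 1 + max((level_of(d) for d in deps.get(pkg, ())), default=0)
        in_progress.discard(pkg)
        levels[pkg] = lvl
        return lvl

    for pkg in deps:
        level_of(pkg)
    return levels

deps = {"driver": {"physics", "io"}, "physics": {"mesh"}, "io": {"mesh"}, "mesh": set()}
print(levelize(deps))  # e.g. {'mesh': 1, 'physics': 2, 'io': 2, 'driver': 3}
```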

  19. Compressible Astrophysics Simulation Code

    SciTech Connect

    Howell, L.; Singer, M.

    2007-07-18

    This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.

  20. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  1. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.

  2. RAMONA-4B code for BWR systems analysis

    SciTech Connect

    Cheng, H.S.; Rohatgi, U.S.

    1996-12-31

    The RAMONA-4B code is a coupled thermal-hydraulic, 3D kinetics code for plant transient analyses of a complete Boiling Water Reactor (BWR) system including Reactor Pressure Vessel (RPV), Balance of Plant (BOP) and containment. The complete system representation enables an integrated and coupled systems analysis of a BWR without recourse to prescribed boundary conditions.

  3. Upgrades to the WIMS-ANL code.

    SciTech Connect

    Woodruff, W. L.

    1998-10-14

    The dusty old source code in WIMS-D4M has been completely rewritten to conform more closely with current FORTRAN coding practices. The revised code contains many improvements in appearance, error checking, and control of the output. The output is now tabulated to fit the typical 80-column window or terminal screen. The Segev method for resonance integral interpolation is now an option. Most of the dimension limitations have been removed and replaced with variable dimensions within a compile-time fixed container. The library is no longer restricted to the 69 energy group structure, and two new libraries have been generated for use with the code. The new libraries are both based on ENDF/B-VI data, with one having the original 69 energy group structure and the second a 172 group structure. The common source code can be used with PCs running both Windows 95 and NT, with a Linux-based operating system, and with UNIX-based workstations. Comparisons of this version of the code to earlier evaluations with ENDF/B-V are provided, as well as comparisons with the new libraries.

  4. Boundary-Layer Code For Supersonic Combustion

    NASA Technical Reports Server (NTRS)

    Pinckney, S. Z.; Walton, J. T.

    1994-01-01

    HUD is an integral computer code based on the Spalding-Chi method for predicting development of boundary layers in laminar, transitional, and turbulent regions of flows on two-dimensional or axisymmetric bodies. Approximates nonequilibrium velocity profiles as well as local surface friction in the presence of a pressure gradient. Predicts transfer of heat in the turbulent boundary layer in the presence of a high axial pressure gradient. Provides for pressure gradients both normal and lateral to surfaces. Also used to estimate requirements for cooling scramjet engines. Because of this capability, the HUD program has been incorporated into several scramjet-cycle-performance-analysis codes, including SCRAM (ARC-12338) and SRGULL (LEW-15093). Written in FORTRAN 77.

  5. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.

  6. Error coding simulations

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1993-01-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and to evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal-to-noise ratio needed for a certain desired bit error rate. The use of concatenated coding, e.g. an inner convolutional code and an outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
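
    As an illustration of the error-detection piece mentioned above, the following is a minimal bitwise sketch of a 16-bit CRC of the CCITT family (generator polynomial 0x1021, all-ones preset). It is offered as a generic example only; the exact preset, bit ordering, and complement conventions of the CCSDS-recommended CRC should be taken from the applicable CCSDS recommendation rather than from this sketch.

```python
# Minimal sketch of a CRC-16 of the CCITT family (polynomial 0x1021, register
# preset to all ones). The sender appends the CRC to the frame; the receiver
# recomputes it to detect residual errors after decoding.

def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

frame = b"error coding simulation test frame"   # made-up payload
print(hex(crc16_ccitt(frame)))
```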

  7. Coded source neutron imaging

    NASA Astrophysics Data System (ADS)

    Bingham, Philip; Santos-Villalobos, Hector; Tobin, Ken

    2011-03-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100μm and 10μm aperture hole diameters show resolutions matching the hole diameters.
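
    The MTF-from-line-spread-function step mentioned in both of these records can be sketched as follows: differentiate the (here synthetic) edge response to obtain the line spread function, then normalize the magnitude of its Fourier transform. The sample spacing and blur width below are arbitrary stand-ins, not values from the CSI design.

```python
# Sketch of computing an MTF from a tilted-edge measurement: ESF -> LSF -> MTF.
# The Gaussian edge width is a hypothetical placeholder.
import numpy as np

dx = 0.005                                   # sample spacing (mm, assumed)
x = np.arange(-2.0, 2.0, dx)
sigma = 0.05                                 # hypothetical blur width (mm)

# Synthetic edge spread function: a Gaussian-blurred step edge.
lsf_true = np.exp(-0.5 * (x / sigma) ** 2)
esf = np.cumsum(lsf_true)
esf /= esf[-1]

# LSF = derivative of the ESF; MTF = normalized |FFT| of the LSF.
lsf = np.gradient(esf, dx)
mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]
freqs = np.fft.rfftfreq(lsf.size, d=dx)      # spatial frequency (cycles/mm)

# Frequency at which modulation first drops below 10% (a common resolution cut).
print(freqs[np.argmax(mtf < 0.1)])
```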

  8. GeoPhysical Analysis Code

    SciTech Connect

    2012-12-21

    GPAC is a code that integrates open-source libraries for element formulations, linear algebra, and I/O with two main LLNL-written components: (i) a set of standard finite, discrete, and discontinuous displacement element physics solvers for resolving Darcy fluid flow, explicit mechanics, implicit mechanics, fault rupture and earthquake nucleation, and fluid-mediated fracturing, including resolution of physical behaviors both implicitly and explicitly, and (ii) an MPI-based parallelization implementation for use on generic HPC distributed memory architectures. The resultant code can be used alone for linearly elastic problems; problems involving hydraulic fracturing, where the mesh topology is dynamically changed; fault rupture modeling and seismic risk assessment; and general granular materials behavior. The key application domain is low-rate stimulation and fracture control in subsurface reservoirs (e.g., enhanced geothermal sites and unconventional shale gas stimulation). GPAC also has interfaces to call external libraries for, e.g., material models and equations of state; however, LLNL-developed EOS and material models will not be part of the current release. GPAC's secondary applications include modeling fault evolution for predicting the statistical distribution of earthquake events and capturing granular materials behavior under different load paths.

  9. The Clawpack Community of Codes

    NASA Astrophysics Data System (ADS)

    Mandli, K. T.; LeVeque, R. J.; Ketcheson, D.; Ahmadia, A. J.

    2014-12-01

    Clawpack, the Conservation Laws Package, has long been one of the standards for solving hyperbolic conservation laws but over the years has extended well beyond this role. Today a community of open-source codes has been developed that addresses a multitude of different needs including non-conservative balance laws, high-order accurate methods, and parallelism while remaining extensible and easy to use, largely by the judicious use of Python and the original Fortran codes that it wraps. This talk will present some of the recent developments in projects under the Clawpack umbrella, notably the GeoClaw and PyClaw projects. GeoClaw was originally developed as a tool for simulating tsunamis using adaptive mesh refinement but has since encompassed a large number of other geophysically relevant flows including storm surge and debris flows. PyClaw originated as a Python version of the original Clawpack algorithms but has since been both a testing ground for new algorithmic advances in the Clawpack framework and an easily extensible framework for solving hyperbolic balance laws. Some of these extensions include the addition of WENO high-order methods, massively parallel capabilities, and adaptive mesh refinement technologies, made possible largely by the flexibility of the Python language and community libraries such as NumPy and PETSc. Because of the tight integration with Python technologies, both packages have also benefited from the focus on reproducibility in the Python community, notably IPython notebooks.

  10. NSCool: Neutron star cooling code

    NASA Astrophysics Data System (ADS)

    Page, Dany

    2016-09-01

    NSCool is a 1D (i.e., spherically symmetric) neutron star cooling code written in Fortran 77. The package also contains a series of EOSs (equations of state) to build stars, a series of pre-built stars, and a TOV (Tolman-Oppenheimer-Volkoff) integrator to build stars from an EOS. It can also handle “strange stars” that have a huge density discontinuity between the quark matter and the covering thin baryonic crust. NSCool solves the heat transport and energy balance equations in full GR, resulting in a time sequence of temperature profiles (and, in particular, a Teff-age curve). Several heating processes are included, and more can easily be incorporated. In particular it can evolve a star undergoing accretion with the resulting deep crustal heating, under a steady or time-variable accretion rate. NSCool is robust, very fast, and highly modular, making it easy to add new subroutines for new processes.

  11. Noiseless coding for the magnetometer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Lee, Jun-Ji

    1987-01-01

    Future unmanned space missions will continue to seek a full understanding of magnetic fields throughout the solar system. Severely constrained data rates during certain portions of these missions could limit the possible science return. This publication investigates the application of universal noiseless coding techniques to more efficiently represent magnetometer data without any loss in data integrity. Performance results indicated that compression factors of 2:1 to 6:1 can be expected. Feasibility for general deep space application was demonstrated by implementing a microprocessor breadboard coder/decoder using the Intel 8086 processor. The Comet Rendezvous Asteroid Flyby mission will incorporate these techniques in a buffer feedback, rate-controlled configuration. The characteristics of this system are discussed.
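
    As a flavor of the universal noiseless coding techniques referred to above, the sketch below Golomb-Rice-codes sample-to-sample differences of a made-up magnetometer record. It illustrates only one building block; the adaptive coder options, rate-controlled buffer feedback, and flight implementation actually studied are described in the publication.

```python
# Illustrative Golomb-Rice coding of first differences of hypothetical
# magnetometer counts. The parameter k and the data are arbitrary choices.

def zigzag(d):
    """Map signed differences to non-negative integers: 0,-1,1,-2,2 -> 0,1,2,3,4."""
    return 2 * d if d >= 0 else -2 * d - 1

def rice_encode(value, k):
    """Unary quotient + k-bit remainder for a non-negative integer."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

samples = [1002, 1003, 1001, 1001, 1004, 1000]   # hypothetical counts
diffs = [b - a for a, b in zip(samples, samples[1:])]
bitstream = "".join(rice_encode(zigzag(d), k=2) for d in diffs)
print(bitstream, len(bitstream), "bits for", len(diffs), "differences")
```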

  12. Multichannel Error Correction Code Decoder

    NASA Technical Reports Server (NTRS)

    1996-01-01

    NASA Lewis Research Center's Digital Systems Technology Branch has an ongoing program in modulation, coding, onboard processing, and switching. Recently, NASA completed a project to incorporate a time-shared decoder into the very-small-aperture terminal (VSAT) onboard-processing mesh architecture. The primary goal was to demonstrate a time-shared decoder for a regenerative satellite that uses asynchronous, frequency-division multiple access (FDMA) uplink channels, thereby identifying hardware and power requirements and fault-tolerance issues that would have to be addressed in an operational system. A secondary goal was to integrate and test, in a system environment, two NASA-sponsored, proof-of-concept hardware deliverables: the Harris Corp. high-speed Bose-Chaudhuri-Hocquenghem (BCH) codec and the TRW multichannel demultiplexer/demodulator (MCDD). A beneficial byproduct of this project was the development of flexible, multichannel-uplink signal-generation equipment.

  13. C++ Coding Standards for the AMP Project

    SciTech Connect

    Evans, Thomas M; Clarno, Kevin T

    2009-09-01

    This document provides an initial starting point to define the C++ coding standards used by the AMP nuclear fuel performance integrated code project as part of AMP's software development process. This document draws from the experiences and documentation [1] of the developers of the Marmot Project at Los Alamos National Laboratory. Much of the software in AMP will be written in C++. The power of C++ can be abused easily, resulting in code that is difficult to understand and maintain. This document gives the practices that should be followed on the AMP project for all new code that is written. The intent is not to be onerous but to ensure that the code can be readily understood by the entire code team and serve as a basis for collectively defining a set of coding standards for use in future development efforts. At the end of the AMP development in fiscal year (FY) 2010, all developers will have experience with the benefits, restrictions, and limitations of the standards described and will collectively define a set of standards for future software development. External libraries that AMP uses do not have to meet these requirements, although we encourage external developers to follow these practices. For any code of which AMP takes ownership, the project will decide on any changes on a case-by-case basis. The practices that we are using in the AMP project have been in use in the Denovo project [2] for several years. The practices build on those given in References [3-5]; the practices given in these references should also be followed. Some of the practices given in this document can also be found in [6].

  14. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to automatically detect potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but rather domain experts. Hence, they require a simple language to express custom rules.

  15. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and results compared. Current computational and experimental emphasis includes multiple connected cavity flows with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing, with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean-sheet approach to engine design is advocated, with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  16. SAC: Sheffield Advanced Code

    NASA Astrophysics Data System (ADS)

    Griffiths, Mike; Fedun, Viktor; Mumford, Stuart; Gent, Frederick

    2013-06-01

    The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, thereby allowing modification to easily add new physics.

  17. Autocatalysis, information and coding.

    PubMed

    Wills, P R

    2001-01-01

    Autocatalytic self-construction in macromolecular systems requires the existence of a reflexive relationship between structural components and the functional operations they perform to synthesise themselves. The possibility of reflexivity depends on formal, semiotic features of the catalytic structure-function relationship, that is, the embedding of catalytic functions in the space of polymeric structures. Reflexivity is a semiotic property of some genetic sequences. Such sequences may serve as the basis for the evolution of coding as a result of autocatalytic self-organisation in a population of assignment catalysts. Autocatalytic selection is a mechanism whereby matter becomes differentiated in primitive biochemical systems. In the case of coding self-organisation, it corresponds to the creation of symbolic information. Prions are present-day entities whose replication through autocatalysis reflects aspects of biological semiotics less obvious than genetic coding.

  18. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option

  19. Polar Code Validation

    DTIC Science & Technology

    1989-09-30

    [Report documentation page and table-of-contents fragments; recoverable headings include: Summary of POLAR Achievements; POLAR Code Physical Models; Structure of the Bipolar Plasma Sheath Generated by SPEAR I; The POLAR Code Wake Model: Comparison with in Situ Observations. Distribution: approved for public release.]

  20. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semi-empirical, analytical, and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor

  1. Securing mobile code.

    SciTech Connect

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in depth analysis of a technique called 'white-boxing'. We put forth some new attacks and improvements

  2. Parallelized tree-code for clusters of personal computers

    NASA Astrophysics Data System (ADS)

    Viturro, H. R.; Carpintero, D. D.

    2000-02-01

    We present a tree-code for integrating the equations of motion of collisionless systems, which has been fully parallelized and adapted to run on several PC-based processors simultaneously, using the well-known PVM message passing library software. SPH algorithms, not yet included, may be easily incorporated into the code. The code is written in ANSI C; it can be freely downloaded from a public ftp site. Simulations of collisions of galaxies are presented, with which the performance of the code is tested.

  3. Code system to compute radiation dose in human phantoms

    SciTech Connect

    Ryman, J.C.; Cristy, M.; Eckerman, K.F.; Davis, J.L.; Tang, J.S.; Kerr, G.D.

    1986-01-01

    A Monte Carlo photon transport code and a code using Monte Carlo integration of a point kernel have been revised to incorporate human phantom models for an adult female, juveniles of various ages, and a pregnant female at the end of the first trimester of pregnancy, in addition to the adult male used earlier. An analysis code has been developed for deriving recommended values of specific absorbed fractions of photon energy. The computer code system and calculational method are described, emphasizing recent improvements in methods. (LEW)

  4. Sandia National Laboratories analysis code data base

    SciTech Connect

    Peterson, C.W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems, and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code "ownership" and release status, and references describing the physical models and numerical implementation.

  5. Criticality Code Validation Exercises with TSUNAMI

    SciTech Connect

    Rearden, Bradley T

    2007-01-01

    In the criticality code validation of common systems, many paths may exist to a correct bias, bias uncertainty, and upper subcritical limit. The challenge for the criticality analyst is to select an efficient, defensible, and safe methodology to consistently obtain the correct values. One method of testing criticality code validation techniques is to use a sample system with a known bias as a test application and determine whether the methods employed can reproduce the known bias. In this paper, a low-enriched uranium (LEU) lattice critical experiment with a known bias is used as the test application, and numerous other LEU experiments are used as the benchmarks for the criticality code validation exercises using traditional and advanced parametric techniques. The parameters explored are enrichment, energy of average lethargy causing fission (EALF), and the TSUNAMI integral index ck with experiments with varying degrees of similarity. This paper is an extension of a previously published summary.

  6. Coding for urologic office procedures.

    PubMed

    Dowling, Robert A; Painter, Mark

    2013-11-01

    This article summarizes current best practices for documenting, coding, and billing common office-based urologic procedures. Topics covered include general principles, basic and advanced urologic coding, creation of medical records that support compliant coding practices, bundled codes and unbundling, global periods, modifiers for procedure codes, when to bill for evaluation and management services during the same visit, coding for supplies, and laboratory and radiology procedures pertinent to urology practice. Detailed information is included for the most common urology office procedures, and suggested resources and references are provided. This information is of value to physicians, office managers, and their coding staff.

  7. Accumulate Repeat Accumulate Coded Modulation

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes that are combined with high level modulation. Thus at the decoder belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples to reliability of the bits.
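
    A toy sketch of the accumulate-repeat-accumulate chain followed by a high-level modulation mapping is given below. The repetition factor, interleaver, and Gray-labelled QPSK mapping are arbitrary illustrative choices; they do not reproduce the code ensembles, puncturing, or mappings defined in the paper.

```python
# Toy accumulate-repeat-accumulate (ARA) encoding chain plus a QPSK mapping.
# All structural choices here are hypothetical illustrations.
import random

def accumulate(bits):
    """Running mod-2 sum (the 1/(1+D) accumulator over GF(2))."""
    out, state = [], 0
    for b in bits:
        state ^= b
        out.append(state)
    return out

def ara_encode(info_bits, repeat=3, seed=0):
    pre = accumulate(info_bits)                  # outer accumulator (precoder)
    repeated = [b for b in pre for _ in range(repeat)]
    perm = list(range(len(repeated)))
    random.Random(seed).shuffle(perm)            # interleaver
    interleaved = [repeated[i] for i in perm]
    return accumulate(interleaved)               # inner accumulator

coded = ara_encode([1, 0, 1, 1, 0, 0, 1, 0])
# Map bit pairs to Gray-labelled QPSK symbols (the "coded modulation" step).
qpsk = {(0, 0): 1+1j, (0, 1): -1+1j, (1, 1): -1-1j, (1, 0): 1-1j}
symbols = [qpsk[tuple(coded[i:i+2])] for i in range(0, len(coded), 2)]
print(symbols)
```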

  8. Dress Codes. Legal Brief.

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2000-01-01

    As illustrated by two recent decisions, the courts in the past decade have demarcated wide boundaries for school officials considering dress codes, whether in the form of selective prohibitions or required uniforms. Administrators must warn the community, provide legitimate justification and reasonable clarity, and comply with state law. (MLH)

  9. Dress Codes and Uniforms.

    ERIC Educational Resources Information Center

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices that adults agree with in their choice of school dress. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  10. Building Codes and Regulations.

    ERIC Educational Resources Information Center

    Fisher, John L.

    The hazard of fire is of great concern to libraries due to combustible books and new plastics used in construction and interiors. Building codes and standards can offer architects and planners guidelines to follow but these standards should be closely monitored, updated, and researched for fire prevention. (DS)

  11. Student Dress Codes.

    ERIC Educational Resources Information Center

    Uerling, Donald F.

    School officials see a need for regulations that prohibit disruptive and inappropriate forms of expression and attire; students see these regulations as unwanted restrictions on their freedom. This paper reviews court litigation involving constitutional limitations on school authority, dress and hair codes, state law constraints, and school…

  12. Video Coding for ESL.

    ERIC Educational Resources Information Center

    King, Kevin

    1992-01-01

    Coding tasks, a valuable technique for teaching English as a Second Language, are presented that enable students to look at patterns and structures of marital communication as well as objectively evaluate the degree of happiness or distress in the marriage. (seven references) (JL)

  13. Electrical Circuit Simulation Code

    SciTech Connect

    Wix, Steven D.; Waters, Arlon J.; Shirley, David

    2001-08-09

    Massively-parallel electrical circuit simulation code. CHILESPICE is a massively-parallel, distributed-memory electrical circuit simulation tool that contains many enhanced radiation, time-based, and thermal features and models. Large-scale electronic circuit simulation. Shared memory, parallel processing, enhanced convergence. Sandia-specific device models.

  14. Multiple trellis coded modulation

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K. (Inventor); Divsalar, Dariush (Inventor)

    1990-01-01

    A technique for designing trellis codes to minimize bit error performance for a fading channel. The invention provides a criterion which may be used in the design of such codes which is significantly different from that used for additive white Gaussian noise channels. The method of multiple trellis coded modulation of the present invention comprises the steps of: (a) coding b bits of input data into s intermediate outputs; (b) grouping said s intermediate outputs into k groups of s_i intermediate outputs each, where the summation of all s_i is equal to s and k is equal to at least 2; (c) mapping each of said k groups of intermediate outputs into one of a plurality of symbols in accordance with a plurality of modulation schemes, one for each group, such that the first group is mapped in accordance with a first modulation scheme and the second group is mapped in accordance with a second modulation scheme; and (d) outputting each of said symbols to provide k output symbols for each b bits of input data.
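
    Steps (a)-(d) can be illustrated with a small sketch. The "encoder" below is a hypothetical stand-in (a real multiple trellis code has memory), and the choice of QPSK and 8-PSK for the two groups is only an example of mapping each group with a different modulation scheme.

```python
# Sketch of steps (a)-(d): b bits -> s intermediate bits -> k groups, each
# mapped with its own PSK constellation, giving k symbols per b input bits.
import cmath

def psk_point(index, order):
    """Unit-circle constellation point for a PSK of the given order."""
    return cmath.exp(2j * cmath.pi * index / order)

def toy_encoder(bits):
    """Stand-in rate b/s encoder: b=4 in, s=5 out (a single parity bit is
    used purely for illustration; a real trellis encoder has memory)."""
    return bits + [sum(bits) % 2]

def mtcm_map(bits, group_sizes=(2, 3)):       # s_1=2 -> QPSK, s_2=3 -> 8-PSK
    coded, symbols, pos = toy_encoder(bits), [], 0
    for size in group_sizes:
        group = coded[pos:pos + size]
        index = int("".join(map(str, group)), 2)
        symbols.append(psk_point(index, 2 ** size))
        pos += size
    return symbols                             # k=2 symbols per 4 input bits

print(mtcm_map([1, 0, 1, 1]))
```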

  15. Multifractal detrended cross-correlation analysis of coding and non-coding DNA sequences through chaos-game representation

    NASA Astrophysics Data System (ADS)

    Pal, Mayukha; Satish, B.; Srinivas, K.; Rao, P. Madhusudana; Manimaran, P.

    2015-10-01

    We propose a new approach combining the chaos game representation and the two dimensional multifractal detrended cross correlation analysis methods to examine multifractal behavior in power law cross correlation between any pair of nucleotide sequences of unequal lengths. In this work, we analyzed the characteristic behavior of coding and non-coding DNA sequences of eight prokaryotes. The results show the presence of strong multifractal nature between coding and non-coding sequences of all data sets. We found that this integrative approach helps us to consider complete DNA sequences for characterization, and further it may be useful for classification, clustering, identification of class affiliation of nucleotide sequences etc. with high precision.
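
    The chaos-game-representation step used in this approach is simple to sketch: each nucleotide is assigned a corner of the unit square and the trajectory moves halfway toward the corner of each successive base, so sequences of unequal length become comparable 2-D point sets that can then be analyzed with the 2-D multifractal detrended cross-correlation method. The short sequence below is made up for illustration.

```python
# Chaos game representation (CGR) of a nucleotide sequence on the unit square.
# The example sequence is hypothetical.

CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

def cgr(sequence, start=(0.5, 0.5)):
    x, y = start
    points = []
    for base in sequence:
        cx, cy = CORNERS[base]
        x, y = (x + cx) / 2.0, (y + cy) / 2.0   # move halfway toward the corner
        points.append((x, y))
    return points

for point in cgr("ATGCGTAACGT"):
    print(point)
```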

  16. A review of wind field models for atmospheric transport

    SciTech Connect

    Ramsdell, J.V. Jr.; Skyllingstad, E.D.

    1993-06-01

    The primary objective of the Hanford Environmental Dose Reconstruction (HEDR) Project is to estimate the radiation dose that individuals could have received as a result of emissions since 1944 from the US Department of Energy's (DOE) Hanford Site near Richland, Washington. The HEDR Project is developing a computer code to estimate these doses and their uncertainties. The code, known as the HEDR Integrated Codes (HEDRIC), consists of four separate component codes. One of the component codes, called the Regional Atmospheric Transport Code for Hanford Emission Tracking (RATCHET), combines meteorological and release data to estimate time-integrated air concentrations and surface contamination at specific locations in the vicinity of the Hanford Site. The RATCHET domain covers approximately 75,000 square miles, extending from the crest of the Cascade Mountains on the west to the eastern edge of the Idaho panhandle and from central Oregon on the south to the Canadian border. This letter report explains the procedures in RATCHET that transform observed wind data into the wind fields used in atmospheric transport calculations. It also describes and evaluates alternative procedures not selected for use in RATCHET.

  17. Coding Theory and Projective Spaces

    NASA Astrophysics Data System (ADS)

    Silberstein, Natalia

    2008-05-01

    The projective space of order n over a finite field F_q is the set of all subspaces of the vector space F_q^{n}. In this work, we consider error-correcting codes in the projective space, focusing mainly on constant dimension codes. We start with the different representations of subspaces in the projective space. These representations involve matrices in reduced row echelon form, associated binary vectors, and Ferrers diagrams. Based on these representations, we provide a new formula for the computation of the distance between any two subspaces in the projective space. We examine lifted maximum rank distance (MRD) codes, which are nearly optimal constant dimension codes. We prove that a lifted MRD code can be represented in such a way that it forms a block design known as a transversal design. The incidence matrix of the transversal design derived from a lifted MRD code can be viewed as a parity-check matrix of a linear code in the Hamming space. We find the properties of these codes, which can also be viewed as LDPC codes. We present new bounds and constructions for constant dimension codes. First, we present a multilevel construction for constant dimension codes, which can be viewed as a generalization of the lifted MRD code construction. This construction is based on a new type of rank-metric codes, called Ferrers diagram rank-metric codes. Then we derive upper bounds on the size of constant dimension codes which contain the lifted MRD code, and provide a construction for two families of codes that attain these upper bounds. We generalize the well-known concept of a punctured code for a code in the projective space to obtain large codes which are not constant dimension. We present efficient enumerative encoding and decoding techniques for the Grassmannian. Finally, we describe a search method for constant dimension lexicodes.
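
    For readers unfamiliar with distances between subspaces, the commonly used subspace distance d(U, V) = dim U + dim V - 2 dim(U ∩ V) can be computed from ranks, as in the GF(2) sketch below. The generator matrices are arbitrary examples, and this is the standard metric rather than the thesis's own representation-based formula.

```python
# Subspace distance d(U, V) = dim U + dim V - 2 dim(U ∩ V) over GF(2),
# using dim(U ∩ V) = dim U + dim V - dim(U + V). Example matrices are made up.

def gf2_rank(rows):
    """Rank over GF(2); each row is a list of 0/1 entries."""
    rows = [list(r) for r in rows]
    rank, ncols = 0, len(rows[0]) if rows else 0
    for col in range(ncols):
        pivot = next((i for i in range(rank, len(rows)) if rows[i][col]), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i in range(len(rows)):
            if i != rank and rows[i][col]:
                rows[i] = [a ^ b for a, b in zip(rows[i], rows[rank])]
        rank += 1
    return rank

def subspace_distance(U, V):
    du, dv, dsum = gf2_rank(U), gf2_rank(V), gf2_rank(U + V)  # U + V: stacked rows
    return 2 * dsum - du - dv

U = [[1, 0, 0, 1], [0, 1, 0, 1]]
V = [[1, 0, 0, 1], [0, 0, 1, 1]]
print(subspace_distance(U, V))   # -> 2: the subspaces share a 1-D intersection
```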

  18. LEGO: A Modular Accelerator Design Code

    NASA Astrophysics Data System (ADS)

    Cai, Y.; Irwin, J.

    1997-05-01

    An object-oriented accelerator design code has been designed and implemented in a simple and modular fashion. It contains all major features of its predecessors TRACY and DESPOT. All physics of single-particle dynamics is implemented based on the Hamiltonian in the local frame of the component. Components can be moved arbitrarily in three dimensional space. Several symplectic integrators are used to approximate the integration of the local Hamiltonians. A differential algebra class is introduced to extract a Taylor map up to an arbitrary order. Analysis of optics is done in the same way for both the linear and non-linear cases. Currently the code is used to design and simulate the lattices of the PEP-II. It will be used for the commissioning of the machines as well.

  19. Temporal Coding of Volumetric Imagery

    NASA Astrophysics Data System (ADS)

    Llull, Patrick Ryan

    'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,λ) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively. The CACTI camera's ability to embed video volumes into images leads to exploration

  20. A coded modulation design for the INMARSAT geostationary GLONASS augmentation

    NASA Astrophysics Data System (ADS)

    Stein, B.; Tsang, W.

    A coded modulation scheme is proposed to carry out the Global Navigation Satellite System (GLONASS) geostationary augmentation, which includes both integrity and navigation functions, over the next generation International Maritime Satellite Organization (INMARSAT) satellites. A baseline coded modulation scheme for the GLONASS augmentation broadcast proposes a forward error correction code over a differential phase shift keying (DPSK) modulation. The use of a concatenated code over the same signaling is considered. The proposed coded modulation design is more powerful and robust, yet not overly more complex in system implementation than the baseline scheme. Performance results of concatenated codes over a DPSK signaling used in the design are presented. The sensitivity analysis methodology in selecting the coded modulation scheme is also discussed.

  1. Acceleration of a Monte Carlo radiation transport code

    SciTech Connect

    Hochstedler, R.D.; Smith, L.M.

    1996-03-01

    Execution time for the Integrated TIGER Series (ITS) Monte Carlo radiation transport code has been reduced by careful re-coding of computationally intensive subroutines. Three test cases for the TIGER (1-D slab geometry), CYLTRAN (2-D cylindrical geometry), and ACCEPT (3-D arbitrary geometry) codes were identified and used to benchmark and profile program execution. Based upon these results, sixteen top time-consuming subroutines were examined and nine of them modified to accelerate computations with equivalent numerical output to the original. The results obtained via this study indicate that speedup factors of 1.90 for the TIGER code, 1.67 for the CYLTRAN code, and 1.11 for the ACCEPT code are achievable. © 1996 American Institute of Physics.

  2. Frequency-coded quantum key distribution.

    PubMed

    Bloch, Matthieu; McLaughlin, Steven W; Merolla, Jean-Marc; Patois, Frédéric

    2007-02-01

    We report an intrinsically stable quantum key distribution scheme based on genuine frequency-coded quantum states. The qubits are efficiently processed without fiber interferometers by fully exploiting the nonlinear interaction occurring in electro-optic phase modulators. The system requires only integrated off-the-shelf devices and could be used with a true single-photon source. Preliminary experiments have been performed with weak laser pulses and have demonstrated the feasibility of this new setup.

  3. Computing Challenges in Coded Mask Imaging

    NASA Technical Reports Server (NTRS)

    Skinner, Gerald

    2009-01-01

    This slide presentation reviews the complications and challenges in developing computer systems for Coded Mask Imaging telescopes. The coded mask technique is used when there is no other way to create the telescope (i.e., when wide fields of view are needed, when energies are too high for focusing optics or too low for Compton/tracker techniques, and when very good angular resolution is required). The coded mask telescope is described, and the mask is reviewed. The coded masks for the INTErnational Gamma-Ray Astrophysics Laboratory (INTEGRAL) instruments are shown, and a chart showing the types of position sensitive detectors used for the coded mask telescopes is also reviewed. Slides describe the mechanism of recovering an image from the masked pattern. The correlation with the mask pattern is described. The matrix approach is reviewed, and other approaches to image reconstruction are described. Included in the presentation is a review of the Energetic X-ray Imaging Survey Telescope (EXIST) / High Energy Telescope (HET), with information about the mission, the operation of the telescope, comparison of the EXIST/HET with the SWIFT/BAT and details of the design of the EXIST/HET.
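
    The correlation step described in the presentation can be illustrated with a toy sketch: a random binary mask, a single point source, and FFT-based circular cross-correlation to recover the source position. The mask size and source position below are arbitrary, and the sketch ignores detector noise, mask self-support, and the partially coded field of view handled by real instruments such as INTEGRAL or EXIST/HET.

    ```python
    import numpy as np

    def correlate_decode(shadowgram, mask):
        """Circular cross-correlation of the detector shadowgram with the mask pattern."""
        D = np.fft.fft2(shadowgram)
        M = np.fft.fft2(mask)
        return np.real(np.fft.ifft2(D * np.conj(M)))

    rng = np.random.default_rng(0)
    mask = (rng.random((32, 32)) < 0.5).astype(float)   # random 50% open binary mask
    sky = np.zeros((32, 32)); sky[5, 7] = 1.0           # a single point source at (5, 7)
    # The shadowgram of a far-field point source is the mask shifted by the source position
    shadowgram = np.real(np.fft.ifft2(np.fft.fft2(sky) * np.fft.fft2(mask)))
    image = correlate_decode(shadowgram, mask)
    print(np.unravel_index(np.argmax(image), image.shape))  # recovers (5, 7)
    ```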

  4. The metaethics of nursing codes of ethics and conduct.

    PubMed

    Snelling, Paul C

    2016-10-01

    Nursing codes of ethics and conduct are features of professional practice across the world, and in the UK, the regulator has recently consulted on and published a new code. Initially part of a professionalising agenda, nursing codes have recently come to represent a managerialist and disciplinary agenda and nursing can no longer be regarded as a self-regulating profession. This paper argues that codes of ethics and codes of conduct are significantly different in form and function, similar to the difference between ethics and law in everyday life. Some codes successfully integrate these two functions within the same document, while others, principally the UK Code, conflate them, resulting in an ambiguous document unable to fulfil its functions effectively. The paper analyses the differences between ethical-codes and conduct-codes by discussing titles, authorship, level, scope for disagreement, consequences of transgression, language and finally and possibly most importantly agent-centeredness. It is argued that conduct-codes cannot require nurses to be compassionate because compassion involves an emotional response. The concept of kindness provides a plausible alternative for conduct-codes as it is possible to understand it solely in terms of acts. But if kindness is required in conduct-codes, investigation and possible censure follows from its absence. Using examples, it is argued that there are at least five possible accounts of the absence of kindness. As well as being potentially problematic for disciplinary panels, difficulty in understanding the features of blameworthy absence of kindness may challenge UK nurses who, following a recently introduced revalidation procedure, are required to reflect on their practice in relation to The Code. It is concluded that closer attention to metaethical concerns by code writers will better support the functions of their issuing organisations.

  5. FESDIF -- Finite Element Scalar Diffraction theory code

    SciTech Connect

    Kraus, H.G.

    1992-09-01

    This document describes the theory and use of a powerful scalar diffraction theory based computer code for calculation of intensity fields due to diffraction of optical waves by two-dimensional planar apertures and lenses. This code is called FESDIF (Finite Element Scalar Diffraction). It is based upon both Fraunhofer and Kirchhoff scalar diffraction theories. Simplified routines for circular apertures are included. However, the real power of the code comes from its basis in finite element methods. These methods allow the diffracting aperture to be virtually any geometric shape, including the various secondary aperture obstructions present in telescope systems. Aperture functions, with virtually any phase and amplitude variations, are allowed in the aperture openings. Step change aperture functions are accommodated. The incident waves are considered to be monochromatic. Plane waves, spherical waves, or Gaussian laser beams may be incident upon the apertures. Both area and line integral transformations were developed for the finite element based diffraction transformations. There is some loss of aperture function generality in the line integral transformations which are typically many times more computationally efficient than the area integral transformations when applicable to a particular problem.
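
    For orientation, the Fraunhofer far-field intensity of a simple circular aperture can be estimated with a plain FFT, as sketched below. This is not the FESDIF finite-element area or line integral formulation, just the textbook relation that the far-field amplitude is proportional to the Fourier transform of the aperture function; the grid size and aperture radius are assumed values.

    ```python
    import numpy as np

    N, radius = 512, 40                     # grid size and aperture radius in samples (assumed)
    y, x = np.mgrid[-N//2:N//2, -N//2:N//2]
    aperture = (x**2 + y**2 <= radius**2).astype(float)   # uniformly illuminated circular opening
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(aperture)))
    intensity = np.abs(field)**2            # Airy-like far-field diffraction pattern
    intensity /= intensity.max()
    print(intensity[N//2, N//2])            # central peak, normalized to 1.0
    ```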

  6. A comparison of cosmological hydrodynamic codes

    NASA Technical Reports Server (NTRS)

    Kang, Hyesung; Ostriker, Jeremiah P.; Cen, Renyue; Ryu, Dongsu; Hernquist, Lars; Evrard, August E.; Bryan, Greg L.; Norman, Michael L.

    1994-01-01

    We present a detailed comparison of the simulation results of various hydrodynamic codes. Starting with identical initial conditions based on the cold dark matter scenario for the growth of structure, with parameters h = 0.5, Omega = Omega_b = 1, and sigma_8 = 1, we integrate from redshift z = 20 to z = 0 to determine the physical state within a representative volume of size L^3, where L = 64 h^-1 Mpc. Five independent codes are compared: three of them Eulerian mesh-based and two variants of the smooth particle hydrodynamics 'SPH' Lagrangian approach. The Eulerian codes were run at N^3 = 32^3, 64^3, 128^3, and 256^3 cells, the SPH codes at N^3 = 32^3 and 64^3 particles. Results were then rebinned to a 16^3 grid, with the expectation that the rebinned data should converge, by all techniques, to a common and correct result as N approaches infinity. We find that global averages of various physical quantities do, as expected, tend to converge in the rebinned model, but that uncertainties in even primitive quantities such as (T) and (rho^2)^(1/2) persist at the 3%-17% level. The codes achieve comparable and satisfactory accuracy for comparable computer time in their treatment of the high-density, high-temperature regions as measured in the rebinned data; the variance among the five codes (at highest resolution) for the mean temperature (as weighted by rho^2) is only 4.5%. Examined at high resolution, we suspect that the density resolution is better in the SPH codes and the thermal accuracy in low-density regions better in the Eulerian codes. In the low-density, low-temperature regions the SPH codes have poor accuracy due to statistical effects, and the Jameson code gives temperatures which are too high, due to overuse of artificial viscosity in these high Mach number regions. Overall the comparison allows us to better estimate errors; it points to ways of improving this current generation of hydrodynamic

  7. Binary coding for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Chang, Chein-I.; Chang, Chein-Chi; Lin, Chinsu

    2004-10-01

    Binary coding is one of the simplest ways to characterize spectral features. One commonly used method is a binary coding-based image software system, called Spectral Analysis Manager (SPAM) for remotely sensed imagery, developed by Mazer et al. For a given spectral signature, SPAM calculates its spectral mean and inter-band spectral difference and uses them as thresholds to generate a binary code word for this particular spectral signature. Such a coding scheme is generally effective and also very simple to implement. This paper revisits SPAM and further develops three new SPAM-based binary coding methods, called equal probability partition (EPP) binary coding, halfway partition (HP) binary coding and median partition (MP) binary coding. These three binary coding methods along with SPAM will be evaluated for spectral discrimination and identification. In doing so, a new criterion, called a posteriori discrimination probability (APDP), is also introduced for performance measure.
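
    A minimal sketch of the mean-threshold part of such a binary coding scheme is shown below; SPAM additionally uses inter-band spectral differences, which are omitted here, and the two test signatures are invented for illustration.

    ```python
    import numpy as np

    def binary_code(spectrum):
        """Binary code word for one pixel spectrum, thresholded at its spectral mean."""
        return (spectrum >= spectrum.mean()).astype(np.uint8)

    def hamming(a, b):
        """Hamming distance between two binary code words."""
        return int(np.count_nonzero(a != b))

    # Two toy spectral signatures (reflectance in arbitrary units over 6 bands)
    s1 = np.array([0.10, 0.12, 0.35, 0.40, 0.38, 0.15])
    s2 = np.array([0.30, 0.32, 0.28, 0.12, 0.10, 0.33])
    c1, c2 = binary_code(s1), binary_code(s2)
    print(c1, c2, hamming(c1, c2))   # dissimilar signatures give a large Hamming distance
    ```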

  8. Sinusoidal transform coding

    NASA Technical Reports Server (NTRS)

    Mcaulay, Robert J.; Quatieri, Thomas F.

    1988-01-01

    It has been shown that an analysis/synthesis system based on a sinusoidal representation of speech leads to synthetic speech that is essentially perceptually indistinguishable from the original. Strategies for coding the amplitudes, frequencies and phases of the sine waves have been developed that have led to a multirate coder operating at rates from 2400 to 9600 bps. The encoded speech is highly intelligible at all rates with a uniformly improving quality as the data rate is increased. A real-time fixed-point implementation has been developed using two ADSP2100 DSP chips. The methods used for coding and quantizing the sine-wave parameters for operation at the various frame rates are described.
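
    A toy single-frame version of the analysis/synthesis idea is sketched below: pick the strongest FFT peaks as (amplitude, frequency, phase) triples and resynthesize the frame as a sum of cosines. The frame length, sampling rate, and test partials are assumed values chosen to fall exactly on FFT bins; the coder's actual parameter tracking and quantization are not shown.

    ```python
    import numpy as np

    def analyze_frame(frame, fs, n_peaks=2):
        """Estimate (amplitudes, frequencies, phases) of the strongest spectral peaks."""
        spec = np.fft.rfft(frame)
        mags = 2.0 * np.abs(spec) / len(frame)          # single-sided amplitude spectrum
        bins = np.argsort(mags)[-n_peaks:]
        return mags[bins], bins * fs / len(frame), np.angle(spec[bins])

    def synthesize_frame(amps, freqs, phases, n, fs):
        """Rebuild the frame as a sum of cosines from the estimated sine-wave parameters."""
        t = np.arange(n) / fs
        return sum(a * np.cos(2 * np.pi * f * t + p) for a, f, p in zip(amps, freqs, phases))

    fs, n = 8000, 256
    t = np.arange(n) / fs
    frame = 0.8 * np.cos(2 * np.pi * 437.5 * t) + 0.3 * np.cos(2 * np.pi * 1250.0 * t)
    amps, freqs, phases = analyze_frame(frame, fs)
    rebuilt = synthesize_frame(amps, freqs, phases, n, fs)
    print(np.max(np.abs(frame - rebuilt)))   # ~0 because the partials sit exactly on FFT bins
    ```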

  9. WHPA Code available

    NASA Astrophysics Data System (ADS)

    The Wellhead Protection Area code is now available for distribution by the International Ground Water Modeling Center in Indianapolis, Ind. The WHPA code is a modular, semianalytical, groundwater flow model developed for the U.S. Environmental Protection Agency, Office of Ground Water Protection, designed to assist state and local technical staff with the task of Wellhead Protection Area (WHPA) delineation. A complete news item appeared in Eos, May 1, 1990, p. 690. The model consists of four independent, semianalytical modules that may be used to identify the areal extent of groundwater contribution to one or multiple pumping wells. One module is a general particle tracking program that may be used as a post-processor for two-dimensional, numerical models of groundwater flow. One module incorporates a Monte Carlo approach to investigate the effects of uncertain input parameters on capture zones. Multiple pumping and injection wells may be present and barrier or stream boundary conditions may be investigated.

  10. WHPA Code available

    NASA Astrophysics Data System (ADS)

    The Wellhead Protection Area (WHPA) code is now available for distribution by the International Ground Water Modeling Center in Indianapolis, Ind. The WHPA code is a modular, semi-analytical, groundwater flow model developed for the U.S. Environmental Protection Agency, Office of Ground Water Protection. It is designed to assist state and local technical staff with the task of WHPA delineation. The model consists of four independent, semi-analytical modules that may be used to identify the areal extent of groundwater contribution to one or multiple pumping wells. One module is a general particle tracking program that may be used as a post-processor for two-dimensional, numerical models of groundwater flow. One module incorporates a Monte Carlo approach to investigate the effects of uncertain input parameters on capture zones. Multiple pumping and injection wells may be present and barrier or stream boundary conditions may be investigated.
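
    The flavor of such semianalytical capture-zone delineation can be sketched with reverse particle tracking in a uniform regional flow superposed on a single pumping well. All parameter values below are hypothetical, and the simple explicit Euler stepping is an illustration of the general idea, not the WHPA code's actual numerical scheme or input format.

    ```python
    import numpy as np

    # Hypothetical confined-aquifer parameters (illustrative only)
    Q = 500.0                       # well pumping rate [m^3/day]
    b = 10.0                        # aquifer thickness [m]
    ne = 0.25                       # effective porosity
    q0 = np.array([0.05, 0.0])      # uniform regional Darcy flux [m/day], flowing in +x
    well = np.array([0.0, 0.0])     # well location

    def seepage_velocity(p):
        """Uniform regional flow plus radial flow toward a single pumping well (2-D)."""
        r = p - well
        r2 = max(r @ r, 1e-6)
        return (q0 - Q / (2.0 * np.pi * b) * r / r2) / ne

    def track_backward(start, dt=1.0, days=3650):
        """Reverse particle tracking (explicit Euler) to trace where captured water came from."""
        point = np.array(start, float)
        for _ in range(int(days / dt)):
            point = point - dt * seepage_velocity(point)   # minus sign = backward in time
        return point

    # Particles released on a small circle around the well; their backward end points
    # outline a 10-year capture zone.
    for theta in np.linspace(0, 2 * np.pi, 8, endpoint=False):
        start = well + 0.5 * np.array([np.cos(theta), np.sin(theta)])
        print(track_backward(start))
    ```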

  11. Confocal coded aperture imaging

    DOEpatents

    Tobin, Jr., Kenneth William; Thomas, Jr., Clarence E.

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.

  12. FRAPCON-3: Integral assessment

    SciTech Connect

    Lanning, D.D.; Berna, G.A.

    1997-12-01

    An integral assessment has been performed for the U.S. Nuclear Regulatory Commission by Pacific Northwest National Laboratory to quantify the predictive capabilities of FRAPCON-3, a steady-state fuel behavior code designed to analyze fuel behavior from beginning-of-life to burnup levels of 65 GWd/MTU. FRAPCON-3 code calculations are shown to compare satisfactorily to a pre-selected set of experimental data with steady-state operating conditions. 30 refs., 27 figs., 18 tabs.

  13. HYCOM Code Development

    DTIC Science & Technology

    2003-02-10

    HYCOM code development. Alan J. Wallcraft, Naval Research Laboratory. Presented at the Layered Ocean Model Users' Workshop (LOM 2003), Miami, FL, February 10, 2003. Topics listed include a Kraus-Turner mixed layer, an Energy-Loan (passive) ice model, high-frequency atmospheric forcing, a new I/O scheme (.a and .b files), and scalability.

  14. Reeds computer code

    NASA Technical Reports Server (NTRS)

    Bjork, C.

    1981-01-01

    The REEDS (rocket exhaust effluent diffusion single layer) computer code is used for the estimation of certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near real time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.

  15. Trajectory Code Studies, 1987

    SciTech Connect

    Poukey, J.W.

    1988-01-01

    The trajectory code TRAJ has been used extensively to study nonimmersed foilless electron diodes. The basic goal of the research is to design low-emittance injectors for electron linacs and propagation experiments. Systems studied during 1987 include Delphi, Recirc, and Troll. We also discuss a partly successful attempt to extend the same techniques to high currents (tens of kA). 7 refs., 30 figs.

  16. The PHARO Code.

    DTIC Science & Technology

    1981-11-24

    Keywords: visible radiation, infrared radiation, sensors, line and band transitions, isophots, high-altitude nuclear data. The radiation (watts/sr) in arbitrary wavelength intervals is determined. The results are a series of "isophot" plots for arbitrarily placed cameras or sensors. The output of the PHARO code consists of contour plots of radiative intensity, or "isophot" plots, for arbitrarily placed sensors.

  17. The Phantom SPH code

    NASA Astrophysics Data System (ADS)

    Price, Daniel; Wurster, James; Nixon, Chris

    2016-05-01

    I will present the capabilities of the Phantom SPH code for global simulations of dust and gas in protoplanetary discs. I will present our new algorithms for simulating both small and large grains in discs, as well as our progress towards simulating evolving grain populations and coupling with radiation. Finally, I will discuss our recent applications to HL Tau and the physics of dust gap opening.

  18. Status of MARS Code

    SciTech Connect

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, histogramming, the MAD-MARS Beam Line Builder, and the graphical user interface.

  19. Orthopedics coding and funding.

    PubMed

    Baron, S; Duclos, C; Thoreux, P

    2014-02-01

    The French tarification à l'activité (T2A) prospective payment system is a financial system in which a health-care institution's resources are based on performed activity. Activity is described via the PMSI medical information system (programme de médicalisation du système d'information). The PMSI classifies hospital cases by clinical and economic categories known as diagnosis-related groups (DRG), each with an associated price tag. Coding a hospital case involves giving as realistic a description as possible so as to categorize it in the right DRG and thus ensure appropriate payment. For this, it is essential to understand what determines the pricing of inpatient stay: namely, the code for the surgical procedure, the patient's principal diagnosis (reason for admission), codes for comorbidities (everything that adds to management burden), and the management of the length of inpatient stay. The PMSI is used to analyze the institution's activity and dynamism: change on previous year, relation to target, and comparison with competing institutions based on indicators such as the mean length of stay performance indicator (MLS PI). The T2A system improves overall care efficiency. Quality of care, however, is not presently taken account of in the payment made to the institution, as there are no indicators for this; work needs to be done on this topic.

  20. Bar coded retroreflective target

    DOEpatents

    Vann, Charles S.

    2000-01-01

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and up to six degrees of position. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam is measured to calculate the three dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of target position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  1. The Integrated Hazard Analysis Integrator

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Massie, Michael J.

    2009-01-01

    Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is the tendency to sometimes overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. As viewed from both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to these fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations who sincerely desire mission success must put as much effort in selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and

  2. Suboptimum decoding of block codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao

    1991-01-01

    This paper investigates a class of decomposable codes and their distance and structural properties. It is shown that this class includes several classes of well known and efficient codes as subclasses. Several methods for constructing decomposable codes or decomposing codes are presented. A two-stage soft decision decoding scheme for decomposable codes, their translates or unions of translates is devised. This two-stage soft-decision decoding is suboptimum, and provides an excellent trade-off between the error performance and decoding complexity for codes of moderate and long block length.

  3. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

    This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. This report will consider the following codes: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code will be described separately in the following section with their current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes. However, the codes have been further developed to extend their capabilities.

  4. 21 CFR 11.300 - Controls for identification codes/passwords.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    21 CFR 11.300, Controls for identification codes/passwords (Title 21, Food and Drugs). Persons who use electronic signatures based upon use of identification codes in combination with passwords shall employ controls to ensure their security and integrity. Such controls...

  5. 21 CFR 11.300 - Controls for identification codes/passwords.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    21 CFR 11.300, Controls for identification codes/passwords (Title 21, Food and Drugs). Persons who use electronic signatures based upon use of identification codes in combination with passwords shall employ controls to ensure their security and integrity. Such controls...

  6. 21 CFR 11.300 - Controls for identification codes/passwords.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    21 CFR 11.300, Controls for identification codes/passwords (Title 21, Food and Drugs). Persons who use electronic signatures based upon use of identification codes in combination with passwords shall employ controls to ensure their security and integrity. Such controls...

  7. 21 CFR 11.300 - Controls for identification codes/passwords.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    21 CFR 11.300, Controls for identification codes/passwords (Title 21, Food and Drugs). Persons who use electronic signatures based upon use of identification codes in combination with passwords shall employ controls to ensure their security and integrity. Such controls...

  8. Construction of new quantum MDS codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Taneja, Divya; Gupta, Manish; Narula, Rajesh; Bhullar, Jaskaran

    Obtaining quantum maximum distance separable (MDS) codes from dual containing classical constacyclic codes using Hermitian construction have paved a path to undertake the challenges related to such constructions. Using the same technique, some new parameters of quantum MDS codes have been constructed here. One set of parameters obtained in this paper has achieved much larger distance than work done earlier. The remaining constructed parameters of quantum MDS codes have large minimum distance and were not explored yet.

  9. Convolutional coding techniques for data protection

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.

  10. Information Flow Integrity for Systems of Independently-Developed Components

    DTIC Science & Technology

    2015-06-22

    The work covers software (to enforce resource retrieval integrity and fine-grained control-flow integrity of approved code) and user-space programs (to enforce access ... and authorization constraints). Both of these methods enable an efficient deployment of code to enforce expected integrity requirements. Subject terms: confused deputy attacks, code reuse attacks, retrofitting legacy code, web browser, operating system kernels, attack surface, safety games, access...

  11. Combinatorial neural codes from a mathematical coding theory perspective.

    PubMed

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
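
    Minimum-Hamming-distance decoding, the notion of error correction examined in the paper, can be sketched as follows with an invented toy codebook standing in for a set of neural codewords; it is not a real receptive field code.

    ```python
    import numpy as np

    def nearest_codeword(received, codebook):
        """Decode to the codeword at minimum Hamming distance (ties broken arbitrarily)."""
        dists = (codebook != received).sum(axis=1)
        return codebook[dists.argmin()], int(dists.min())

    # Toy binary codebook (illustrative only)
    codebook = np.array([[0, 0, 0, 0, 0],
                         [1, 1, 1, 0, 0],
                         [0, 0, 1, 1, 1],
                         [1, 1, 0, 1, 1]], dtype=np.uint8)
    sent = codebook[2]
    noisy = sent.copy(); noisy[0] ^= 1        # flip one bit to simulate noise
    decoded, d = nearest_codeword(noisy, codebook)
    print(decoded, d)                         # recovers codeword 2 at distance 1
    ```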

  12. New quantum MDS-convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Li, Fengwei; Yue, Qin

    2015-12-01

    In this paper, we utilize a family of Hermitian dual-containing constacyclic codes to construct classical and quantum MDS convolutional codes. Our classical and quantum convolutional codes are optimal in the sense that they attain the classical (quantum) generalized Singleton bound.

  13. A class of constacyclic BCH codes and new quantum codes

    NASA Astrophysics Data System (ADS)

    liu, Yang; Li, Ruihu; Lv, Liangdong; Ma, Yuena

    2017-03-01

    Constacyclic BCH codes have been widely studied in the literature and have been used to construct quantum codes in recent years. However, for the class of quantum codes of length n=q^{2m}+1 over F_{q^2} with q an odd prime power, only those of distance δ ≤ 2q^2 have been obtained in the literature. In this paper, by a detailed analysis of the properties of q^2-ary cyclotomic cosets, the maximum designed distance δ_{max} of a class of Hermitian dual-containing constacyclic BCH codes with length n=q^{2m}+1 is determined; this class of constacyclic codes has characteristics analogous to those of primitive BCH codes over F_{q^2}. Then we can obtain a sequence of dual-containing constacyclic codes of designed distances 2q^2 < δ ≤ δ_{max}. Consequently, new quantum codes with distance d > 2q^2 can be constructed from these dual-containing codes via the Hermitian construction. These newly obtained quantum codes have better code rates compared with those constructed from primitive BCH codes.

  14. A MCTF video coding scheme based on distributed source coding principles

    NASA Astrophysics Data System (ADS)

    Tagliasacchi, Marco; Tubaro, Stefano

    2005-07-01

    Motion Compensated Temporal Filtering (MCTF) has proved to be an efficient coding tool in the design of open-loop scalable video codecs. In this paper we propose a MCTF video coding scheme based on lifting where the prediction step is implemented using PRISM (Power efficient, Robust, hIgh compression Syndrome-based Multimedia coding), a video coding framework built on distributed source coding principles. We study the effect of integrating the update step at the encoder or at the decoder side. We show that the latter approach makes it possible to improve the quality of the side information exploited during decoding. We present the analytical results obtained by modeling the video signal along the motion trajectories as a first order auto-regressive process. We show that the update step at the decoder allows one to halve the contribution of the quantization noise. We also include experimental results with real video data that demonstrate the potential of this approach when the video sequences are coded at low bitrates.
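
    The lifting structure underlying MCTF can be sketched with a Haar-like predict/update pair on two frames, as below; motion compensation and the PRISM syndrome coding used in the paper are omitted, and the frames are random test data.

    ```python
    import numpy as np

    def mctf_analysis(even, odd):
        """One lifting level of temporal filtering (Haar-like, no motion compensation)."""
        high = odd - even                 # prediction step: predict the odd frame from the even one
        low = even + 0.5 * high           # update step: smooth the even frame with the residual
        return low, high

    def mctf_synthesis(low, high):
        """Inverse lifting: undo the update step, then the prediction step."""
        even = low - 0.5 * high
        odd = even + high
        return even, odd

    f0 = np.random.rand(4, 4)             # two consecutive toy "frames"
    f1 = f0 + 0.01                        # small temporal change
    low, high = mctf_analysis(f0, f1)
    r0, r1 = mctf_synthesis(low, high)
    print(np.allclose(f0, r0), np.allclose(f1, r1))   # True True: lifting is perfectly invertible
    ```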

  15. Code CUGEL: A code to unfold Ge(Li) spectrometer polyenergetic gamma photon experimental distributions

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Born, U.

    1970-01-01

    A FORTRAN code was developed for the Univac 1108 digital computer to unfold lithium-drifted germanium semiconductor spectrometers, polyenergetic gamma photon experimental distributions. It was designed to analyze the combination continuous and monoenergetic gamma radiation field of radioisotope volumetric sources. The code generates the detector system response matrix function and applies it to monoenergetic spectral components discretely and to the continuum iteratively. It corrects for system drift, source decay, background, and detection efficiency. Results are presented in digital form for differential and integrated photon number and energy distributions, and for exposure dose.

  16. Summary of 1990 Code Conference

    SciTech Connect

    Cooper, R.K.; Chan, Kwok-Chi D.

    1990-01-01

    The Conference on Codes and the Linear Accelerator Community was held in Los Alamos in January 1990, and had approximately 100 participants. This conference was the second in a series which has as its goal the exchange of information about codes and code practices among those writing and actually using these codes for the design and analysis of linear accelerators and their components. The first conference was held in San Diego in January 1988, and concentrated on beam dynamics codes and Maxwell solvers. This most recent conference concentrated on 3-D codes and techniques to handle the large amounts of data required for three-dimensional problems. In addition to descriptions of codes, their algorithms and implementations, there were a number of papers describing the use of many of the codes. Proceedings of both these conferences are available. 3 refs., 2 tabs.

  17. Chemical Laser Computer Code Survey,

    DTIC Science & Technology

    1980-12-01

    Documentation requirements listed include a Resonator Geometry Synthesis Code requirement (V. L. Gamiz), incorporation of a general resonator into the Ray Trace Code (W. H. Southwell), synthesis code development (L. R. Stidhm), and trace optimization algorithms and equations. The survey categories span optics, kinetics, and gasdynamics, at levels ranging from a simple Fabry-Perot resonator to simple saturated-gain models.

  18. Generating Customized Verifiers for Automatically Generated Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2008-01-01

    Program verification using Hoare-style techniques requires many logical annotations. We have previously developed a generic annotation inference algorithm that weaves in all annotations required to certify safety properties for automatically generated code. It uses patterns to capture generator- and property-specific code idioms and property-specific meta-program fragments to construct the annotations. The algorithm is customized by specifying the code patterns and integrating them with the meta-program fragments for annotation construction. However, this is difficult since it involves tedious and error-prone low-level term manipulations. Here, we describe an annotation schema compiler that largely automates this customization task using generative techniques. It takes a collection of high-level declarative annotation schemas tailored towards a specific code generator and safety property, and generates all customized analysis functions and glue code required for interfacing with the generic algorithm core, thus effectively creating a customized annotation inference algorithm. The compiler raises the level of abstraction and simplifies schema development and maintenance. It also takes care of some more routine aspects of formulating patterns and schemas, in particular handling of irrelevant program fragments and irrelevant variance in the program structure, which reduces the size, complexity, and number of different patterns and annotation schemas that are required. The improvements described here make it easier and faster to customize the system to a new safety property or a new generator, and we demonstrate this by customizing it to certify frame safety of space flight navigation code that was automatically generated from Simulink models by MathWorks' Real-Time Workshop.

  19. Energy Codes and Standards: Facilities

    SciTech Connect

    Bartlett, Rosemarie; Halverson, Mark A.; Shankle, Diana L.

    2007-01-01

    Energy codes and standards play a vital role in the marketplace by setting minimum requirements for energy-efficient design and construction. They outline uniform requirements for new buildings as well as additions and renovations. This article covers basic knowledge of codes and standards; development processes of each; adoption, implementation, and enforcement of energy codes and standards; and voluntary energy efficiency programs.

  20. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  1. IRIG Serial Time Code Formats

    DTIC Science & Technology

    2016-08-01

    IRIG Standard 200-16, IRIG Serial Time Code Formats (RCC 200-16, August 2016), issued by the Telecommunications and Timing Group, Range Commanders Council. Distribution A: approved for public release. Organizations named in the front matter include the Arnold Engineering Development Complex and the National Aeronautics and Space Administration.

  2. Coding Major Fields of Study.

    ERIC Educational Resources Information Center

    Bobbitt, L. G.; Carroll, C. D.

    The National Center for Education Statistics conducts surveys which require the coding of the respondent's major field of study. This paper presents a new system for the coding of major field of study. It operates on-line in a Computer Assisted Telephone Interview (CATI) environment and allows conversational checks to verify coding directly from…

  3. Improved code-tracking loop

    NASA Technical Reports Server (NTRS)

    Laflame, D. T.

    1980-01-01

    Delay-locked loop tracks pseudonoise codes without introducing dc timing errors, because it is not sensitive to gain imbalance between signal processing arms. "Early" and "late" reference codes pass in combined form through both arms, and each arm acts on both codes. Circuit accommodates 1 dB weaker input signals with tracking ability equal to that of tau-dither loops.
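
    For context, a conventional noncoherent early-minus-late power discriminator for pseudonoise code tracking is sketched below with separate early and late correlations; the combined-arm arrangement that gives this particular loop its insensitivity to gain imbalance is not reproduced, and the code length and chip offsets are assumed values.

    ```python
    import numpy as np

    def early_late_discriminator(received, code, delta=1):
        """Noncoherent early-minus-late power discriminator for PN-code tracking."""
        early = np.roll(code, -delta)     # reference replica advanced by delta chips
        late = np.roll(code, delta)       # reference replica retarded by delta chips
        return np.dot(received, early)**2 - np.dot(received, late)**2

    rng = np.random.default_rng(1)
    code = rng.choice([-1.0, 1.0], size=1023)   # a +/-1 pseudonoise sequence
    received = np.roll(code, 1)                 # incoming code arrives one chip late
    # Negative output: the late replica matches better, so the loop should retard its code phase
    print(early_late_discriminator(received, code) < 0)   # True
    ```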

  4. Validation of the BEPLATE code

    SciTech Connect

    Giles, G.E.; Bullock, J.S.

    1997-11-01

    The electroforming simulation code BEPLATE (Boundary Element-PLATE) has been developed and validated for specific applications at Oak Ridge. New areas of application are opening up and more validations are being performed. This paper reports the validation experience of the BEPLATE code on two types of electroforms and describes some recent applications of the code.

  5. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  6. Ptolemy Coding Style

    DTIC Science & Technology

    2014-09-05

    Ptolemy Coding Style (2014). Among the items described are a Lisp module for GNU Emacs with appropriate indenting rules, which works well with Emacs under both Unix and Windows; the testsuite/ptspell tool for Unix; and a license that is much more liberal than the commonly used "GPL" or "GNU Public License," which encumbers the software and derivative works.

  7. Structured error recovery for code-word-stabilized quantum codes

    SciTech Connect

    Li Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-15

    Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3^t times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.

  8. Structured error recovery for code-word-stabilized quantum codes

    NASA Astrophysics Data System (ADS)

    Li, Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-01

    Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3^t times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.

  9. Low Density Parity Check Codes: Bandwidth Efficient Channel Coding

    NASA Technical Reports Server (NTRS)

    Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu

    2003-01-01

    Low Density Parity Check (LDPC) codes provide near-Shannon-capacity performance for NASA missions. These codes have high coding rates R = 0.82 and 0.875 with moderate code lengths, n = 4096 and 8176. Their decoders have inherently parallel structures which allow for high-speed implementation. Two codes based on Euclidean Geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure. This results in power and size benefits. These codes also have a large minimum distance, as large as d_min = 65, giving them powerful error-correcting capabilities and low error floors. This paper will present development of the LDPC flight encoder and decoder, its applications and status.
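
    The defining parity-check relation of any LDPC code can be illustrated with a toy matrix, as below; this small H is invented for illustration and is unrelated to the Euclidean-geometry codes selected for the flight ASIC.

    ```python
    import numpy as np

    # Toy sparse parity-check matrix: a word r is a valid codeword exactly when H r^T = 0 (mod 2)
    H = np.array([[1, 1, 0, 1, 0, 0],
                  [0, 1, 1, 0, 1, 0],
                  [1, 0, 1, 0, 0, 1]], dtype=np.uint8)

    def syndrome(H, r):
        """Syndrome of a received word: zero means every parity check is satisfied."""
        return H.dot(r) % 2

    codeword = np.array([1, 0, 1, 1, 1, 0], dtype=np.uint8)
    print(syndrome(H, codeword))                 # [0 0 0]: all checks satisfied
    corrupted = codeword.copy(); corrupted[2] ^= 1
    print(syndrome(H, corrupted))                # nonzero checks flag the error in bit 2
    ```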

  10. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-10-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  11. Quantum Codes From Cyclic Codes Over The Ring R2

    NASA Astrophysics Data System (ADS)

    Altinel, Alev; Güzeltepe, Murat

    2016-10-01

    Let R2 denote the ring F2 + μF2 + υF2 + μυF2 + wF2 + μwF2 + υwF2 + μυwF2. In this study, we construct quantum codes from cyclic codes over the ring R2, for arbitrary length n, with the restrictions μ^2 = 0, υ^2 = 0, w^2 = 0, μυ = υμ, μw = wμ, υw = wυ and μ(υw) = (μυ)w. Also, we give a necessary and sufficient condition for cyclic codes over R2 to contain their duals. As a final point, we obtain the parameters of quantum error-correcting codes from cyclic codes over R2 and we give an example of quantum error-correcting codes from cyclic codes over R2.

  12. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  13. Genetic code for sine

    NASA Astrophysics Data System (ADS)

    Abdullah, Alyasa Gan; Wah, Yap Bee

    2015-02-01

    The computation of the approximate values of the trigonometric sines was discovered by Bhaskara I (c. 600-c. 680), a seventh-century Indian mathematician, and is known as Bhaskara I's sine approximation formula. The formula is given in his treatise titled Mahabhaskariya. In the 14th century, Madhava of Sangamagrama, a Kerala mathematician-astronomer, constructed a table of trigonometric sines of various angles. Madhava's table gives the measure of angles in arcminutes, arcseconds and sixtieths of an arcsecond. The search for more accurate formulas led to the discovery of the power series expansion by Madhava of Sangamagrama (c. 1350-c. 1425), the founder of the Kerala school of astronomy and mathematics. In 1715, the Taylor series was introduced by Brook Taylor, an English mathematician. If the Taylor series is centered at zero, it is called a Maclaurin series, named after the Scottish mathematician Colin Maclaurin. Some of the important Maclaurin series expansions include trigonometric functions. This paper introduces the genetic code of the sine of an angle without using power series expansion. The genetic code using a square root approach reveals the pattern in the signs (plus, minus) and sequence of numbers in the sine of an angle. The square root approach complements the Pythagoras method, provides a better understanding of calculating an angle and will be useful for teaching the concepts of angles in trigonometry.
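
    Bhaskara I's approximation cited in the abstract is easy to state and check numerically, as in the sketch below; the paper's own square-root "genetic code" construction is not reproduced here.

    ```python
    import math

    def bhaskara_sine(x):
        """Bhaskara I's 7th-century rational approximation to sin(x), valid for 0 <= x <= pi."""
        return 16 * x * (math.pi - x) / (5 * math.pi**2 - 4 * x * (math.pi - x))

    # Compare against the library sine at a few angles (exact at 0, 30, 90, 150, 180 degrees)
    for deg in (15, 30, 45, 60, 90):
        x = math.radians(deg)
        print(deg, round(bhaskara_sine(x), 4), round(math.sin(x), 4))
    ```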

  14. FAST GYROSYNCHROTRON CODES

    SciTech Connect

    Fleishman, Gregory D.; Kuznetsov, Alexey A.

    2010-10-01

    Radiation produced by charged particles gyrating in a magnetic field is highly significant in the astrophysics context. Persistently increasing resolution of astrophysical observations calls for corresponding three-dimensional modeling of the radiation. However, available exact equations are prohibitively slow in computing a comprehensive table of high-resolution models required for many practical applications. To remedy this situation, we develop approximate gyrosynchrotron (GS) codes capable of quickly calculating the GS emission (in non-quantum regime) from both isotropic and anisotropic electron distributions in non-relativistic, mildly relativistic, and ultrarelativistic energy domains applicable throughout a broad range of source parameters including dense or tenuous plasmas and weak or strong magnetic fields. The computation time is reduced by several orders of magnitude compared with the exact GS algorithm. The new algorithm performance can gradually be adjusted to the user's needs depending on whether precision or computation speed is to be optimized for a given model. The codes are made available for users as a supplement to this paper.

  15. Determinate-state convolutional codes

    NASA Technical Reports Server (NTRS)

    Collins, O.; Hizlan, M.

    1991-01-01

    A determinate state convolutional code is formed from a conventional convolutional code by pruning away some of the possible state transitions in the decoding trellis. The type of staged power transfer used in determinate state convolutional codes proves to be an extremely efficient way of enhancing the performance of a concatenated coding system. The decoder complexity is analyzed along with the free distances of these new codes, and extensive simulation results are provided on their performance at the low signal-to-noise ratios where a real communication system would operate. Concise, practical examples are provided.

  16. Integrated Analysis of Long Non-coding RNAs (LncRNAs) and mRNA Expression Profiles Reveals the Potential Role of LncRNAs in Skeletal Muscle Development of the Chicken

    PubMed Central

    Li, Zhenhui; Ouyang, Hongjia; Zheng, Ming; Cai, Bolin; Han, Peigong; Abdalla, Bahareldin A.; Nie, Qinghua; Zhang, Xiquan

    2017-01-01

    Long non-coding RNAs (lncRNAs) play important roles in transcriptional and post-transcriptional regulation. However, little is currently known about the mechanisms by which they regulate skeletal muscle development in the chicken. In this study, we used RNA sequencing to profile the leg muscle transcriptome (lncRNA and mRNA) at three stages of skeletal muscle development in the chicken: embryonic day 11 (E11), embryonic day 16 (E16), and 1 day after hatching (D1). In total, 129, 132, and 45 differentially expressed lncRNAs, and 1798, 3072, and 1211 differentially expressed mRNAs were identified in comparisons of E11 vs. E16, E11 vs. D1, and E16 vs. D1, respectively. Moreover, we identified the cis- and trans-regulatory target genes of differentially expressed lncRNAs, and constructed lncRNA-gene interaction networks. In total, 126 and 200 cis-targets, and two and three trans-targets were involved in lncRNA-gene interaction networks that were constructed based on the E11 vs. E16, and E11 vs. D1 comparisons, respectively. The comparison of the E16 vs. D1 lncRNA-gene network comprised 25 cis-targets. We determined that lncRNA target genes are potentially involved in cellular development, and cellular growth and proliferation using Ingenuity Pathway Analysis. The gene networks identified for the E11 vs. D1 comparison were involved in embryonic development, organismal development and tissue development. The present study provides an RNA sequencing based evaluation of lncRNA function during skeletal muscle development in the chicken. Comprehensive analysis facilitated the identification of lncRNAs and target genes that might contribute to the regulation of different stages of skeletal muscle development. PMID:28119630

  17. Circular codes, symmetries and transformations.

    PubMed

    Fimmel, Elena; Giannerini, Simone; Gonzalez, Diego Luis; Strüngmann, Lutz

    2015-06-01

    Circular codes, putative remnants of primeval comma-free codes, have gained considerable attention in the last years. In fact they represent a second kind of genetic code potentially involved in detecting and maintaining the normal reading frame in protein coding sequences. The discovery of a universal code across species suggested many theoretical and experimental questions. However, there is a key aspect that relates circular codes to symmetries and transformations that remains to a large extent unexplored. In this article we aim at addressing the issue by studying the symmetries and transformations that connect different circular codes. The main result is that the class of 216 C3 maximal self-complementary codes can be partitioned into 27 equivalence classes defined by a particular set of transformations. We show that such transformations can be put in a group theoretic framework with an intuitive geometric interpretation. More general mathematical results about symmetry transformations which are valid for any kind of circular codes are also presented. Our results pave the way to the study of the biological consequences of the mathematical structure behind circular codes and contribute to shed light on the evolutionary steps that led to the observed symmetries of present codes.

  18. Making your code citable with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2016-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feeder for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.

  19. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now has over 340 codes in it and continues to grow. In 2011, the ASCL added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  20. The Application of the PEBBED Code Suite to the PBMR-400 Coupled Code Benchmark - FY 2006 Annual Report

    SciTech Connect

    Not Available

    2006-09-01

    This document describes the recent developments of the PEBBED code suite and its application to the PBMR-400 Coupled Code Benchmark. This report addresses an FY2006 Level 2 milestone under the NGNP Design and Evaluation Methods Work Package. The milestone states "Complete a report describing the results of the application of the integrated PEBBED code package to the PBMR-400 coupled code benchmark". The report describes the current state of the PEBBED code suite, provides an overview of the Benchmark problems to which it was applied, discusses the code developments achieved in the past year, and states some of the results attained. Results of the steady state problems generated by the PEBBED fuel management code compare favorably to the preliminary results generated by codes from other participating institutions and to similar non-Benchmark analyses. Partial transient analysis capability has been achieved through the acquisition of the NEM-THERMIX code from Penn State University. Phase I of the task has been achieved through the development of a self-consistent set of tools for generating cross sections for design and transient analysis and in the successful execution of the steady state benchmark exercises.

  1. Color-Coded Organelles.

    ERIC Educational Resources Information Center

    McLaughlin, Esther; And Others

    1994-01-01

    Describes how red beets can be used to demonstrate a variety of membrane phenomena. Some of the activities include observation of vacuoles; vacuoles in intact cells; isolation of vacuoles in physiological studies; demonstration of membrane integrity; and demonstration of ion diffusion and active transport with purified vacuoles. (ZWH)

  2. A Simple Tight Bound on Error Probability of Block Codes with Application to Turbo Codes

    NASA Astrophysics Data System (ADS)

    Divsalar, D.

    1999-07-01

    A simple bound on the probability of decoding error for block codes is derived in closed form. This bound is based on the bounding techniques developed by Gallager. We obtained an upper bound both on the word-error probability and the bit-error probability of block codes. The bound is simple, since it does not require any integration or optimization in its final version. The bound is tight since it works for signal-to-noise ratios (SNRs) very close to the Shannon capacity limit. The bound uses only the weight distribution of the code. The bound for nonrandom codes is tighter than the original Gallager bound and its new versions derived by Sason and Shamai and by Viterbi and Viterbi. It also is tighter than the recent simpler bound by Viterbi and Viterbi and simpler than the bound by Duman and Salehi, which requires two-parameter optimization. For long blocks, it competes well with more complex bounds that involve integration and parameter optimization, such as the tangential sphere bound by Poltyrev, elaborated by Sason and Shamai, and investigated by Viterbi and Viterbi, and the geometry bound by Dolinar, Ekroot, and Pollara. We also obtained a closed-form expression for the minimum SNR threshold that can serve as a tight upper bound on maximum-likelihood capacity of nonrandom codes. We also have shown that this minimum SNR threshold of our bound is the same as for the tangential sphere bound of Poltyrev. We applied this simple bound to turbo-like codes.
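
    The closed form of the bound is not reproduced in the abstract; as a simpler point of reference, the sketch below evaluates the ordinary union bound on word-error probability from a code's weight distribution for BPSK over AWGN, using the same ingredient (the weight distribution only) that the paper's tighter bound relies on.

      # Classical union bound on word-error probability from a weight
      # distribution {d: A_d}, BPSK over AWGN.  This is NOT the bound derived
      # in the paper, only a simpler reference computed from the same input.
      import math

      def q_func(x):
          return 0.5 * math.erfc(x / math.sqrt(2.0))

      def union_bound_wer(weight_dist, rate, ebno_db):
          """weight_dist maps Hamming weight d -> multiplicity A_d."""
          ebno = 10.0 ** (ebno_db / 10.0)
          return sum(a_d * q_func(math.sqrt(2.0 * d * rate * ebno))
                     for d, a_d in weight_dist.items())

      # (7,4) Hamming code: A_3 = A_4 = 7, A_7 = 1.
      print(union_bound_wer({3: 7, 4: 7, 7: 1}, rate=4/7, ebno_db=6.0))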

  3. Telescope Adaptive Optics Code

    SciTech Connect

    Phillion, D.

    2005-07-28

    The Telescope AO Code has general adaptive optics capabilities plus specialized models for three telescopes with either adaptive optics or active optics systems. It has the capability to generate either single-layer or distributed Kolmogorov turbulence phase screens using the FFT. Missing low order spatial frequencies are added using the Karhunen-Loeve expansion. The phase structure curve is extremely close to the theoretical one. Secondly, it has the capability to simulate an adaptive optics control system. The default parameters are those of the Keck II adaptive optics system. Thirdly, it has a general wave optics capability to model the science camera halo due to scintillation from atmospheric turbulence and the telescope optics. Although this capability was implemented for the Gemini telescopes, the only default parameter specific to the Gemini telescopes is the primary mirror diameter. Finally, it has a model for the LSST active optics alignment strategy. This last model is highly specific to the LSST.

  4. Code lock with microcircuit

    NASA Astrophysics Data System (ADS)

    Korobka, A.; May, I.

    1985-01-01

    A code lock with a microcircuit was invented that contains only a few components. Two DD-triggers control the state of two identical transistors. When both transistors are turned on simultaneously the transistor VS1 is turned on so that the electromagnet YA1 pulls in the bolt and the door opens. This will happen only when a logic 1 appears at the inverted output of the first trigger and at the straight output of the second one. After the door is opened, a button on it resets the contactors to return both triggers to their original state. The electromagnet is designed to produce the necessary pull force and sufficient power when under rectified 127 V line voltage, with the neutral wire of the lock circuit always connected to the - terminal of the power supply.

  5. Peripheral coding of taste

    PubMed Central

    Liman, Emily R.; Zhang, Yali V.; Montell, Craig

    2014-01-01

    Five canonical tastes, bitter, sweet, umami (amino acid), salty and sour (acid) are detected by animals as diverse as fruit flies and humans, consistent with a near universal drive to consume fundamental nutrients and to avoid toxins or other harmful compounds. Surprisingly, despite this strong conservation of basic taste qualities between vertebrates and invertebrates, the receptors and signaling mechanisms that mediate taste in each are highly divergent. The identification over the last two decades of receptors and other molecules that mediate taste has led to stunning advances in our understanding of the basic mechanisms of transduction and coding of information by the gustatory systems of vertebrates and invertebrates. In this review, we discuss recent advances in taste research, mainly from the fly and mammalian systems, and we highlight principles that are common across species, despite stark differences in receptor types. PMID:24607224

  6. Granada oscillation code (GraCo)

    NASA Astrophysics Data System (ADS)

    Moya, A.; Garrido, R.

    2008-08-01

    Granada oscillation code (GraCo) is software constructed to compute adiabatic and non-adiabatic oscillation eigenfunctions and eigenvalues. The adiabatic version gives the standard numerical resolution, and also the Richardson extrapolation, different sets of eigenfunctions, different outer mechanical boundary conditions or different integration variables. The non-adiabatic version can include the atmosphere-pulsation interaction. The code has been used for intensive studies of δ Scuti, γ Doradus, β Cep, SdO, and SdB stars. The non-adiabatic observables “phase-lag” (the phase between the effective temperature variations and the radial displacement) and δT_eff/T_eff (relative surface temperature variation) can help with mode identification. These quantities together with the energy balance (“growth rate”) provide useful additional information to the adiabatic resolution (eigenfrequencies and eigenfunctions).

  7. Surface acoustic wave coding for orthogonal frequency coded devices

    NASA Technical Reports Server (NTRS)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

    Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system. Each device produces plural stepped frequencies as an OFC signal with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices each producing a different OFC signal having the same number of chips and including a chip offset time delay, an algorithm for assigning OFCs to each device, and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.

  8. Improved lossless intra coding for next generation video coding

    NASA Astrophysics Data System (ADS)

    Vanam, Rahul; He, Yuwen; Ye, Yan

    2016-09-01

    Recently, there have been efforts by the ITU-T VCEG and ISO/IEC MPEG to further improve the compression performance of the High Efficiency Video Coding (HEVC) standard for developing a potential next generation video coding standard. The exploratory codec software of this potential standard includes new coding tools for inter and intra coding. In this paper, we present a new intra prediction mode for lossless intra coding. Our new intra mode derives a prediction filter for each input pixel using its neighboring reconstructed pixels, and applies this filter to the nearest neighboring reconstructed pixels to generate a prediction pixel. The proposed intra mode is demonstrated to improve the performance of the exploratory software for lossless intra coding, yielding a maximum and average bitrate savings of 4.4% and 2.11%, respectively.
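
    The abstract does not spell out the filter derivation; the sketch below illustrates the general idea under simplifying assumptions: for each pixel, a linear predictor is fitted by least squares over a small causal training window of reconstructed pixels and then applied to the pixel's nearest neighbors, with the integer residual left for lossless entropy coding. The window size and neighbor set are arbitrary choices, not those of the proposed mode.

      # Illustrative per-pixel least-squares intra predictor (simplified,
      # hypothetical parameters; not the exploratory-software implementation).
      import numpy as np

      def ls_intra_residuals(img, win=4):
          img = img.astype(np.float64)
          h, w = img.shape
          res = img.copy()                      # border pixels are coded as-is
          for i in range(win, h):
              for j in range(win, w - win):
                  feats, targets = [], []       # causal training window
                  for di in range(1, win):
                      for dj in range(-win + 1, win):
                          y, x = i - di, j + dj
                          feats.append([img[y, x - 1], img[y - 1, x], img[y - 1, x - 1]])
                          targets.append(img[y, x])
                  coef, *_ = np.linalg.lstsq(np.array(feats), np.array(targets), rcond=None)
                  pred = coef @ np.array([img[i, j - 1], img[i - 1, j], img[i - 1, j - 1]])
                  res[i, j] = img[i, j] - np.rint(pred)   # integer residual -> lossless
          return res

      rng = np.random.default_rng(0)
      print(ls_intra_residuals(rng.integers(0, 256, (16, 16))).std())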

  9. Transionospheric Propagation Code (TIPC)

    SciTech Connect

    Roussel-Dupre, R.; Kelley, T.A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Lab to perform certain tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in Fortran 77, runs interactively and was designed to be as machine independent as possible. A menu format in which the user is prompted to supply appropriate parameters for a given task has been adopted for the input while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, DTOA study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise or spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of delta-times-of-arrival (DTOAs) vs TECs for a specified pair of receivers.
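
    As a rough first-order illustration of the TEC characterization mentioned above (an assumed textbook model, not the TIPC implementation), the sketch below applies the standard ionospheric phase term, proportional to TEC/f, to the spectrum of a VHF pulse; the resulting group delay scales as TEC/f^2.

      # Assumed first-order model: multiply the pulse spectrum by
      # exp(+i * 2*pi * 40.3 * TEC / (c * f)) (carrier phase advance), which
      # produces the familiar 40.3*TEC/(c*f^2) group delay.
      import numpy as np

      C = 2.998e8                                  # speed of light, m/s

      def disperse(signal, fs, tec):
          """tec in electrons/m^2 (1 TEC unit = 1e16)."""
          spec = np.fft.rfft(signal)
          freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
          phase = np.zeros_like(freqs)
          phase[1:] = 2.0 * np.pi * 40.3 * tec / (C * freqs[1:])   # skip f = 0
          return np.fft.irfft(spec * np.exp(1j * phase), n=signal.size)

      fs = 200e6                                   # 200 MHz sampling
      t = np.arange(4096) / fs
      pulse = np.exp(-((t - 5e-6) / 0.5e-6) ** 2) * np.cos(2 * np.pi * 60e6 * t)
      out = disperse(pulse, fs, tec=1e17)          # ~10 TEC units
      print(np.argmax(np.abs(out)) / fs)           # arrival delayed by a few microseconds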

  10. An information theoretic approach to use high-fidelity codes to calibrate low-fidelity codes

    NASA Astrophysics Data System (ADS)

    Lewis, Allison; Smith, Ralph; Williams, Brian; Figueroa, Victor

    2016-11-01

    For many simulation models, it can be prohibitively expensive or physically infeasible to obtain a complete set of experimental data to calibrate model parameters. In such cases, one can alternatively employ validated higher-fidelity codes to generate simulated data, which can be used to calibrate the lower-fidelity code. In this paper, we employ an information-theoretic framework to determine the reduction in parameter uncertainty that is obtained by evaluating the high-fidelity code at a specific set of design conditions. These conditions are chosen sequentially, based on the amount of information that they contribute to the low-fidelity model parameters. The goal is to employ Bayesian experimental design techniques to minimize the number of high-fidelity code evaluations required to accurately calibrate the low-fidelity model. We illustrate the performance of this framework using heat and diffusion examples, a 1-D kinetic neutron diffusion equation, and a particle transport model, and include initial results from the integration of the high-fidelity thermal-hydraulics code Hydra-TH with a low-fidelity exponential model for the friction correlation factor.
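
    As a toy sketch of the sequential selection described above (assuming a linear-Gaussian low-fidelity model so that the expected information gain has a closed form; the paper's models and codes are not represented), one can greedily pick the design condition that most reduces parameter uncertainty:

      # Toy sequential Bayesian design: pick high-fidelity evaluation points one
      # at a time by maximizing the expected information gain about the
      # parameters of a linear low-fidelity model y = theta0 + theta1 * x.
      import numpy as np

      def features(x):
          return np.array([1.0, x])

      def select_designs(candidates, n_select, prior_cov, noise_var=0.01):
          cov = prior_cov.copy()
          chosen = []
          for _ in range(n_select):
              # Expected information gain of each candidate under the current posterior.
              gains = [0.5 * np.log1p(features(x) @ cov @ features(x) / noise_var)
                       for x in candidates]
              x_best = candidates[int(np.argmax(gains))]
              chosen.append(x_best)
              f = features(x_best)
              cov = cov - np.outer(cov @ f, f @ cov) / (noise_var + f @ cov @ f)
              candidates = [x for x in candidates if x != x_best]
          return chosen

      print(select_designs(list(np.linspace(0.0, 1.0, 21)), 3, prior_cov=np.eye(2)))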

  11. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    SciTech Connect

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E.; Tills, J.

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  12. Status report on the 'Merging' of the Electron-Cloud Code POSINST with the 3-D Accelerator PIC CODE WARP

    SciTech Connect

    Vay, J.-L.; Furman, M.A.; Azevedo, A.W.; Cohen, R.H.; Friedman, A.; Grote, D.P.; Stoltz, P.H.

    2004-04-19

    We have integrated the electron-cloud code POSINST [1] with WARP [2]--a 3-D parallel Particle-In-Cell accelerator code developed for Heavy Ion Inertial Fusion--so that the two can interoperate. Both codes are run in the same process, communicate through a Python interpreter (already used in WARP), and share certain key arrays (so far, particle positions and velocities). Currently, POSINST provides primary and secondary sources of electrons, beam bunch kicks, a particle mover, and diagnostics. WARP provides the field solvers and diagnostics. Secondary emission routines are provided by the Tech-X package CMEE.
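
    A schematic of this coupling pattern is sketched below with hypothetical stand-in classes in place of the actual POSINST and WARP interfaces: a single Python process drives both packages in turn while they operate on shared particle arrays.

      # Schematic only: ElectronCloud and FieldSolver are made-up stand-ins for
      # the coupled packages; the point is the pattern (one process, shared
      # position/velocity arrays, alternating calls through Python).
      import numpy as np

      class ElectronCloud:                        # electron source + particle mover
          def push(self, pos, vel, efield, dt):
              vel += efield * dt                  # kick from the shared field array
              pos += vel * dt                     # advance electrons in place

      class FieldSolver:                          # stand-in for the PIC field solve
          def solve(self, pos):
              return -0.1 * pos                   # toy restoring field

      pos = np.random.randn(1000, 3) * 1e-3       # shared arrays: positions, velocities
      vel = np.zeros_like(pos)
      cloud, solver = ElectronCloud(), FieldSolver()
      for _ in range(100):
          efield = solver.solve(pos)              # both packages see the same arrays
          cloud.push(pos, vel, efield, dt=1e-9)
      print(float(np.abs(pos).max()))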

  13. UUGM Code Development

    DTIC Science & Technology

    1993-07-26

    flow (very low speed and subsonic flows, transonic flow, supersonic and hypersonic flows), 2) advances in unstructured adaptive gridding techniques ... realistically simulated by CFD techniques. SAIC has been involved in all aspects of these developments and is at the forefront of CFD technology ... differential equations of gasdynamics. Presently, as a result of steady improvement in the various integration techniques, the advantages which could be gained

  14. Fault-Tolerant Coding for State Machines

    NASA Technical Reports Server (NTRS)

    Naegle, Stephanie Taft; Burke, Gary; Newell, Michael

    2008-01-01

    Two reliable fault-tolerant coding schemes have been proposed for state machines that are used in field-programmable gate arrays and application-specific integrated circuits to implement sequential logic functions. The schemes apply to strings of bits in state registers, which are typically implemented in practice as assemblies of flip-flop circuits. If a single-event upset (SEU, a radiation-induced change in the bit in one flip-flop) occurs in a state register, the state machine that contains the register could go into an erroneous state or could hang, by which is meant that the machine could remain in undefined states indefinitely. The proposed fault-tolerant coding schemes are intended to prevent the state machine from going into an erroneous or hang state when an SEU occurs. To ensure reliability of the state machine, the coding scheme for bits in the state register must satisfy the following criteria: 1. All possible states are defined. 2. An SEU brings the state machine to a known state. 3. There is no possibility of a hang state. 4. No false state is entered. 5. An SEU exerts no effect on the state machine. Fault-tolerant coding schemes that have been commonly used include binary encoding and "one-hot" encoding. Binary encoding is the simplest state machine encoding and satisfies criteria 1 through 3 if all possible states are defined. Binary encoding is a binary count of the state machine number in sequence; the table represents an eight-state example. In one-hot encoding, N bits are used to represent N states: All except one of the bits in a string are 0, and the position of the 1 in the string represents the state. With proper circuit design, one-hot encoding can satisfy criteria 1 through 4. Unfortunately, the requirement to use N bits to represent N states makes one-hot coding inefficient.
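
    The difference between the two common schemes can be checked directly; in the sketch below (an illustration, not the proposed fault-tolerant schemes themselves), every single-bit flip of a binary-encoded state lands in another defined state, whereas every single-bit flip of a one-hot word leaves the set of valid states entirely and can therefore be detected.

      # Illustrative SEU check for binary vs. one-hot encodings of 8 states.
      N_STATES = 8
      binary = [format(s, "03b") for s in range(N_STATES)]              # 000 ... 111
      one_hot = ["".join("1" if i == s else "0" for i in range(N_STATES))
                 for s in range(N_STATES)]

      def flips(word):
          """All words reachable from `word` by a single bit flip (one SEU)."""
          for i in range(len(word)):
              yield word[:i] + ("1" if word[i] == "0" else "0") + word[i + 1:]

      # Binary: an upset always yields another defined state (criteria 1-3).
      print(all(f in binary for w in binary for f in flips(w)))          # True
      # One-hot: an upset never yields a valid state, so it is detectable and
      # can be mapped to a known recovery state (criterion 4, with extra logic).
      print(all(f not in one_hot for w in one_hot for f in flips(w)))    # True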

  15. New developments in the Saphire computer codes

    SciTech Connect

    Russell, K.D.; Wood, S.T.; Kvarfordt, K.J.

    1996-03-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) refers to a suite of computer programs that were developed to create and analyze a probabilistic risk assessment (PRA) of a nuclear power plant. Many recent enhancements to this suite of codes have been made. This presentation will provide an overview of these features and capabilities. The presentation will include a discussion of the new GEM module. This module greatly reduces and simplifies the work necessary to use the SAPHIRE code in event assessment applications. An overview of the features provided in the new Windows version will also be provided. This version is a full Windows 32-bit implementation and offers many new and exciting features. [A separate computer demonstration was held to allow interested participants to get a preview of these features.] The new capabilities that have been added since version 5.0 will be covered. Some of these major new features include the ability to store an unlimited number of basic events, gates, systems, sequences, etc.; the addition of improved reporting capabilities to allow the user to generate and "scroll" through custom reports; the addition of multi-variable importance measures; and the simplification of the user interface. Although originally designed as a PRA Level 1 suite of codes, capabilities have recently been added to SAPHIRE to allow the user to apply the code in Level 2 analyses. These features will be discussed in detail during the presentation. The modifications and capabilities added to this version of SAPHIRE significantly extend the code in many important areas. Together, these extensions represent a major step forward in PC-based risk analysis tools. This presentation provides a current up-to-date status of these important PRA analysis tools.

  16. On multilevel block modulation codes

    NASA Technical Reports Server (NTRS)

    Kasami, Tadao; Takata, Toyoo; Fujiwara, Toru; Lin, Shu

    1991-01-01

    The multilevel (ML) technique for combining block coding and modulation is investigated. A general formulation is presented for ML modulation codes in terms of component codes with appropriate distance measures. A specific method for constructing ML block modulation codes (MLBMCs) with interdependency among component codes is proposed. Given an MLBMC C with no interdependency among the binary component codes, the proposed method gives an MLBMC C-prime that has the same rate as C, a minimum squared Euclidean distance not less than that of C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest-neighbor codewords than that of C. Finally, a technique is presented for analyzing the error performance of MLBMCs for an additive white Gaussian noise channel based on soft-decision maximum-likelihood decoding.

  17. Multiplexed temporal coding of electric communication signals in mormyrid fishes

    PubMed Central

    Baker, Christa A.; Kohashi, Tsunehiko; Lyons-Warren, Ariel M.; Ma, Xiaofeng; Carlson, Bruce A.

    2013-01-01

    The coding of stimulus information into patterns of spike times occurs widely in sensory systems. Determining how temporally coded information is decoded by central neurons is essential to understanding how brains process sensory stimuli. Mormyrid weakly electric fishes are experts at time coding, making them an exemplary organism for addressing this question. Mormyrids generate brief, stereotyped electric pulses. Pulse waveform carries information about sender identity, and it is encoded into submillisecond-to-millisecond differences in spike timing between receptors. Mormyrids vary the time between pulses to communicate behavioral state, and these intervals are encoded into the sequence of interspike intervals within receptors. Thus, the responses of peripheral electroreceptors establish a temporally multiplexed code for communication signals, one consisting of spike timing differences between receptors and a second consisting of interspike intervals within receptors. These signals are processed in a dedicated sensory pathway, and recent studies have shed light on the mechanisms by which central circuits can extract behaviorally relevant information from multiplexed temporal codes. Evolutionary change in the anatomy of this pathway is related to differences in electrosensory perception, which appears to have influenced the diversification of electric signals and species. However, it remains unknown how this evolutionary change relates to differences in sensory coding schemes, neuronal circuitry and central sensory processing. The mormyrid electric communication pathway is a powerful model for integrating mechanistic studies of temporal coding with evolutionary studies of correlated differences in brain and behavior to investigate neural mechanisms for processing temporal codes. PMID:23761462

  18. Multiplexed temporal coding of electric communication signals in mormyrid fishes.

    PubMed

    Baker, Christa A; Kohashi, Tsunehiko; Lyons-Warren, Ariel M; Ma, Xiaofeng; Carlson, Bruce A

    2013-07-01

    The coding of stimulus information into patterns of spike times occurs widely in sensory systems. Determining how temporally coded information is decoded by central neurons is essential to understanding how brains process sensory stimuli. Mormyrid weakly electric fishes are experts at time coding, making them an exemplary organism for addressing this question. Mormyrids generate brief, stereotyped electric pulses. Pulse waveform carries information about sender identity, and it is encoded into submillisecond-to-millisecond differences in spike timing between receptors. Mormyrids vary the time between pulses to communicate behavioral state, and these intervals are encoded into the sequence of interspike intervals within receptors. Thus, the responses of peripheral electroreceptors establish a temporally multiplexed code for communication signals, one consisting of spike timing differences between receptors and a second consisting of interspike intervals within receptors. These signals are processed in a dedicated sensory pathway, and recent studies have shed light on the mechanisms by which central circuits can extract behaviorally relevant information from multiplexed temporal codes. Evolutionary change in the anatomy of this pathway is related to differences in electrosensory perception, which appears to have influenced the diversification of electric signals and species. However, it remains unknown how this evolutionary change relates to differences in sensory coding schemes, neuronal circuitry and central sensory processing. The mormyrid electric communication pathway is a powerful model for integrating mechanistic studies of temporal coding with evolutionary studies of correlated differences in brain and behavior to investigate neural mechanisms for processing temporal codes.

  19. QR code for medical information uses.

    PubMed

    Fontelo, Paul; Liu, Fang; Ducut, Erick G

    2008-11-06

    We developed QR code online tools, simulated and tested QR code applications for medical information uses including scanning QR code labels, URLs and authentication. Our results show possible applications for QR code in medicine.

  20. Code Speed Measuring for VC++

    DTIC Science & Technology

    2015-10-01

    Technical Report ARWSE-TR-14025 (Tom Nealis). It’s often important to know how fast a snippet of code executes. This information allows the coder to make important decisions

  1. Explosive Formulation Code Naming SOP

    SciTech Connect

    Martz, H. E.

    2014-09-19

    The purpose of this SOP is to provide a procedure for giving individual HME formulations code names. A code name for an individual HME formulation consists of an explosive family code, given by the classified guide, followed by a dash, -, and a number. If the formulation requires preparation such as packing or aging, these add additional groups of symbols to the X-ray specimen name.

  2. Bar-Code-Scribing Tool

    NASA Technical Reports Server (NTRS)

    Badinger, Michael A.; Drouant, George J.

    1991-01-01

    Proposed hand-held tool applies indelible bar code to small parts. Possible to identify parts for management of inventory without tags or labels. Microprocessor supplies bar-code data to impact-printer-like device. Device drives replaceable scribe, which cuts bar code on surface of part. Used to mark serially controlled parts for military and aerospace equipment. Also adapts for discrete marking of bulk items used in food and pharmaceutical processing.

  3. Slimplectic Integrators: Variational Integrators for Nonconservative systems

    NASA Astrophysics Data System (ADS)

    Tsang, David

    2016-05-01

    Symplectic integrators are widely used for long-term integration of conservative astrophysical problems due to their ability to preserve the constants of motion; however, they cannot in general be applied in the presence of nonconservative interactions. Here we present the “slimplectic” integrator, a new type of numerical integrator that shares many of the benefits of traditional symplectic integrators yet is applicable to general nonconservative systems. We utilize a fixed-time-step variational integrator formalism applied to a newly developed principle of stationary nonconservative action (Galley, 2013, Galley et al 2014). As a result, the generalized momenta and energy (Noether current) evolutions are well-tracked. We discuss several example systems, including damped harmonic oscillators, Poynting-Robertson drag, and gravitational radiation reaction, by utilizing our new publicly available code to demonstrate the slimplectic integrator algorithm. Slimplectic integrators are well-suited for integrations of systems where nonconservative effects play an important role in the long-term dynamical evolution. As such they are particularly appropriate for cosmological or celestial N-body dynamics problems where nonconservative interactions, e.g., gas interactions or dissipative tides, can play an important role.

  4. Upgrades to NRLMOL code

    NASA Astrophysics Data System (ADS)

    Basurto, Luis

    This project consists of performing upgrades to the massively parallel NRLMOL electronic structure code in order to enhance its performance by increasing its flexibility by: a) Utilizing dynamically allocated arrays, b) Executing in a parallel environment sections of the program that were previously executed in a serial mode, c) Exploring simultaneous concurrent executions of the program through the use of an already existing MPI environment; thus enabling the simulation of larger systems than it is currently capable of performing. Also developed was a graphical user interface that will allow less experienced users to start performing electronic structure calculations by aiding them in performing the necessary configuration of input files as well as providing graphical tools for the displaying and analysis of results. Additionally, a computational toolkit that can avail of large supercomputers and make use of various levels of approximation for atomic interactions was developed to search for stable atomic clusters and predict novel stable endohedral fullerenes. As an application of the developed computational toolkit, a search was conducted for stable isomers of Sc3N C80 fullerene. In this search, about 1.2 million isomers of C80 were optimized in various charged states at the PM6 level. Subsequently, using the selected optimized isomers of C80 in various charged states, about 10,000 isomers of Sc3N C80 were constructed, which were optimized using the semi-empirical PM6 quantum chemical method. A few selected lowest isomers of Sc3N C80 were optimized at the DFT level. The calculation confirms the lowest 3 isomers previously reported in the literature, but 4 new isomers are found within the lowest 10 isomers. Using the upgraded NRLMOL code, a study was done of the electronic structure of a multichromophoric molecular complex containing two of each borondipyrromethane dye, Zn-tetraphenyl-porphyrin, bisphenyl anthracene and a fullerene. A systematic examination of the effect of

  5. The FLUKA Code: an Overview

    SciTech Connect

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M.V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P.R.; /Milan U. /INFN, Milan /Pavia U. /INFN, Pavia /CERN /Siegen U. /Houston U. /SLAC /Frascati /NASA, Houston /ENEA, Frascati

    2005-11-09

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with particular emphasis on the hadronic and nuclear sector.

  6. High Order Modulation Protograph Codes

    NASA Technical Reports Server (NTRS)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

    Digital communication coding methods for designing protograph-based bit-interleaved code modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
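
    The second lifting stage can be pictured concretely: each edge of the intermediate protograph is replaced by a cyclically shifted identity block. The sketch below uses a made-up base matrix and shift values purely for illustration; it is not the code construction from the patent.

      # Circulant lifting sketch: replace each nonzero base-matrix entry with a
      # Z x Z shifted identity, each zero with a Z x Z zero block.
      import numpy as np

      def circulant_lift(base, shifts, Z):
          m, n = base.shape
          H = np.zeros((m * Z, n * Z), dtype=int)
          for i in range(m):
              for j in range(n):
                  if base[i, j]:
                      H[i*Z:(i+1)*Z, j*Z:(j+1)*Z] = np.roll(np.eye(Z, dtype=int),
                                                            shifts[i][j], axis=1)
          return H

      base   = np.array([[1, 1, 1, 0],
                         [0, 1, 1, 1]])           # made-up protograph
      shifts = [[1, 0, 2, 0],
                [0, 3, 1, 2]]                     # made-up circulant shifts
      H = circulant_lift(base, shifts, Z=4)
      print(H.shape, H.sum(axis=0))               # (8, 16); column weights follow the protograph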

  7. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P. R.; Scannicchio, D.; Trovati, S.; Villari, R.; Wilson, T.

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with particular emphasis on the hadronic and nuclear sector.

  8. Golay and other box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    The (24,12;8) extended Golay Code can be generated as a 6 x 4 binary matrix from the (15,11;3) BCH-Hamming Code, represented as a 5 x 3 matrix, by adding a row and a column, both of odd or even parity. The odd-parity case provides the additional 12th dimension. Furthermore, any three columns and five rows of the 6 x 4 Golay form a BCH-Hamming (15,11;3) Code. Similarly a (80,58;8) code can be generated as a 10 x 8 binary matrix from the (63,57;3) BCH-Hamming Code represented as a 9 x 7 matrix by adding a row and a column both of odd or even parity. Furthermore, any seven columns along with the top nine rows is a BCH-Hamming (63,57;3) Code. A (80,40;16) 10 x 8 matrix binary code with weight structure identical to the extended (80,40;16) Quadratic Residue Code is generated from a (63,39;7) binary cyclic code represented as a 9 x 7 matrix, by adding a row and a column, both of odd or even parity.

  9. Golay and other box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    The (24,12;8) extended Golay Code can be generated as a 6x4 binary matrix from the (15,11;3) BCH-Hamming Code, represented as a 5 x 3 matrix, by adding a row and a column, both of odd or even parity. The odd-parity case provides the additional 12th dimension. Furthermore, any three columns and five rows of the 6 x 4 Golay form a BCH-Hamming (15,11;3) Code. Similarly a (80,58;8) code can be generated as a 10 x 8 binary matrix from the (63,57;3) BCH-Hamming Code represented as a 9 x 7 matrix by adding a row and a column both of odd and even parity. Furthermore, any seven columns along with the top nine rows is a BCH-Hamming (63,57;3) Code. A (80,40;16) 10 x 8 matrix binary code with weight structure identical to the extended (80,40;16) Quadratic Residue Code is generated from a (63,39;7) binary cyclic code represented as a 9 x 7 matrix, by adding a row and a column, both of odd or even parity.

  10. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge preserving image coding scheme which can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.
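
    As a bare-bones illustration of the DPCM idea underlying such a scheme (previous-sample prediction with integer residuals; this is not the Mars Observer algorithm itself), the sketch below shows the lossless round trip on a one-dimensional scan line:

      # Lossless DPCM on one scan line: the encoder sends integer prediction
      # residuals, the decoder adds them back and recovers the data exactly.
      import numpy as np

      def dpcm_encode(samples):
          residuals = np.empty_like(samples)
          prev = 0
          for i, s in enumerate(samples):
              residuals[i] = s - prev      # what an entropy coder would see
              prev = s
          return residuals

      def dpcm_decode(residuals):
          out = np.empty_like(residuals)
          prev = 0
          for i, r in enumerate(residuals):
              out[i] = prev + r
              prev = out[i]
          return out

      line = np.array([100, 102, 101, 105, 110, 110, 108], dtype=np.int32)
      res = dpcm_encode(line)
      assert np.array_equal(dpcm_decode(res), line)    # exact reconstruction
      print(res)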

  11. Adaptive Dynamic Event Tree in RAVEN code

    SciTech Connect

    Alfonsi, Andrea; Rabiti, Cristian; Mandelli, Diego; Cogliati, Joshua Joseph; Kinoshita, Robert Arthur

    2014-11-01

    RAVEN is a software tool that is focused on performing statistical analysis of stochastic dynamic systems. RAVEN has been designed in a highly modular and pluggable way in order to enable easy integration of different programming languages (i.e., C++, Python) and coupling with other applications (system codes). Among the several capabilities currently present in RAVEN, there are five different sampling strategies: Monte Carlo, Latin Hypercube, Grid, Adaptive and Dynamic Event Tree (DET) sampling methodologies. The scope of this paper is to present a new sampling approach, currently under definition and implementation: an evolution of the DET me

  12. The KIDTALK Behavior and Language Code: Manual and Coding Protocol.

    ERIC Educational Resources Information Center

    Delaney, Elizabeth M.; Ezell, Sara S.; Solomon, Ned A.; Hancock, Terry B.; Kaiser, Ann P.

    Developed as part of the Milieu Language Teaching Project at the John F. Kennedy Center at Vanderbilt University in Nashville, Tennessee, this KIDTALK Behavior-Language Coding Protocol and manual measures behavior occurring during adult-child interactions. The manual is divided into 5 distinct sections: (1) the adult behavior codes describe…

  13. Patched Conic Trajectory Code

    NASA Technical Reports Server (NTRS)

    Park, Brooke Anderson; Wright, Henry

    2012-01-01

    PatCon code was developed to help mission designers run trade studies on launch and arrival times for any given planet. Initially developed in Fortran, the required inputs included launch date, arrival date, and other orbital parameters of the launch planet and arrival planets at the given dates. These parameters include the position of the planets, the eccentricity, semi-major axes, argument of periapsis, ascending node, and inclination of the planets. With these inputs, a patched conic approximation is used to determine the trajectory. The patched conic approximation divides the planetary mission into three parts: (1) the departure phase, in which the two relevant bodies are Earth and the spacecraft, and where the trajectory is a departure hyperbola with Earth at the focus; (2) the cruise phase, in which the two bodies are the Sun and the spacecraft, and where the trajectory is a transfer ellipse with the Sun at the focus; and (3) the arrival phase, in which the two bodies are the target planet and the spacecraft, where the trajectory is an arrival hyperbola with the planet as the focus.
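
    The cruise-phase numbers such an approximation produces can be estimated on the back of an envelope; the sketch below assumes circular, coplanar planetary orbits (a simplification of the orbital elements PatCon actually takes as input) and uses the vis-viva equation for an Earth-to-Mars transfer.

      # Back-of-the-envelope patched-conic cruise phase, Earth -> Mars,
      # assuming circular coplanar orbits (illustrative values, not PatCon I/O).
      import math

      MU_SUN  = 1.327e20        # Sun's gravitational parameter, m^3/s^2
      R_EARTH = 1.496e11        # Earth orbit radius (1 AU), m
      R_MARS  = 2.279e11        # Mars orbit radius, m

      a_transfer = 0.5 * (R_EARTH + R_MARS)                    # transfer-ellipse semi-major axis
      v_earth    = math.sqrt(MU_SUN / R_EARTH)                 # Earth's circular speed
      v_depart   = math.sqrt(MU_SUN * (2.0 / R_EARTH - 1.0 / a_transfer))   # vis-viva
      v_infinity = v_depart - v_earth                          # hyperbolic excess for the departure phase
      tof_days   = math.pi * math.sqrt(a_transfer**3 / MU_SUN) / 86400.0    # half the ellipse period

      print(f"departure v_inf = {v_infinity / 1e3:.2f} km/s, cruise = {tof_days:.0f} days")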

  14. Civil Code, 11 December 1987.

    PubMed

    1988-01-01

    Article 162 of this Mexican Code provides, among other things, that "Every person has the right freely, responsibly, and in an informed fashion to determine the number and spacing of his or her children." When a marriage is involved, this right is to be observed by the spouses "in agreement with each other." The civil codes of the following states contain the same provisions: 1) Baja California (Art. 159 of the Civil Code of 28 April 1972 as revised in Decree No. 167 of 31 January 1974); 2) Morelos (Art. 255 of the Civil Code of 26 September 1949 as revised in Decree No. 135 of 29 December 1981); 3) Queretaro (Art. 162 of the Civil Code of 29 December 1950 as revised in the Act of 9 January 1981); 4) San Luis Potosi (Art. 147 of the Civil Code of 24 March 1946 as revised in 13 June 1978); Sinaloa (Art. 162 of the Civil Code of 18 June 1940 as revised in Decree No. 28 of 14 October 1975); 5) Tamaulipas (Art. 146 of the Civil Code of 21 November 1960 as revised in Decree No. 20 of 30 April 1975); 6) Veracruz-Llave (Art. 98 of the Civil Code of 1 September 1932 as revised in the Act of 30 December 1975); and 7) Zacatecas (Art. 253 of the Civil Code of 9 February 1965 as revised in Decree No. 104 of 13 August 1975). The Civil Codes of Puebla and Tlaxcala provide for this right only in the context of marriage with the spouses in agreement. See Art. 317 of the Civil Code of Puebla of 15 April 1985 and Article 52 of the Civil Code of Tlaxcala of 31 August 1976 as revised in Decree No. 23 of 2 April 1984. The Family Code of Hidalgo requires as a formality of marriage a certification that the spouses are aware of methods of controlling fertility, responsible parenthood, and family planning. In addition, Article 22 the Civil Code of the Federal District provides that the legal capacity of natural persons is acquired at birth and lost at death; however, from the moment of conception the individual comes under the protection of the law, which is valid with respect to the

  15. Methodology, status and plans for development and assessment of Cathare code

    SciTech Connect

    Bestion, D.; Barre, F.; Faydide, B.

    1997-07-01

    This paper presents the methodology, status and plans for the development, assessment and uncertainty evaluation of the Cathare code. Cathare is a thermalhydraulic code developed by CEA (DRN), IPSN, EDF and FRAMATOME for PWR safety analysis. First, the status of the code development and assessment is presented. The general strategy used for the development and the assessment of the code is presented. Analytical experiments with separate effect tests, and component tests are used for the development and the validation of closure laws. Successive Revisions of constitutive laws are implemented in successive Versions of the code and assessed. System tests or integral tests are used to validate the general consistency of the Revision. Each delivery of a code Version + Revision is fully assessed and documented. A methodology is being developed to determine the uncertainty on all constitutive laws of the code using calculations of many analytical tests and applying the Discrete Adjoint Sensitivity Method (DASM). Finally, the plans for future developments of the code are presented. They concern the optimization of code performance through parallel computing (the code will be used for real-time, full-scope plant simulators), the coupling with many other codes (neutronic codes, severe accident codes), and the application of the code to containment thermalhydraulics. Also, physical improvements are required in the field of low-pressure transients and in the 3-D modeling.

  16. Coded excitation ultrasonic needle tracking: An in vivo study

    PubMed Central

    Xia, Wenfeng; Ginsberg, Yuval; West, Simeon J.; Nikitichev, Daniil I.; Ourselin, Sebastien; David, Anna L.; Desjardins, Adrien E.

    2016-01-01

    Purpose: Accurate and efficient guidance of medical devices to procedural targets lies at the heart of interventional procedures. Ultrasound imaging is commonly used for device guidance, but determining the location of the device tip can be challenging. Various methods have been proposed to track medical devices during ultrasound-guided procedures, but widespread clinical adoption has remained elusive. With ultrasonic tracking, the location of a medical device is determined by ultrasonic communication between the ultrasound imaging probe and a transducer integrated into the medical device. The signal-to-noise ratio (SNR) of the transducer data is an important determinant of the depth in tissue at which tracking can be performed. In this paper, the authors present a new generation of ultrasonic tracking in which coded excitation is used to improve the SNR without spatial averaging. Methods: A fiber optic hydrophone was integrated into the cannula of a 20 gauge insertion needle. This transducer received transmissions from the ultrasound imaging probe, and the data were processed to obtain a tracking image of the needle tip. Excitation using Barker or Golay codes was performed to improve the SNR, and conventional bipolar excitation was performed for comparison. The performance of the coded excitation ultrasonic tracking system was evaluated in an in vivo ovine model with insertions to the brachial plexus and the uterine cavity. Results: Coded excitation significantly increased the SNRs of the tracking images, as compared with bipolar excitation. During an insertion to the brachial plexus, the SNR was increased by factors of 3.5 for Barker coding and 7.1 for Golay coding. During insertions into the uterine cavity, these factors ranged from 2.9 to 4.2 for Barker coding and 5.4 to 8.5 for Golay coding. The maximum SNR was 670, which was obtained with Golay coding during needle withdrawal from the brachial plexus. Range sidelobe artifacts were observed in tracking images
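
    The benefit of Golay coding can be seen in a synthetic one-dimensional example (below; this illustrates the pulse-compression principle, not the needle-tracking processing chain): the autocorrelations of a complementary pair sum to a single spike, so range sidelobes cancel exactly.

      # Golay complementary pair: autocorrelations sum to a delta function.
      import numpy as np

      def golay_pair(n_doublings):
          a, b = np.array([1]), np.array([1])
          for _ in range(n_doublings):             # standard recursive construction
              a, b = np.concatenate([a, b]), np.concatenate([a, -b])
          return a, b

      a, b = golay_pair(5)                          # length-32 pair
      auto_sum = np.correlate(a, a, "full") + np.correlate(b, b, "full")
      print(auto_sum[len(a) - 1])                   # main lobe: 2 * len(a) = 64
      print(np.abs(np.delete(auto_sum, len(a) - 1)).max())   # all sidelobes: 0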

  17. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    PubMed

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control.

  18. CAM - Geometric systems integration

    NASA Astrophysics Data System (ADS)

    Dunlap, G. C.

    The integration of geometric and nongeometric information for efficient use of CAM is examined. Requirements for engineering drawings requested by management are noted to involve large volumes of nongeometric data to define the materials and quantity variables which impinge on the required design, so that the actual design may be the last and smaller step in the CAM process. Geometric classification and coding are noted to offer an alpha/numeric identifier for integrating the engineering design, manufacturing, and quality assurance functions. An example is provided of a turbine gear part coding in terms of polycode and monocode displays, showing a possible covering of more than 10 trillion features. Software is stressed as the key to integration of company-wide data.

  19. Cracking the bioelectric code

    PubMed Central

    Tseng, AiSun; Levin, Michael

    2013-01-01

    Patterns of resting potential in non-excitable cells of living tissue are now known to be instructive signals for pattern formation during embryogenesis, regeneration and cancer suppression. The development of molecular-level techniques for tracking ion flows and functionally manipulating the activity of ion channels and pumps has begun to reveal the mechanisms by which voltage gradients regulate cell behaviors and the assembly of complex large-scale structures. A recent paper demonstrated that a specific voltage range is necessary for demarcation of eye fields in the frog embryo. Remarkably, artificially setting other somatic cells to the eye-specific voltage range resulted in formation of eyes in aberrant locations, including tissues that are not in the normal anterior ectoderm lineage: eyes could be formed in the gut, on the tail, or in the lateral plate mesoderm. These data challenge the existing models of eye fate restriction and tissue competence maps, and suggest the presence of a bioelectric code—a mapping of physiological properties to anatomical outcomes. This Addendum summarizes the current state of knowledge in developmental bioelectricity, proposes three possible interpretations of the bioelectric code that functionally maps physiological states to anatomical outcomes, and highlights the biggest open questions in this field. We also suggest a speculative hypothesis at the intersection of cognitive science and developmental biology: that bioelectrical signaling among non-excitable cells coupled by gap junctions simulates neural network-like dynamics, and underlies the information processing functions required by complex pattern formation in vivo. Understanding and learning to control the information stored in physiological networks will have transformative implications for developmental biology, regenerative medicine and synthetic bioengineering. PMID:23802040

  20. Breaking the Code of Silence.

    ERIC Educational Resources Information Center

    Halbig, Wolfgang W.

    2000-01-01

    Schools and communities must break the adolescent code of silence concerning threats of violence. Schools need character education stressing courage, caring, and responsibility; regular discussions of the school discipline code; formal security discussions with parents; 24-hour hotlines; and protocols for handling reports of potential violence.…

  1. ACCELERATION PHYSICS CODE WEB REPOSITORY.

    SciTech Connect

    WEI, J.

    2006-06-26

    In the framework of the CARE HHH European Network, we have developed a web-based dynamic accelerator-physics code repository. We describe the design, structure and contents of this repository, illustrate its usage, and discuss our future plans, with emphasis on code benchmarking.

  2. Accelerator Physics Code Web Repository

    SciTech Connect

    Zimmermann, F.; Basset, R.; Bellodi, G.; Benedetto, E.; Dorda, U.; Giovannozzi, M.; Papaphilippou, Y.; Pieloni, T.; Ruggiero, F.; Rumolo, G.; Schmidt, F.; Todesco, E.; Zotter, B.W.; Payet, J.; Bartolini, R.; Farvacque, L.; Sen, T.; Chin, Y.H.; Ohmi, K.; Oide, K.; Furman, M.; /LBL, Berkeley /Oak Ridge /Pohang Accelerator Lab. /SLAC /TRIUMF /Tech-X, Boulder /UC, San Diego /Darmstadt, GSI /Rutherford /Brookhaven

    2006-10-24

    In the framework of the CARE HHH European Network, we have developed a web-based dynamic accelerator-physics code repository. We describe the design, structure and contents of this repository, illustrate its usage, and discuss our future plans, with emphasis on code benchmarking.

  3. LFSC - Linac Feedback Simulation Code

    SciTech Connect

    Ivanov, Valentin; /Fermilab

    2008-05-01

    The computer program LFSC is a numerical tool for simulating beam-based feedback in high performance linacs. The code LFSC is based on the earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations of the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback on timescales corresponding to 5-100 Hz and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab on the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses the LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format), and plain text files with numerical parameters, wake fields, ground motion data etc. The Matlab environment provides a flexible system for graphical output.

  4. Indices for Testing Neural Codes

    PubMed Central

    Victor, Jonathan D.; Nirenberg, Sheila

    2009-01-01

    One of the most critical challenges in systems neuroscience is determining the neural code. A principled framework for addressing this can be found in information theory. With this approach, one can determine whether a proposed code can account for the stimulus-response relationship. Specifically, one can compare the transmitted information between the stimulus and the hypothesized neural code with the transmitted information between the stimulus and the behavioral response. If the former is smaller than the latter (i.e., if the code cannot account for the behavior), the code can be ruled out. The information-theoretic index most widely used in this context is Shannon’s mutual information. The Shannon test, however, is not ideal for this purpose: while the codes it will rule out are truly nonviable, there will be some nonviable codes that it will fail to rule out. Here we describe a wide range of alternative indices that can be used for ruling codes out. The range includes a continuum from Shannon information to measures of the performance of a Bayesian decoder. We analyze the relationship of these indices to each other and their complementary strengths and weaknesses for addressing this problem. PMID:18533812
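
    A toy version of the Shannon form of this test is sketched below (the joint count tables are invented for illustration; the alternative indices the paper develops are not shown): estimate the mutual information carried by the hypothesized code and by the behavior, and rule the code out if it carries less.

      # Shannon rule-out test on invented count tables: a candidate code cannot
      # account for behavior if it transmits less stimulus information.
      import numpy as np

      def mutual_information(counts):
          p = counts / counts.sum()
          px = p.sum(axis=1, keepdims=True)
          py = p.sum(axis=0, keepdims=True)
          nz = p > 0
          return float(np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz])))

      # Rows: stimuli; columns: code symbols or behavioral choices.
      stim_vs_code     = np.array([[30.0, 10.0], [12.0, 28.0]])
      stim_vs_behavior = np.array([[38.0,  2.0], [ 4.0, 36.0]])

      i_code, i_behav = mutual_information(stim_vs_code), mutual_information(stim_vs_behavior)
      print(i_code, i_behav, "code ruled out" if i_code < i_behav else "not ruled out")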

  5. Population coding in somatosensory cortex.

    PubMed

    Petersen, Rasmus S; Panzeri, Stefano; Diamond, Mathew E

    2002-08-01

    Computational analyses have begun to elucidate which components of somatosensory cortical population activity may encode basic stimulus features. Recent results from rat barrel cortex suggest that the essence of this code is not synergistic spike patterns, but rather the precise timing of single neurons' first post-stimulus spikes. This may form the basis for a fast, robust population code.

  6. QPhiX Code Generator

    SciTech Connect

    Joo, Balint

    2014-09-16

    A simple code-generator to generate the low level code kernels used by the QPhiX Library for Lattice QCD. Generates Kernels for Wilson-Dslash, and Wilson-Clover kernels. Can be reused to write other optimized kernels for Intel Xeon Phi(tm), Intel Xeon(tm) and potentially other architectures.

  7. Using NAEYC's Code of Ethics.

    ERIC Educational Resources Information Center

    Young Children, 1995

    1995-01-01

    Considers how to deal with an ethical dilemma concerning a caregiver's dislike for a child. Recognizes that no statement in NAEYC's Code of Ethical Conduct requires that a professional must like each child, and presents some ideals and principles from the code that may guide professionals through similar situations. (BAC)

  8. Library for RF Interactions in Orbit Following Codes

    SciTech Connect

    Johnson, T.; Hellsten, T.; Hoeoek, L. J.; Salmi, A.; Steinbrecher, G.; Eriksson, L.-G.; Schneider, M.

    2011-12-23

    A new code-library, RFOF (RF interactions in Orbit Following codes), has been developed to handle quasi-linear wave-particle interactions in orbit following Monte Carlo codes. This library will enable a large number of orbit following codes to model fast ion acceleration during ICRF and Lower Hybrid heating. RFOF consists of two main modules: one evaluates the resonance condition, the other the resulting RF acceleration. The resonance condition is tested at each step along the orbit and the location of the next upcoming resonance is predicted. When a particle reaches the resonance, a quasi-linear acceleration is calculated with a novel Monte Carlo technique that avoids the time-consuming evaluation of phase-space derivatives of the interaction strength. In RFOF the wave-particle interactions are assumed to be localized to a single point on the orbit. This is often valid for the ion cyclotron and lower hybrid frequency ranges, but prevents the treatment of bounce and precessional resonances. RFOF has been developed within the European Task Force for Integrated Tokamak Modelling, enabling interaction between experts in different fields. As a result, the code is designed with a simple and generic interface, with a minimum of assumptions on, e.g., the geometry. Successful integration with the two orbit following codes, ASCOT and SPOT, has already been demonstrated.
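
    The resonance-condition test along an orbit can be illustrated with a toy Python sketch; the wave frequency, harmonic number, field and velocity profiles below are invented for illustration and are not taken from RFOF.

```python
import numpy as np

# Toy resonance scan along a guiding-centre orbit; all profiles and parameters
# below are assumptions made for illustration.
omega_rf = 2 * np.pi * 40e6          # wave angular frequency (rad/s), assumed
n_harm = 2                           # cyclotron harmonic number, assumed
k_par = 5.0                          # parallel wavenumber (1/m), assumed
q_over_m = 4.79e7                    # deuteron charge-to-mass ratio (C/kg)

t = np.linspace(0.0, 1e-4, 2001)                            # time along the orbit (s)
B_seen = 2.5 + 0.3 * np.sin(2 * np.pi * 1e4 * t)            # toy B field seen by the particle (T)
v_par = 1e6 * np.cos(2 * np.pi * 1e4 * t)                   # toy parallel velocity (m/s)

# Resonance function: zero where omega = n*Omega_c + k_par*v_par
res = omega_rf - n_harm * q_over_m * B_seen - k_par * v_par

# A sign change between consecutive orbit steps brackets an upcoming resonance.
crossings = np.where(np.sign(res[:-1]) != np.sign(res[1:]))[0]
print(f"resonance crossings found along this orbit segment: {len(crossings)}")
for i in crossings[:3]:
    print(f"  resonance near t = {t[i]:.2e} s")
```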

  9. A Comprehensive Validation Approach Using The RAVEN Code

    SciTech Connect

    Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J; Rinaldi, Ivan; Giannetti, Fabio; Caruso, Gianfranco

    2015-06-01

    The RAVEN computer code, developed at the Idaho National Laboratory, is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. RAVEN is a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating with any system code. A natural extension of the RAVEN capabilities is the implementation of an integrated validation methodology, involving several different metrics, that represents an evolution of the methods currently used in the field. State-of-the-art validation approaches use neither exploration of the input space through sampling strategies, nor a comprehensive variety of metrics needed to interpret the code responses with respect to experimental data. The RAVEN code allows both of these gaps to be addressed. In the following sections, the employed methodology, and its application to the newly developed thermal-hydraulic code RELAP-7, is reported. The validation approach has been applied to an integral effect experiment representing natural circulation, based on the activities performed by EG&G Idaho. Four different experiment configurations have been considered and nodalized.
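
    A minimal sketch of this kind of sampling-based validation loop is shown below; the "system code" is a stand-in analytic natural-circulation model and the experimental numbers are invented, so the sketch illustrates the workflow rather than the RAVEN/RELAP-7 implementation.

```python
import numpy as np

# Illustrative sampling-based validation loop; the "system code" below is a
# stand-in analytic model, not RELAP-7, and the data are invented.
rng = np.random.default_rng(42)

def system_code(power_kw, loss_coeff):
    """Toy natural-circulation model: loop mass flow rate vs. power and loss coefficient."""
    return 0.08 * np.sqrt(power_kw / loss_coeff)

experiment = {"power_kw": 50.0, "flow_kg_s": 0.27, "flow_sigma": 0.02}

# Explore the uncertain input space by Monte Carlo sampling of the loss coefficient.
loss_samples = rng.uniform(3.0, 6.0, size=1000)
predictions = system_code(experiment["power_kw"], loss_samples)

# A few complementary validation metrics against the measured flow rate.
bias = predictions.mean() - experiment["flow_kg_s"]
rmse = np.sqrt(np.mean((predictions - experiment["flow_kg_s"]) ** 2))
coverage = np.mean(np.abs(predictions - experiment["flow_kg_s"]) < 2 * experiment["flow_sigma"])

print(f"mean bias = {bias:+.3f} kg/s, RMSE = {rmse:.3f} kg/s, "
      f"fraction of runs within 2 sigma of the data = {coverage:.2f}")
```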

  10. Status of BOUT fluid turbulence code: improvements and verification

    NASA Astrophysics Data System (ADS)

    Umansky, M. V.; Lodestro, L. L.; Xu, X. Q.

    2006-10-01

    BOUT is an electromagnetic fluid turbulence code for tokamak edge plasma [1]. BOUT performs time integration of reduced Braginskii plasma fluid equations, using spatial discretization in realistic geometry and employing the standard ODE integration package PVODE. BOUT has been applied to several tokamak experiments, and in some cases calculated spectra of turbulent fluctuations compared favorably to experimental data. On the other hand, the desire to understand the code results better and to gain more confidence in it motivated investing effort in rigorous verification of BOUT. In parallel with the testing, the code underwent substantial modification, mainly to improve its readability and the tractability of physical terms, with some algorithmic improvements as well. In the verification process, a series of linear and nonlinear test problems was applied to BOUT, targeting different subgroups of physical terms. The tests include reproducing basic electrostatic and electromagnetic plasma modes in simplified geometry, axisymmetric benchmarks against the 2D edge code UEDGE in real divertor geometry, and neutral fluid benchmarks against the hydrodynamic code LCPFCT. After completion of the testing, the new version of the code is being applied to actual tokamak edge turbulence problems, and the results will be presented. [1] X. Q. Xu et al., Contr. Plas. Phys., 36, 158 (1998). *Work performed for USDOE by Univ. Calif. LLNL under contract W-7405-ENG-48.

  11. System for loading executable code into volatile memory in a downhole tool

    DOEpatents

    Hall, David R.; Bartholomew, David B.; Johnson, Monte L.

    2007-09-25

    A system for loading an executable code into volatile memory in a downhole tool string component comprises a surface control unit comprising executable code. An integrated downhole network comprises data transmission elements in communication with the surface control unit and the volatile memory. The executable code, stored in the surface control unit, is not permanently stored in the downhole tool string component. In a preferred embodiment of the present invention, the downhole tool string component comprises boot memory. In another embodiment, the executable code is an operating system executable code. Preferably, the volatile memory comprises random access memory (RAM). A method for loading executable code to volatile memory in a downhole tool string component comprises sending the code from the surface control unit to a processor in the downhole tool string component over the network. A central processing unit writes the executable code in the volatile memory.

  12. Statistical Coding and Decoding of Heartbeat Intervals

    PubMed Central

    Lucena, Fausto; Barros, Allan Kardec; Príncipe, José C.; Ohnishi, Noboru

    2011-01-01

    The heart integrates neuroregulatory messages into specific bands of frequency, such that the overall amplitude spectrum of the cardiac output reflects the variations of the autonomic nervous system. This modulatory mechanism seems to be well adjusted to the unpredictability of the cardiac demand, maintaining a proper cardiac regulation. A longstanding theory holds that biological organisms facing an ever-changing environment are likely to evolve adaptive mechanisms to extract essential features in order to adjust their behavior. The key question, however, has been to understand how the neural circuitry self-organizes these feature detectors to select behaviorally relevant information. Previous studies in computational perception suggest that a neural population enhances information that is important for survival by minimizing the statistical redundancy of the stimuli. Herein we investigate whether the cardiac system makes use of a redundancy reduction strategy to regulate the cardiac rhythm. Based on a network of neural filters optimized to code heartbeat intervals, we learn a population code that maximizes the information across the neural ensemble. The emerging population code displays filter tuning properties whose characteristics explain diverse aspects of the autonomic cardiac regulation, such as the compromise between fast and slow cardiac responses. We show that the filters yield responses that are quantitatively similar to observed heart rate responses during direct sympathetic or parasympathetic nerve stimulation. Our findings suggest that the heart decodes autonomic stimuli according to information theory principles analogous to how perceptual cues are encoded by sensory systems. PMID:21694763
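
    One way to sketch the filter-learning idea, under the assumption that an ICA-style redundancy-reduction criterion is an acceptable stand-in for the paper's optimization, is shown below with synthetic RR intervals; the data and the use of scikit-learn's FastICA are illustrative only.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic RR-interval series: a slow and a faster modulation plus noise,
# purely for illustration (not physiological data).
rng = np.random.default_rng(7)
n_beats = 4000
t = np.arange(n_beats)
rr = (0.8
      + 0.05 * np.sin(2 * np.pi * t / 60.0)      # slow modulation
      + 0.03 * np.sin(2 * np.pi * t / 12.0)      # faster modulation
      + 0.01 * rng.standard_normal(n_beats))

# Slice the series into overlapping windows and learn filters that make the
# window responses statistically independent (a redundancy-reduction criterion).
win = 32
windows = np.array([rr[i:i + win] for i in range(0, n_beats - win, 4)])
windows -= windows.mean(axis=0)

ica = FastICA(n_components=6, random_state=0, max_iter=1000)
ica.fit(windows)
filters = ica.components_            # each row is a learned "neural filter"
print("filter bank shape:", filters.shape)
```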

  13. PARAVT: Parallel Voronoi tessellation code

    NASA Astrophysics Data System (ADS)

    González, R. E.

    2016-10-01

    In this study, we present a new open-source code for massively parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, where VT densities and neighbors are widely used. There are several serial Voronoi tessellation codes; however, no open-source, parallel implementations are available to handle the large number of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented under MPI, and the VT is computed using the Qhull library. Domain decomposition takes into account consistent boundary computation between tasks, and includes periodic conditions. In addition, the code computes the neighbor list, Voronoi density, Voronoi cell volume, and density gradient for each particle, as well as densities on a regular grid. Code implementation and user guide are publicly available at https://github.com/regonzar/paravt.
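
    A serial Python sketch of the quantities mentioned above (neighbor lists, cell volumes, Voronoi densities) is given below using SciPy's Qhull-based Voronoi routine; PARAVT itself distributes this work over MPI tasks, which is not reproduced here.

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

# Serial illustration of Voronoi neighbours, cell volumes and densities.
rng = np.random.default_rng(1)
points = rng.uniform(0.0, 100.0, size=(500, 3))   # toy "particle" positions

vor = Voronoi(points)   # SciPy also uses the Qhull library internally

# Neighbour list: particles sharing a Voronoi ridge.
neighbors = {i: set() for i in range(len(points))}
for p, q in vor.ridge_points:
    neighbors[p].add(q)
    neighbors[q].add(p)

# Cell volume (and hence density) for cells that are closed (no vertex at infinity).
def cell_volume(i):
    region = vor.regions[vor.point_region[i]]
    if -1 in region or len(region) == 0:
        return np.nan                     # open cell on the domain boundary
    return ConvexHull(vor.vertices[region]).volume

volumes = np.array([cell_volume(i) for i in range(len(points))])
density = 1.0 / volumes
print(f"mean number of Voronoi neighbours: {np.mean([len(v) for v in neighbors.values()]):.1f}")
print(f"median Voronoi density (closed cells): {np.nanmedian(density):.4f}")
```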

  14. Medical coding in clinical trials.

    PubMed

    Babre, Deven

    2010-01-01

    Data generated in all clinical trials are recorded on the data collection instrument, the Case Report Form / electronic Case Report Form, by investigators located at various sites in various countries. In multicentric clinical trials, since different investigators or medically qualified experts are at different sites/centers, recording the medical term(s) uniformly is a big challenge. Medical coders from the clinical data management team process these terms and perform medical coding. Medical coding is performed to categorize the reported medical terms appropriately so that they can be analyzed/reviewed. This article describes the process used for medical coding in clinical data management and briefly covers the two most commonly used medical dictionaries, MedDRA and WHO-DDE. It is expected to help medical coders understand the process of medical coding in clinical data management. A few common issues that medical coders face while performing medical coding are also highlighted.

  15. New double-byte error-correcting codes for memory systems

    NASA Technical Reports Server (NTRS)

    Feng, Gui-Liang; Wu, Xinen; Rao, T. R. N.

    1996-01-01

    Error-correcting or error-detecting codes have been used in the computer industry to increase reliability, reduce service costs, and maintain data integrity. The single-byte error-correcting and double-byte error-detecting (SbEC-DbED) codes have been successfully used in computer memory subsystems. There are many methods to construct double-byte error-correcting (DBEC) codes. In the present paper we construct a class of double-byte error-correcting codes, which are more efficient than those known to be optimum, and a decoding procedure for our codes is also considered.

  16. Detecting non-coding selective pressure in coding regions

    PubMed Central

    Chen, Hui; Blanchette, Mathieu

    2007-01-01

    Background Comparative genomics approaches, where orthologous DNA regions are compared and inter-species conserved regions are identified, have proven extremely powerful for identifying non-coding regulatory regions located in intergenic or intronic regions. However, non-coding functional elements can also be located within coding regions, as is common for exonic splicing enhancers, some transcription factor binding sites, and RNA secondary structure elements affecting mRNA stability, localization, or translation. Since these functional elements are located in regions that are themselves highly conserved because they code for a protein, they generally escape detection by comparative genomics approaches. Results We introduce a comparative genomics approach for detecting non-coding functional elements located within coding regions. Codon evolution is modeled as a mixture of codon substitution models, where each component of the mixture describes the evolution of codons under a specific type of coding selective pressure. We show how to compute the posterior distribution of the entropy and parsimony scores under this null model of codon evolution. The method is applied to a set of growth hormone 1 orthologous mRNA sequences and a known exonic splicing element is detected. The analysis of a set of CORTBP2 orthologous genes reveals a region of several hundred base pairs under strong non-coding selective pressure whose function remains unknown. Conclusion Non-coding functional elements, in particular those involved in post-transcriptional regulation, are likely to be much more prevalent than is currently known. With the numerous genome sequencing projects underway, comparative genomics approaches like the one proposed here are likely to become increasingly powerful at detecting such elements. PMID:17288582

  17. Developing PYTHON Codes for the Undergraduate ALFALFA Team

    NASA Astrophysics Data System (ADS)

    Troischt, Parker; Ryan, Nicholas; Alfalfa Team

    2016-03-01

    We describe here progress toward developing a number of new PYTHON routines to be used by members of the Undergraduate ALFALFA Team. The codes are designed to analyze HI spectra and assist in identifying and categorizing some of the intriguing sources found in the initial blind ALFALFA survey. Numerical integration is performed on extragalactic sources using 21cm line spectra produced with the L-Band Wide receiver at the National Astronomy and Ionosphere Center. Prior to the integration, polynomial fits are employed to obtain an appropriate baseline for each source. The codes developed here are part of a larger team effort to use new PYTHON routines in order to replace, upgrade, or supplement a wealth of existing IDL codes within the collaboration. This work has been supported by NSF Grant AST-1211005.
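
    The two processing steps described above, polynomial baseline fitting followed by numerical integration of the line profile, can be sketched as follows; the synthetic spectrum, window limits and polynomial order are assumptions for illustration, not the team's actual routines.

```python
import numpy as np

# Synthetic 21 cm spectrum: slowly varying baseline plus a Gaussian line plus noise.
velocity = np.linspace(4000.0, 6000.0, 1024)                # km/s
baseline_true = 2.0 + 0.001 * (velocity - 5000.0)           # continuum level (mJy)
line = 15.0 * np.exp(-0.5 * ((velocity - 5100.0) / 40.0) ** 2)
flux = baseline_true + line + np.random.default_rng(3).normal(0.0, 0.5, velocity.size)

# Fit a low-order polynomial baseline on channels away from the source, then subtract it.
off_source = (velocity < 4950.0) | (velocity > 5250.0)
coeffs = np.polyfit(velocity[off_source], flux[off_source], deg=2)
flux_sub = flux - np.polyval(coeffs, velocity)

# Numerically integrate the line over the source window (simple channel sum).
window = (velocity > 4950.0) & (velocity < 5250.0)
dv = velocity[1] - velocity[0]
S_int = np.sum(flux_sub[window]) * dv                       # mJy km/s
print(f"integrated line flux ~ {S_int / 1000.0:.2f} Jy km/s")
```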

  18. Refactoring and Componentizing Legacy Codes: GMI Case Study

    NASA Technical Reports Server (NTRS)

    Kouatchou, Jules; Clune, Thomas; Oloso, Hamid; Sawyer, William; Zhou, Shujia; Damon, Megan; Cruz, Carlos

    2009-01-01

    As Earth science applications grow increasingly complex over time, maintenance of such software becomes a significant challenge. In general, the software subsystems were designed, implemented and integrated (over an extended period of time) by different groups that did not follow the same software design standard. The objective was understandably to obtain scientific results quickly, but this goal was often achieved at the expense of good software practices. With that pattern of development, the cost of incremental changes to the applications grows ominously large. This increase in effort tends to squeeze development resources and largely prevents any systematic attempt to reintroduce better software practices and thereby enhance the maintainability of the software. In this paper, we present the process we used to incrementally refactor and componentize the Global Modeling Initiative code to make it more understandable, maintainable, flexible and extensible. The resulting product not only preserves the scientific integrity of the original code but also extends the capabilities of the code for several applications.

  19. On-line application of the PANTHER advanced nodal code

    SciTech Connect

    Hutt, P.K.; Knight, M.P.

    1992-01-01

    Over the last few years, Nuclear Electric has developed an integrated core performance code package for both light water reactors (LWRs) and advanced gas-cooled reactors (AGRs) that can perform a comprehensive range of calculations for fuel cycle design, safety analysis, and on-line operational support for such plants. The package consists of the following codes: WIMS for lattice physics, PANTHER whole reactor nodal flux and AGR thermal hydraulics, VIPRE for LWR thermal hydraulics, and ENIGMA for fuel performance. These codes are integrated within a UNIX-based interactive system called the Reactor Physics Workbench (RPW), which provides an interactive graphic user interface and quality assurance records/data management. The RPW can also control calculational sequences and data flows. The package has been designed to run both off-line and on-line accessing plant data through the RPW.

  20. BASS Code Development

    NASA Technical Reports Server (NTRS)

    Sawyer, Scott

    2004-01-01

    The BASS computational aeroacoustic code solves the fully nonlinear Euler equations in the time domain in two dimensions. The acoustic response of the stator is determined simultaneously for the first three harmonics of the convected vortical gust of the rotor. The spatial mode generation, propagation and decay characteristics are predicted by assuming the acoustic field away from the stator can be represented as a uniform flow with small harmonic perturbations superimposed. The computed field is then decomposed using a joint temporal-spatial transform to determine the wave amplitudes as a function of rotor harmonic and spatial mode order. This report details the following technical aspects of the computations and analysis: 1) the BASS computational technique; 2) the application of periodic time-shifted boundary conditions; 3) the linear theory aspects unique to rotor-stator interactions; and 4) the joint spatial-temporal transform. The computational results presented herein are twofold. In each case, the acoustic response of the stator is determined simultaneously for the first three harmonics of the convected vortical gust of the rotor. The fan under consideration here, like modern fans, is cut-off at the blade passing frequency (BPF), and propagating acoustic waves are only expected at 2BPF and 3BPF. In the first case, the computations showed excellent agreement with linear theory predictions. The frequency and spatial mode order of the acoustic field were computed and found consistent with linear theory. Further, the propagation of the generated modes was also correctly predicted. The upstream-going waves propagated from the domain without reflection from the inflow boundary. However, reflections from the outflow boundary were noticed. The amplitude of the reflected wave was approximately 5% of the incident wave. The second set of computations was used to determine the influence of steady loading on the generated noise. Toward this end, the acoustic response was determined with three steady loading
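
    The joint temporal-spatial transform can be illustrated with a toy two-dimensional FFT of a synthetic spinning-mode pressure field; the blade/vane counts, frequencies and amplitudes below are made up and are not taken from the BASS computations.

```python
import numpy as np

# Toy duct pressure field p(theta, t) containing a single spinning mode at 2*BPF;
# a joint space-time FFT recovers its circumferential order and temporal harmonic.
B, V = 16, 24                        # illustrative rotor blade / stator vane counts
bpf = 2000.0                         # blade passing frequency (Hz), assumed
m = 2 * B - V                        # Tyler-Sofrin interaction mode for the 2nd harmonic

n_theta, n_t = 64, 512
theta = np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False)
t = np.arange(n_t) / (16 * bpf)      # sample at 16 x BPF

TH, T = np.meshgrid(theta, t, indexing="ij")
p = np.cos(m * TH - 2 * np.pi * 2 * bpf * T)                # spinning mode at 2*BPF

# 2-D FFT: axis 0 -> circumferential mode order, axis 1 -> temporal frequency.
P = np.fft.fft2(p) / p.size
mode_idx, freq_idx = np.unravel_index(np.argmax(np.abs(P)), P.shape)
modes = np.fft.fftfreq(n_theta, d=1.0 / n_theta)
freqs = np.fft.fftfreq(n_t, d=t[1] - t[0])
print(f"dominant mode order |m| = {abs(modes[mode_idx]):.0f}, "
      f"frequency = {abs(freqs[freq_idx]) / bpf:.1f} x BPF")
```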

  1. Status and Benchmarking of the Free Boundary Equilibrium Code FREEBIE

    NASA Astrophysics Data System (ADS)

    Urban, Jakub; Artaud, Jean-Francois; Basiuk, Vincent; Besseghir, Karim; Huynh, Philippe; Kim, Sunhee; Lister, Jonathan Bryan; Nardon, Eric

    2013-10-01

    FREEBIE is a recent free boundary equilibrium (FBE) code, which solves the temporal evolution of tokamak equilibrium, described by the Grad-Shafranov equation and circuit equations for active and passive poloidal field components. FREEBIE can be run stand-alone, within the transport code CRONOS or on the ITM (European Integrated Tokamak Modelling) platform. FREEBIE with prescribed plasma profiles has already been successfully benchmarked against DINA simulations and TCV experiments. Here we report on the current status of the code coupling with transport solvers and benchmarking of fully consistent transport-FBE simulations. A benchmarking procedure is developed and applied to several ITER cases using FREEBIE, DINA and CEDRES++. The benchmarks indicate that because of the different methods and the complexity of the problem, results obtained from the different codes are comparable only to a certain extent. Supported by GACR 13-38121P, EURATOM, AS CR AV0Z 20430508, MSMT 7G10072 and MSMT LM2011021.

  2. Using bar codes for material control and accounting

    SciTech Connect

    Weil, B.

    1997-04-01

    Modern computers have become an important part of almost all business operations, including nuclear material control and accountability (NMC&A). However the effectiveness of any computer hardware/software system is a function of the input data provided to it. To maximize the benefit from a computer, timely (ideally, real-time) and accurate data are required. This paper presents the benefits of using automatic data collection, and more specifically bar code technology. Bar coding is a simple and cost effective keyless data entry solution that has been widely adopted in world commerce and government agencies. Since its introduction to the first MINATOM facility in 1995, bar code activities at other facilities have increased. Tasks to integrate bar code technology with computerized MC&A, equipment, and training workshops have been an important part of the USDOE/MINATOM collaboration.

  3. Coding for effective denial management.

    PubMed

    Miller, Jackie; Lineberry, Joe

    2004-01-01

    Nearly everyone will agree that accurate and consistent coding of diagnoses and procedures is the cornerstone for operating a compliant practice. The CPT or HCPCS procedure code tells the payor what service was performed and also (in most cases) determines the amount of payment. The ICD-9-CM diagnosis code, on the other hand, tells the payor why the service was performed. If the diagnosis code does not meet the payor's criteria for medical necessity, all payment for the service will be denied. Implementation of an effective denial management program can help "stop the bleeding." Denial management is a comprehensive process that works in two ways. First, it evaluates the cause of denials and takes steps to prevent them. Second, denial management creates specific procedures for refiling or appealing claims that are initially denied. Accurate, consistent and compliant coding is key to both of these functions. The process of proactively managing claim denials also reveals a practice's administrative strengths and weaknesses, enabling radiology business managers to streamline processes, eliminate duplicated efforts and shift a larger proportion of the staff's focus from paperwork to servicing patients--all of which are sure to enhance operations and improve practice management and office morale. Accurate coding requires a program of ongoing training and education in both CPT and ICD-9-CM coding. Radiology business managers must make education a top priority for their coding staff. Front office staff, technologists and radiologists should also be familiar with the types of information needed for accurate coding. A good staff training program will also cover the proper use of Advance Beneficiary Notices (ABNs). Registration and coding staff should understand how to determine whether the patient's clinical history meets criteria for Medicare coverage, and how to administer an ABN if the exam is likely to be denied. Staff should also understand the restrictions on use of

  4. Codes of Ethics in Australian Education: Towards a National Perspective

    ERIC Educational Resources Information Center

    Forster, Daniella J.

    2012-01-01

    Teachers have a dual moral responsibility as both values educators and moral agents representing the integrity of the profession. Codes of ethics and conduct in teaching articulate shared professional values and aim to provide some guidance for action around recognised issues special to the profession but are also instruments of regulation which…

  5. Moral Crisis in Higher Institutions and the Dress Code Phenomenon

    ERIC Educational Resources Information Center

    Fayokun, K. O.; Adedeji, S. O.; Oyebade, S. A.

    2009-01-01

    This article reviewed the case of indecent dressing among the youth of today especially on the universities campuses, which has forced the authorities of those institutions to enact dress codes to stem the tide and restore high moral standards, integrity and decency. Whether this bid was successful or not was another thing which was a function of…

  6. Non-Pauli observables for CWS codes

    NASA Astrophysics Data System (ADS)

    Santiago, Douglas F. G.; Portugal, Renato; Melo, Nolmar

    2013-05-01

    It is known that nonadditive quantum codes can have higher code dimensions than stabilizer codes for the same length and minimum distance. The class of codeword stabilized codes (CWS) provides tools to obtain new nonadditive quantum codes by reducing the problem to finding nonlinear classical codes. In this work, we establish some results on the kind of non-Pauli operators that can be used as observables in the decoding scheme of CWS codes and propose a procedure to obtain those observables.

  7. A secure and efficient entropy coding based on arithmetic coding

    NASA Astrophysics Data System (ADS)

    Li, Hengjian; Zhang, Jiashu

    2009-12-01

    A novel secure arithmetic coding scheme based on a nonlinear dynamic filter (NDF) with changeable coefficients is proposed in this paper. The NDF is employed to build the pseudorandom number generator (NDF-PRNG), and its coefficients are derived from the plaintext for higher security. During the encryption process, the mapping interval in each iteration of arithmetic coding (AC) is decided by both the plaintext and the initial values of the NDF, and data compression with entropy optimality is achieved simultaneously. This modification of the arithmetic coding methodology, which also provides security, can easily be incorporated into most international image and video standards as the final entropy coding stage without changing the existing framework. Theoretical analysis and numerical simulations, on both static and adaptive models, show that the proposed encryption algorithm provides high security without loss of compression efficiency or additional computational burden with respect to standard AC.

  8. CodedStream: live media streaming with overlay coded multicast

    NASA Astrophysics Data System (ADS)

    Guo, Jiang; Zhu, Ying; Li, Baochun

    2003-12-01

    Multicasting is a natural paradigm for streaming live multimedia to multiple end receivers. Since IP multicast is not widely deployed, many application-layer multicast protocols have been proposed. However, all of these schemes focus on the construction of multicast trees, where a relatively small number of links carry the multicast streaming load, while the capacity of most of the other links in the overlay network remain unused. In this paper, we propose CodedStream, a high-bandwidth live media distribution system based on end-system overlay multicast. In CodedStream, we construct a k-redundant multicast graph (a directed acyclic graph) as the multicast topology, on which network coding is applied to work around bottlenecks. Simulation results have shown that the combination of k-redundant multicast graph and network coding may indeed bring significant benefits with respect to improving the quality of live media at the end receivers.

  9. New optimal asymmetric quantum codes constructed from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Lü, Liangdong

    2017-02-01

    In this paper, we propose the construction of asymmetric quantum codes from two families of constacyclic codes over the finite field 𝔽_{q^2} of code length n, where for the first family, q is an odd prime power of the form 4t + 1 (t ≥ 1 is an integer) or 4t - 1 (t ≥ 2 is an integer) and n_1 = (q^2 + 1)/2; for the second family, q is an odd prime power of the form 10t + 3 or 10t + 7 (t ≥ 0 is an integer) and n_2 = (q^2 + 1)/5. As a result, families of new asymmetric quantum codes [[n,k,dz/dx

  10. Bilayer Protograph Codes for Half-Duplex Relay Channels

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; VanNguyen, Thuy; Nosratinia, Aria

    2013-01-01

    Direct to Earth return links are limited by the size and power of lander devices. A standard alternative is provided by a two-hop return link: a proximity link (from lander to orbiter relay) and a deep-space link (from orbiter relay to Earth). Although direct to Earth return links are limited by the size and power of lander devices, using an additional link and a proposed coding for relay channels, one can obtain a more reliable signal. Although significant progress has been made in the relay coding problem, existing codes must be painstakingly optimized to match to a single set of channel conditions, many of them do not offer easy encoding, and most of them do not have structured design. A high-performing LDPC (low-density parity-check) code for the relay channel addresses simultaneously two important issues: a code structure that allows low encoding complexity, and a flexible rate-compatible code that allows matching to various channel conditions. Most of the previous high-performance LDPC codes for the relay channel are tightly optimized for a given channel quality, and are not easily adapted without extensive re-optimization for various channel conditions. This code for the relay channel combines structured design and easy encoding with rate compatibility to allow adaptation to the three links involved in the relay channel, and furthermore offers very good performance. The proposed code is constructed by synthesizing a bilayer structure with a protograph. In addition to the contribution to relay encoding, an improved family of protograph codes was produced for the point-to-point AWGN (additive white Gaussian noise) channel whose high-rate members enjoy thresholds that are within 0.07 dB of capacity. These LDPC relay codes address three important issues in an integrative manner: low encoding complexity, modular structure allowing for easy design, and rate compatibility so that the code can be easily matched to a variety of channel conditions without extensive

  11. State building energy codes status

    SciTech Connect

    1996-09-01

    This document contains the State Building Energy Codes Status prepared by Pacific Northwest National Laboratory for the U.S. Department of Energy under Contract DE-AC06-76RL01830 and dated September 1996. The U.S. Department of Energy's Office of Codes and Standards has developed this document to provide an information resource for individuals interested in energy efficiency of buildings and the relevant building energy codes in each state and U.S. territory. This is considered to be an evolving document and will be updated twice a year. In addition, special state updates will be issued as warranted.

  12. Facilitating Internet-Scale Code Retrieval

    ERIC Educational Resources Information Center

    Bajracharya, Sushil Krishna

    2010-01-01

    Internet-Scale code retrieval deals with the representation, storage, and access of relevant source code from a large amount of source code available on the Internet. Internet-Scale code retrieval systems support common emerging practices among software developers related to finding and reusing source code. In this dissertation we focus on some…

  13. Bandwidth efficient coding for satellite communications

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Costello, Daniel J., Jr.; Miller, Warner H.; Morakis, James C.; Poland, William B., Jr.

    1992-01-01

    An error control coding scheme was devised to achieve large coding gain and high reliability by using coded modulation with reduced decoding complexity. To achieve a 3 to 5 dB coding gain and moderate reliability, the decoding complexity is quite modest. In fact, to achieve a 3 dB coding gain, the decoding complexity is quite simple, no matter whether trellis coded modulation or block coded modulation is used. However, to achieve coding gains exceeding 5 dB, the decoding complexity increases drastically, and the implementation of the decoder becomes very expensive and impractical. The use of coded modulation in conjunction with concatenated (or cascaded) coding is proposed. A good short bandwidth-efficient modulation code is used as the inner code and a relatively powerful Reed-Solomon code is used as the outer code. With properly chosen inner and outer codes, a concatenated coded modulation scheme not only can achieve large coding gains and high reliability with good bandwidth efficiency but also can be practically implemented. This combination of coded modulation and concatenated coding offers a way of achieving the best of three worlds: reliability and coding gain, bandwidth efficiency, and decoding complexity.

  14. Verification of fluid-dynamic codes in the presence of shocks and other discontinuities

    NASA Astrophysics Data System (ADS)

    Nathan Woods, C.; Starkey, Ryan P.

    2015-08-01

    The verification that computer codes correctly solve their model equations is critical to the continued success of numerical simulation. The method of manufactured solutions (MMS) is the best method currently available for this kind of verification for differential equations. However, it cannot be used directly with discontinuous solutions, as is required for the verification of high-speed aerodynamic codes with shocks. An integrative approach can extend the applicability of MMS to both discontinuous solutions such as shocks or material interfaces, as well as integral equations. We present an implementation of integrative MMS based on intelligent subdivision of integration domains that is both highly accurate and fast, and results in a rigorous, one-step verification procedure for shock-capturing codes. Numerical integration is found to be accurate to machine precision when tested on exact solutions of the linear heat equation and the Euler equations, even in the presence of discontinuous flow features. Intelligent subdivision of integration domains also improves computational performance by approximately 60× compared to the same algorithm without intelligent subdivisions. We demonstrate the use of MMS in the verification of the BACL-Streamer inviscid gas dynamics code. Integral MMS is found to compute convergence rates that are equivalent to those computed using differential MMS, and comparable to those computed using discontinuous, exact solutions, suggesting integral MMS is a valid method for verification of both integral and shock-capturing codes.
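
    As a minimal illustration of the differential form of MMS (not the integrative variant developed in the paper), the sketch below manufactures a solution for the 1-D heat equation, derives the implied source term, and checks the observed convergence order of a simple finite-difference solver; the equation, grid sizes and solver are illustrative choices, not the paper's test cases.

```python
import numpy as np

# Method of manufactured solutions for u_t = u_xx + s(x, t) on [0, 1].
# Manufactured solution: u(x, t) = exp(-t) * sin(pi x)  =>  s = (pi^2 - 1) * u.
def manufactured(x, t):
    return np.exp(-t) * np.sin(np.pi * x)

def solve(nx, t_end=0.1):
    x = np.linspace(0.0, 1.0, nx)
    dx = x[1] - x[0]
    dt = 0.25 * dx**2                      # explicit stability limit
    u = manufactured(x, 0.0)
    t = 0.0
    while t < t_end - 1e-12:
        dt_step = min(dt, t_end - t)
        s = (np.pi**2 - 1.0) * manufactured(x, t)
        u[1:-1] += dt_step * ((u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2 + s[1:-1])
        t += dt_step
        u[0] = u[-1] = 0.0                 # exact boundary values
    return np.max(np.abs(u - manufactured(x, t)))

errors = [solve(n) for n in (17, 33, 65)]
for e_coarse, e_fine in zip(errors, errors[1:]):
    print(f"observed order ~ {np.log2(e_coarse / e_fine):.2f}")   # expect ~2
```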

  15. Internationalizing professional codes in engineering.

    PubMed

    Harris, Charles E

    2004-07-01

    Professional engineering societies which are based in the United States, such as the American Society of Mechanical Engineers (ASME, now ASME International), are recognizing that their codes of ethics must apply to engineers working throughout the world. An examination of the ethical code of ASME International shows that its provisions pose many problems of application, especially in societies outside the United States. In applying the codes effectively in the international environment, two principal issues must be addressed. First, some Culture Transcending Guidelines must be identified and justified; nine such guidelines are identified. Second, some methods for applying the codes to particular situations must be identified; three such methods are specification, balancing, and finding a creative middle way.

  16. FLYCHK Collisional-Radiative Code

    National Institute of Standards and Technology Data Gateway

    SRD 160 FLYCHK Collisional-Radiative Code (Web, free access)   FLYCHK provides a capability to generate atomic level populations and charge state distributions for low-Z to mid-Z elements under NLTE conditions.

  17. Seals Flow Code Development 1993

    NASA Technical Reports Server (NTRS)

    Liang, Anita D. (Compiler); Hendricks, Robert C. (Compiler)

    1994-01-01

    Seals Workshop of 1993 code releases include SPIRALI for spiral grooved cylindrical and face seal configurations; IFACE for face seals with pockets, steps, tapers, turbulence, and cavitation; GFACE for gas face seals with 'lift pad' configurations; and SCISEAL, a CFD code for research and design of seals of cylindrical configuration. GUI (graphical user interface) and code usage was discussed with hands on usage of the codes, discussions, comparisons, and industry feedback. Other highlights for the Seals Workshop-93 include environmental and customer driven seal requirements; 'what's coming'; and brush seal developments including flow visualization, numerical analysis, bench testing, T-700 engine testing, tribological pairing and ceramic configurations, and cryogenic and hot gas facility brush seal results. Also discussed are seals for hypersonic engines and dynamic results for spiral groove and smooth annular seals.

  18. NFPA's Hydrogen Technologies Code Project

    SciTech Connect

    Rivkin, C. H.

    2008-12-01

    This article discusses the development of National Fire Protection Association 2 (NFPA), a comprehensive hydrogen safety code. It analyses the contents of this document with particular attention focused on new requirements for setting hydrogen storage systems. These new requirements use computational fluid dynamic modeling and risk assessment procedures to develop requirements that are based on both technical analyses and defined risk criteria. The intent is to develop requirements based on procedures that can be replicated based on the information provided in the code document. This code will require documentation of the modeling inputs and risk criteria and analyses in the supporting information. This article also includes a description of the codes and standards that address hydrogen technologies in general.

  19. Property Control through Bar Coding.

    ERIC Educational Resources Information Center

    Kingma, Gerben J.

    1984-01-01

    A public utility company uses laser wands to read bar-coded labels on furniture and equipment. The system allows an 80 percent savings of the time required to create reports for inventory control. (MLF)

  20. Edge equilibrium code for tokamaks

    SciTech Connect

    Li, Xujing; Drozdov, Vladimir V.

    2014-01-15

    The edge equilibrium code (EEC) described in this paper is developed for simulations of the near-edge plasma using the finite element method. It solves the Grad-Shafranov equation in toroidal coordinates and uses adaptive grids aligned with magnetic field lines. Hermite finite elements are chosen for the numerical scheme. A fast Newton scheme, the same as that implemented in the equilibrium and stability code (ESC), is applied here to adjust the grids.

  1. Computer-Access-Code Matrices

    NASA Technical Reports Server (NTRS)

    Collins, Earl R., Jr.

    1990-01-01

    Authorized users respond to changing challenges with changing passwords. Scheme for controlling access to computers defeats eavesdroppers and "hackers". Based on password system of challenge and password or sign, challenge, and countersign correlated with random alphanumeric codes in matrices of two or more dimensions. Codes stored on floppy disk or plug-in card and changed frequently. For even higher security, matrices of four or more dimensions used, just as cubes compounded into hypercubes in concurrent processing.
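
    A toy version of the challenge/countersign lookup can be sketched as follows; the matrix size, code length and alphabet are arbitrary choices for illustration.

```python
import secrets
import string

# Toy version of the scheme described above: a 2-D matrix of random
# alphanumeric codes, indexed by a (row, column) challenge issued by the host.
ALPHABET = string.ascii_uppercase + string.digits

def make_matrix(rows=10, cols=10, code_len=4):
    return [[''.join(secrets.choice(ALPHABET) for _ in range(code_len))
             for _ in range(cols)] for _ in range(rows)]

matrix = make_matrix()          # stored on the user's card/disk and on the host

# Host issues a random challenge; the user answers with that cell's contents.
challenge = (secrets.randbelow(10), secrets.randbelow(10))
response = matrix[challenge[0]][challenge[1]]
print(f"challenge {challenge} -> countersign {response}")

# Host-side check (in practice the matrix would be replaced frequently).
assert response == matrix[challenge[0]][challenge[1]]
```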

  2. Product Work Classification and Coding

    DTIC Science & Technology

    1986-06-01

    detail is much more useful in planning steel welding processes. In this regard remember that mild steel, HSLA steel, and high-yield steel (e.g. HY80 ...manufacturing facility. In Figure 2.3-2, a classification and coding system for steel parts is shown. This classification and coding system sorts steel parts...system would provide a shop which produced steel parts with a means of organizing parts. Rather than attempting to manage all of its parts as a single

  3. UNIX code management and distribution

    SciTech Connect

    Hung, T.; Kunz, P.F.

    1992-09-01

    We describe a code management and distribution system based on tools freely available for the UNIX systems. At the master site, version control is managed with CVS, which is a layer on top of RCS, and distribution is done via NFS mounted file systems. At remote sites, small modifications to CVS provide for interactive transactions with the CVS system at the master site such that remote developers are true peers in the code development process.

  4. The integration of system specifications and program coding

    NASA Technical Reports Server (NTRS)

    Luebke, W. R.

    1970-01-01

    Experience in maintaining up-to-date documentation for one module of the large-scale Medical Literature Analysis and Retrieval System 2 (MEDLARS 2) is described. Several innovative techniques were explored in the development of this system's data management environment, particularly those that use PL/I as an automatic documenter. The PL/I data description section can provide automatic documentation by means of a master description of data elements that has long and highly meaningful mnemonic names and a formalized technique for the production of descriptive commentary. The techniques discussed are practical methods that employ the computer during system development in a manner that assists system implementation, provides interim documentation for customer review, and satisfies some of the deliverable documentation requirements.

  5. The Evolution of a Coding Schema in a Paced Program of Research

    ERIC Educational Resources Information Center

    Winters, Charlene A.; Cudney, Shirley; Sullivan, Therese

    2010-01-01

    A major task involved in the management, analysis, and integration of qualitative data is the development of a coding schema to facilitate the analytic process. Described in this paper is the evolution of a coding schema that was used in the analysis of qualitative data generated from online forums of middle-aged women with chronic conditions who…

  6. How Effective Is Honor Code Reporting over Instructor- Implemented Measures? A Pilot Study

    ERIC Educational Resources Information Center

    Barnard-Brak, Lucy; Schmidt, Marcelo; Wei, Tianlan

    2013-01-01

    Honor codes have become increasingly popular at institutions of higher education as a means of reducing violations of academic integrity such as cheating and other academically dishonest acts. Previous research on honor code effectiveness has been limited to correlational research designs that preclude the examination of cause-and-effect…

  7. Automatic Coding of Short Text Responses via Clustering in Educational Assessment

    ERIC Educational Resources Information Center

    Zehner, Fabian; Sälzer, Christine; Goldhammer, Frank

    2016-01-01

    Automatic coding of short text responses opens new doors in assessment. We implemented and integrated baseline methods of natural language processing and statistical modelling by means of software components that are available under open licenses. The accuracy of automatic text coding is demonstrated by using data collected in the "Programme…
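
    A minimal sketch of such a pipeline, assuming a TF-IDF representation and k-means clustering as generic stand-ins for the components actually used in the study, is shown below (requires scikit-learn).

```python
# Illustrative pipeline only; the study's actual components and settings are not reproduced.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

responses = [
    "plants need sunlight to make food",
    "photosynthesis uses light energy",
    "the sun gives plants energy for photosynthesis",
    "i do not know",
    "no idea",
    "it has something to do with water",
]

# Represent the short answers as TF-IDF vectors and group them into clusters;
# a human coder then labels each cluster (e.g. "correct", "don't know", "other").
vectors = TfidfVectorizer().fit_transform(responses)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

for text, label in zip(responses, labels):
    print(f"cluster {label}: {text}")
```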

  8. The Proteomic Code: a molecular recognition code for proteins

    PubMed Central

    Biro, Jan C

    2007-01-01

    Background The Proteomic Code is a set of rules by which information in genetic material is transferred into the physico-chemical properties of amino acids. It determines how individual amino acids interact with each other during folding and in specific protein-protein interactions. The Proteomic Code is part of the redundant Genetic Code. Review The 25-year-old history of this concept is reviewed from the first independent suggestions by Biro and Mekler, through the works of Blalock, Root-Bernstein, Siemion, Miller and others, followed by the discovery of a Common Periodic Table of Codons and Nucleic Acids in 2003 and culminating in the recent conceptualization of partial complementary coding of interacting amino acids as well as the theory of the nucleic acid-assisted protein folding. Methods and conclusions A novel cloning method for the design and production of specific, high-affinity-reacting proteins (SHARP) is presented. This method is based on the concept of proteomic codes and is suitable for large-scale, industrial production of specifically interacting peptides. PMID:17999762

  9. Cleanup MAC and MBA code ATP

    SciTech Connect

    Russell, V.K.

    1994-10-17

    The K Basins Materials Accounting (MAC) and Material Balance (MBA) database system had some minor cleanup performed on its code. This ATP describes how the code was to be tested to verify its correctness.

  10. Codes That Support Smart Growth Development

    EPA Pesticide Factsheets

    Provides examples of local zoning codes that support smart growth development, categorized by: unified development code, form-based code, transit-oriented development, design guidelines, street design standards, and zoning overlay.

  11. Tribal Green Building Administrative Code Example

    EPA Pesticide Factsheets

    This Tribal Green Building Administrative Code Example can be used as a template for technical code selection (i.e., building, electrical, plumbing, etc.) to be adopted as a comprehensive building code.

  12. 75 FR 19944 - International Code Council: The Update Process for the International Codes and Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-16

    ... National Institute of Standards and Technology International Code Council: The Update Process for the International Codes and Standards AGENCY: National Institute of Standards and Technology, Commerce. ACTION: Notice. SUMMARY: The International Code Council (ICC), promulgator of the International Codes...

  13. 78 FR 18321 - International Code Council: The Update Process for the International Codes and Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ... National Institute of Standards and Technology International Code Council: The Update Process for the International Codes and Standards AGENCY: National Institute of Standards and Technology, Commerce. ACTION: Notice. SUMMARY: The International Code Council (ICC), promulgator of the International Codes...

  14. [The QR code in society, economy and medicine--fields of application, options and chances].

    PubMed

    Flaig, Benno; Parzeller, Markus

    2011-01-01

    2D codes like the QR Code ("Quick Response") are becoming more and more common in society and medicine. The application spectrum and benefits in medicine and other fields are described. 2D codes can be created free of charge on any computer with internet access without any previous knowledge. The codes can be easily used in publications, presentations, on business cards and posters. Editors choose between contact details, text or a hyperlink as information behind the code. At expert conferences, linkage by QR Code allows the audience to download presentations and posters quickly. The documents obtained can then be saved, printed, processed etc. Fast access to stored data in the internet makes it possible to integrate additional and explanatory multilingual videos into medical posters. In this context, a combination of different technologies (printed handout, QR Code and screen) may be reasonable.
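
    Generating such a code is straightforward; the sketch below uses the third-party Python package qrcode as one possible tool, and the URL is a placeholder.

```python
# Minimal example of creating a QR code that links to supplementary material.
# Requires the third-party "qrcode" package (pip install qrcode[pil]).
import qrcode

img = qrcode.make("https://example.org/poster-supplement")   # placeholder URL
img.save("poster_qr.png")
print("QR code written to poster_qr.png")
```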

  15. Beam-splitting code for light scattering by ice crystal particles within geometric-optics approximation

    NASA Astrophysics Data System (ADS)

    Konoshonkin, Alexander V.; Kustova, Natalia V.; Borovoi, Anatoli G.

    2015-10-01

    The open-source beam-splitting code is described, which implements the geometric-optics approximation to light scattering by convex faceted particles. This code is written in C++ as a library that can be easily applied to a particular light scattering problem. The code uses only standard components, which makes it a cross-platform solution and ensures its compatibility with popular Integrated Development Environments (IDEs). The included example, which solves light scattering by a randomly oriented ice crystal, is written using Qt 5.1 and is consequently a cross-platform solution, too. Both physical and computational aspects of the beam-splitting algorithm are discussed. The computational speed of the beam-splitting code is clearly higher than that of conventional ray-tracing codes. A comparison of the phase matrix as computed by our code with the ray-tracing code by A. Macke shows excellent agreement.

  16. Summary of papers on current and anticipated uses of thermal-hydraulic codes

    SciTech Connect

    Caruso, R.

    1997-07-01

    The author reviews a range of recent papers which discuss possible uses and future development needs for thermal/hydraulic codes in the nuclear industry. From this review, eight common recommendations are extracted. They are: improve the user interface so that more people can use the code, so that models are easier and less expensive to prepare and maintain, and so that the results are scrutable; design the code so that it can easily be coupled to other codes, such as core physics, containment, and fission product behaviour during severe accidents; improve the numerical methods to make the code more robust and especially faster running, particularly for low pressure transients; ensure that future code development includes assessment of code uncertainties as an integral part of code verification and validation; provide extensive user guidelines or structure the code so that the 'user effect' is minimized; include the capability to model multiple fluids (gas and liquid phase); design the code in a modular fashion so that new models can be added easily; provide the ability to include detailed or simplified component models; build on work previously done with other codes (RETRAN, RELAP, TRAC, CATHARE) and other code validation efforts (CSAU, CSNI SET and IET matrices).

  17. Evolution of the genetic code.

    PubMed

    Davis, B K

    1999-01-01

    Comparative path lengths in amino acid biosynthesis and other molecular indicators of the timing of codon assignment were examined to reconstruct the main stages of code evolution. The codon tree obtained was rooted in the 4 N-fixing amino acids (Asp, Glu, Asn, Gln) and 16 triplets of the NAN set. This small, locally phased (commaless) code evidently arose from ambiguous translation on a poly(A) collector strand, in a surface reaction network. Copolymerisation of these amino acids yields polyanionic peptide chains, which could anchor uncharged amide residues to a positively charged mineral surface. From RNA virus structure and replication in vitro, the first genes seemed to be RNA segments spliced into tRNA. Expansion of the code reduced the risk of mutation to an unreadable codon. This step was conditional on initiation at the 5'-codon of a translated sequence. Incorporation of increasingly hydrophobic amino acids accompanied expansion. As codons of the NUN set were assigned most slowly, they received the most nonpolar amino acids. The origin of ferredoxin and Gln synthetase was traced to mid-expansion phase. Surface metabolism ceased by the end of code expansion, as cells bounded by a proteo-phospholipid membrane, with a protoATPase, had emerged. Incorporation of positively charged and aromatic amino acids followed. They entered the post-expansion code by codon capture. Synthesis of efficient enzymes with acid-base catalysis was then possible. Both types of aminoacyl-tRNA synthetases were attributed to this stage. tRNA sequence diversity and error rates in RNA replication indicate the code evolved within 20 million yr in the preIsuan era. These findings on the genetic code provide empirical evidence, from a contemporaneous source, that a surface reaction network, centred on C-fixing autocatalytic cycles, rapidly led to cellular life on Earth.

  18. NASA Rotor 37 CFD Code Validation: Glenn-HT Code

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.

    2010-01-01

    In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.

  19. Increasing Flexibility in Energy Code Compliance: Performance Packages

    SciTech Connect

    Hart, Philip R.; Rosenberg, Michael I.

    2015-06-30

    Energy codes and standards have provided significant increases in building efficiency over the last 38 years, since the first national energy code was published in late 1975. The most commonly used path in energy codes, the prescriptive path, appears to be reaching a point of diminishing returns. As the code matures, the prescriptive path becomes more complicated, and also more restrictive. It is likely that an approach that considers the building as an integrated system will be necessary to achieve the next real gains in building efficiency. Performance code paths are increasing in popularity; however, there remains a significant design-team overhead in following the performance path, especially for smaller buildings. This paper focuses on the development of one alternative format, prescriptive packages. A method to develop building-specific prescriptive packages is reviewed, based on multiple runs of prototypical building models that are used in a parametric decision analysis to determine a set of packages with equivalent energy performance. The approach is designed to be cost-effective and flexible for the design team while achieving a desired level of energy efficiency performance. A demonstration of the approach based on mid-sized office buildings with two HVAC system types is shown, along with a discussion of potential applicability in the energy code process.

  20. A systems neurophysiology approach to voluntary event coding.

    PubMed

    Petruo, Vanessa A; Stock, Ann-Kathrin; Münchau, Alexander; Beste, Christian

    2016-07-15

    Mechanisms responsible for the integration of perceptual events and appropriate actions (sensorimotor processes) have been subject to intense research. Different theoretical frameworks have been put forward, with the "Theory of Event Coding" (TEC) being one of the most influential. In the current study, we focus on the concept of 'event files' within TEC and examine which sub-processes, dissociable by means of cognitive-neurophysiological methods, are involved in voluntary event coding. This was combined with EEG source localization. We also introduce reward manipulations to delineate the neurophysiological sub-processes most relevant for performance variations during event coding. The results show that the processes involved in voluntary event coding included predominantly stimulus categorization, feature unbinding and response selection, which were reflected by distinct neurophysiological processes (the P1, N2 and P3 ERPs). On a systems neurophysiological level, voluntary event-file coding is thus related to widely distributed parietal-medial frontal networks. Attentional selection processes (the N1 ERP) turned out to be less important. Reward modulated stimulus categorization in parietal regions, likely reflecting aspects of perceptual decision making, but did not modulate other processes. The perceptual categorization stage appears central for voluntary event-file coding.

  1. Roadmap for the Future of Commercial Energy Codes

    SciTech Connect

    Rosenberg, Michael I.; Hart, Philip R.; Zhang, Jian; Athalye, Rahul A.

    2015-01-01

    Building energy codes have significantly increased building efficiency over the last 38 years, since the first national energy code was published in 1975. The most commonly used path in energy codes, the prescriptive path, appears to be reaching a point of diminishing returns. The current focus on prescriptive codes has limitations including significant variation in actual energy performance depending on which prescriptive options are chosen, a lack of flexibility for designers and developers, the inability to handle optimization that is specific to building type and use, the inability to account for project-specific energy costs, and the lack of follow-through or accountability after a certificate of occupancy is granted. It is likely that an approach that considers the building as an integrated system will be necessary to achieve the next real gains in building efficiency. This report provides a high-level review of different formats for commercial building energy codes, including prescriptive, prescriptive packages, capacity constrained, outcome based, and predictive performance approaches. This report also explores a next generation commercial energy code approach that places a greater emphasis on performance-based criteria.

  2. Coordinated design of coding and modulation systems

    NASA Technical Reports Server (NTRS)

    Massey, J. L.; Ancheta, T.; Johannesson, R.; Lauer, G.; Lee, L.

    1976-01-01

    The joint optimization of the coding and modulation systems employed in telemetry systems was investigated. Emphasis was placed on formulating inner and outer coding standards used by the Goddard Spaceflight Center. Convolutional codes were found that are nearly optimum for use with Viterbi decoding in the inner coding of concatenated coding systems. A convolutional code, the unit-memory code, was discovered and is ideal for inner system usage because of its byte-oriented structure. Simulations of sequential decoding on the deep-space channel were carried out to compare directly various convolutional codes that are proposed for use in deep-space systems.
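
    A generic sketch may help readers picture the kind of inner code discussed here: the Python below implements a rate-1/2 feed-forward convolutional encoder. The constraint length and generator pair (7; 171 and 133 octal) are a commonly cited example chosen purely for illustration, not values taken from this study.

```python
# Illustrative rate-1/2 convolutional encoder (constraint length 7,
# generators 171/133 octal used as an example pair, not the study's codes).

G1, G2 = 0o171, 0o133   # generator polynomials (taps on the shift register)
K = 7                   # constraint length

def conv_encode(bits):
    """Return two coded bits per input bit (no tail bits added here)."""
    state, out = 0, []
    for b in bits:
        state = ((state << 1) | b) & ((1 << K) - 1)   # shift in the new bit
        out.append(bin(state & G1).count("1") % 2)    # parity over G1 taps
        out.append(bin(state & G2).count("1") % 2)    # parity over G2 taps
    return out

print(conv_encode([1, 0, 1, 1, 0, 0, 0]))
```

    A Viterbi decoder for the same trellis would invert this mapping; it is omitted here for brevity.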

  3. More About Vector Adaptive/Predictive Coding Of Speech

    NASA Technical Reports Server (NTRS)

    Jedrey, Thomas C.; Gersho, Allen

    1992-01-01

    Report presents additional information about digital speech-encoding and -decoding system described in "Vector Adaptive/Predictive Encoding of Speech" (NPO-17230). Summarizes development of vector adaptive/predictive coding (VAPC) system and describes basic functions of algorithm. Describes refinements introduced enabling receiver to cope with errors. VAPC algorithm implemented in integrated-circuit coding/decoding processors (codecs). VAPC and other codecs tested under variety of operating conditions. Tests designed to reveal effects of various background quiet and noisy environments and of poor telephone equipment. VAPC found competitive with and, in some respects, superior to other 4.8-kb/s codecs and other codecs of similar complexity.

  4. 76 FR 77549 - Lummi Nation-Title 20-Code of Laws-Liquor Code

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-13

    ... Bureau of Indian Affairs Lummi Nation--Title 20--Code of Laws--Liquor Code AGENCY: Bureau of Indian...--Code of Laws--Liquor Code. The Code regulates and controls the possession, sale and consumption of... this Code allows for the possession and sale of alcoholic beverages within the Lummi...

  5. Convergence Acceleration and Documentation of CFD Codes for Turbomachinery Applications

    NASA Technical Reports Server (NTRS)

    Marquart, Jed E.

    2005-01-01

    The development and analysis of turbomachinery components for industrial and aerospace applications have been greatly enhanced in recent years through the advent of computational fluid dynamics (CFD) codes and techniques. Although the use of this technology has greatly reduced the time required to perform analysis and design, there still remains much room for improvement in the process. In particular, there is a steep learning curve associated with most turbomachinery CFD codes, and the computation times need to be reduced in order to facilitate their integration into standard work processes. Two turbomachinery codes have recently been developed by Dr. Daniel Dorney (MSFC) and Dr. Douglas Sondak (Boston University). These codes are entitled Aardvark (for 2-D and quasi 3-D simulations) and Phantom (for 3-D simulations). The codes utilize the General Equation Set (GES), structured grid methodology, and overset O- and H-grids. The codes have been used with success by Drs. Dorney and Sondak, as well as others within the turbomachinery community, to analyze engine components and other geometries. One of the primary objectives of this study was to establish a set of parametric input values which will enhance convergence rates for steady state simulations, as well as reduce the runtime required for unsteady cases. The goal is to reduce the turnaround time for CFD simulations, thus permitting more design parametrics to be run within a given time period. In addition, other code enhancements to reduce runtimes were investigated and implemented. The other primary goal of the study was to develop enhanced user's manuals for Aardvark and Phantom. These manuals are intended to answer most questions for new users, as well as provide valuable detailed information for the experienced user. The existence of detailed user's manuals will enable new users to become proficient with the codes, as well as reduce the dependency of new users on the code authors. In order to achieve the

  6. International assessment of PCA codes

    SciTech Connect

    Neymotin, L.; Lui, C.; Glynn, J.; Archarya, S.

    1993-11-01

    Over the past three years (1991-1993), an extensive international exercise for intercomparison of a group of six Probabilistic Consequence Assessment (PCA) codes was undertaken. The exercise was jointly sponsored by the Commission of European Communities (CEC) and OECD Nuclear Energy Agency. This exercise was a logical continuation of a similar effort undertaken by OECD/NEA/CSNI in 1979-1981. The PCA codes are currently used by different countries for predicting radiological health and economic consequences of severe accidents at nuclear power plants (and certain types of non-reactor nuclear facilities) resulting in releases of radioactive materials into the atmosphere. The codes participating in the exercise were: ARANO (Finland), CONDOR (UK), COSYMA (CEC), LENA (Sweden), MACCS (USA), and OSCAAR (Japan). In parallel with this inter-code comparison effort, two separate groups performed a similar set of calculations using two of the participating codes, MACCS and COSYMA. Results of the intercode and inter-MACCS comparisons are presented in this paper. The MACCS group included four participants: GREECE: Institute of Nuclear Technology and Radiation Protection, NCSR Demokritos; ITALY: ENEL, ENEA/DISP, and ENEA/NUC-RIN; SPAIN: Universidad Politecnica de Madrid (UPM) and Consejo de Seguridad Nuclear; USA: Brookhaven National Laboratory, US NRC and DOE.

  7. AEST: Adaptive Eigenvalue Stability Code

    NASA Astrophysics Data System (ADS)

    Zheng, L.-J.; Kotschenreuther, M.; Waelbroeck, F.; van Dam, J. W.; Berk, H.

    2002-11-01

    An adaptive eigenvalue linear stability code is developed. The aim is, on one hand, to include non-ideal MHD effects in the global MHD stability calculation for both low and high n modes and, on the other hand, to resolve the numerical difficulty involving the MHD singularity on the rational surfaces at marginal stability. Our code follows parts of the philosophy of DCON by abandoning relaxation methods based on radial finite element expansion in favor of an efficient shooting procedure with adaptive gridding. The δW criterion is replaced by the shooting procedure and a subsequent matrix eigenvalue problem. Since the technique of expanding a general solution into a summation of independent solutions is employed, the rank of the matrices involved is only a few hundred. This makes it easier to solve the eigenvalue problem with non-ideal MHD effects, such as FLR or even full kinetic effects, as well as plasma rotation effects, taken into account. To include kinetic effects, the approach of solving for the distribution function as a local eigenvalue ω problem, as in the GS2 code, will be employed in the future. Comparison of the ideal MHD version of the code with DCON, PEST, and GATO will be discussed. The non-ideal MHD version of the code will be employed to study, as an application, the transport barrier physics in tokamak discharges.
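
    As a hedged aside on what a shooting formulation of an eigenvalue problem looks like in practice, the toy Python example below recovers the lowest eigenvalue of -u'' = λu on [0, π] with u(0) = u(π) = 0 by integrating from one boundary and bisecting on λ until the far boundary condition is satisfied. It is only a model problem, not the MHD system treated by AEST.

```python
# Shooting-method sketch for a model eigenvalue problem:
#   -u'' = lam * u,  u(0) = u(pi) = 0   (exact lowest eigenvalue: lam = 1)
# Illustrative only; the AEST code solves a far more complex MHD system.
import numpy as np

def shoot(lam, n=2000):
    """Integrate u'' = -lam*u from x=0 with u=0, u'=1; return u(pi)."""
    h = np.pi / (n - 1)
    u, up = 0.0, 1.0
    for _ in range(n - 1):            # midpoint (RK2) integration
        k1u, k1p = up, -lam * u
        k2u = up + 0.5 * h * k1p
        k2p = -lam * (u + 0.5 * h * k1u)
        u += h * k2u
        up += h * k2p
    return u

lo, hi = 0.5, 1.5                     # bracket known to contain the eigenvalue
for _ in range(60):                   # bisection on the boundary residual u(pi)
    mid = 0.5 * (lo + hi)
    if shoot(lo) * shoot(mid) <= 0.0:
        hi = mid
    else:
        lo = mid
print("lowest eigenvalue ~", 0.5 * (lo + hi))   # converges to ~1.0
```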

  8. A genetic scale of reading frame coding.

    PubMed

    Michel, Christian J

    2014-08-21

    The reading frame coding (RFC) of codes (sets) of trinucleotides is a genetic concept which has been largely ignored during the last 50 years. A first objective is the definition of a new and simple statistical parameter PrRFC for analysing the probability (efficiency) of reading frame coding (RFC) of any trinucleotide code. A second objective is to reveal different classes and subclasses of trinucleotide codes involved in reading frame coding: the circular codes of 20 trinucleotides and the bijective genetic codes of 20 trinucleotides coding the 20 amino acids. This approach allows us to propose a genetic scale of reading frame coding which ranges from 1/3 with the random codes (RFC probability identical in the three frames) to 1 with the comma-free circular codes (RFC probability maximal in the reading frame and null in the two shifted frames). This genetic scale shows, in particular, the reading frame coding probabilities of the 12,964,440 circular codes (PrRFC=83.2% on average), the 216 C(3) self-complementary circular codes (PrRFC=84.1% on average) including the code X identified in eukaryotic and prokaryotic genes (PrRFC=81.3%) and the 339,738,624 bijective genetic codes (PrRFC=61.5% on average) including the 52 codes without permuted trinucleotides (PrRFC=66.0% on average). Otherwise, the reading frame coding probabilities of each trinucleotide code coding an amino acid with the universal genetic code are also determined. The four amino acids Gly, Lys, Phe and Pro are coded by codes (not circular) with RFC probabilities equal to 2/3, 1/2, 1/2 and 2/3, respectively. The amino acid Leu is coded by a circular code (not comma-free) with an RFC probability equal to 18/19. The 15 other amino acids are coded by comma-free circular codes, i.e. with RFC probabilities equal to 1. The identification of coding properties in some classes of trinucleotide codes studied here may bring new insights into the origin and evolution of the genetic code.
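
    To make the comma-free notion concrete (this is an illustrative check, not the paper's PrRFC statistic), a set of trinucleotides is comma-free when no codeword appears in either shifted frame of any concatenation of two codewords, which can be tested directly:

```python
# Minimal sketch: test whether a trinucleotide code is comma-free, i.e. no
# codeword occurs in a shifted reading frame of any pair w1 + w2.
# Illustrative only; this is not the paper's PrRFC computation.

def is_comma_free(code):
    code = set(code)
    for w1 in code:
        for w2 in code:
            s = w1 + w2
            if s[1:4] in code or s[2:5] in code:   # the two shifted frames
                return False
    return True

print(is_comma_free({"AAC", "ATC", "GTC"}))   # True for this small example
print(is_comma_free({"AAA", "AAT"}))          # False: "AAA" recurs in a shifted frame
```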

  9. ICAN Computer Code Adapted for Building Materials

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.

    1997-01-01

    The NASA Lewis Research Center has been involved in developing composite micromechanics and macromechanics theories over the last three decades. These activities have resulted in several composite mechanics theories and structural analysis codes whose applications range from material behavior design and analysis to structural component response. One of these computer codes, the Integrated Composite Analyzer (ICAN), is designed primarily to address issues related to designing polymer matrix composites and predicting their properties - including hygral, thermal, and mechanical load effects. Recently, under a cost-sharing cooperative agreement with a Fortune 500 corporation, Master Builders Inc., ICAN was adapted to analyze building materials. The high costs and technical difficulties involved with the fabrication of continuous-fiber-reinforced composites sometimes limit their use. Particulate-reinforced composites can be thought of as a viable alternative. They are as easily processed to near-net shape as monolithic materials, yet have the improved stiffness, strength, and fracture toughness that is characteristic of continuous-fiber-reinforced composites. For example, particle-reinforced metal-matrix composites show great potential for a variety of automotive applications, such as disk brake rotors, connecting rods, cylinder liners, and other high-temperature applications. Building materials, such as concrete, can be thought of as one of the oldest materials in this category of multiphase, particle-reinforced materials. The adaptation of ICAN to analyze particle-reinforced composite materials involved the development of new micromechanics-based theories. A derivative of the ICAN code, ICAN/PART, was developed and delivered to Master Builders Inc. as a part of the cooperative activity.

  10. Experience with advanced nodal codes at YAEC

    SciTech Connect

    Cacciapouti, R.J.

    1990-01-01

    Yankee Atomic Electric Company (YAEC) has been performing reload licensing analysis since 1969. The basic pressurized water reactor (PWR) methodology involves the use of LEOPARD for cross-section generation, PDQ for radial power distributions and integral control rod worth, and SIMULATE for axial power distributions and differential control rod worth. In 1980, YAEC began performing reload licensing analysis for the Vermont Yankee boiling water reactor (BWR). The basic BWR methodology involves the use of CASMO for cross-section generation and SIMULATE for three-dimensional power distributions. In 1986, YAEC began investigating the use of CASMO-3 for cross-section generation and the advanced nodal code SIMULATE-3 for power distribution analysis. Based on the evaluation, the CASMO-3/SIMULATE-3 methodology satisfied all requirements. After careful consideration, the cost of implementing the new methodology is expected to be offset by reduced computing costs, improved engineering productivity, and fuel-cycle performance gains.

  11. Expanding the genetic code of Mus musculus

    PubMed Central

    Han, Songmi; Yang, Aerin; Lee, Soonjang; Lee, Han-Woong; Park, Chan Bae; Park, Hee-Sung

    2017-01-01

    Here we report the expansion of the genetic code of Mus musculus with various unnatural amino acids including Nɛ-acetyl-lysine. Stable integration of transgenes encoding an engineered Nɛ-acetyl-lysyl-tRNA synthetase (AcKRS)/tRNAPyl pair into the mouse genome enables site-specific incorporation of unnatural amino acids into a target protein in response to the amber codon. We demonstrate temporal and spatial control of protein acetylation in various organs of the transgenic mouse using a recombinant green fluorescent protein (GFPuv) as a model protein. This strategy will provide a powerful tool for systematic in vivo study of cellular proteins in the most commonly used mammalian model organism for human physiology and disease. PMID:28220771

  12. Roadmap to Majorana surface codes

    NASA Astrophysics Data System (ADS)

    Plugge, S.; Landau, L. A.; Sela, E.; Altland, A.; Flensberg, K.; Egger, R.

    2016-11-01

    Surface codes offer a very promising avenue towards fault-tolerant quantum computation. We argue that two-dimensional interacting networks of Majorana bound states in topological superconductor/semiconductor heterostructures hold several key advantages in that direction, concerning both the hardware realization and the actual operation of the code. We here discuss how topologically protected logical qubits in this Majorana surface code architecture can be defined, initialized, manipulated, and read out. All physical ingredients needed to implement these operations are routinely used in topologically trivial quantum devices. By means of quantum interference terms in linear conductance measurements, single-electron pumping protocols, and gate-tunable tunnel barriers, the full set of quantum gates required for universal quantum computation can be achieved. In particular, we show that designated multistep pumping sequences via tunnel-coupled quantum dots realize high-fidelity ancilla states for phase gates.

  13. ASME Code Efforts Supporting HTGRs

    SciTech Connect

    D.K. Morton

    2012-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  14. A coded tracking telemetry system

    USGS Publications Warehouse

    Howey, P.W.; Seegar, W.S.; Fuller, M.R.; Titus, K.; Amlaner, Charles J.

    1989-01-01

    We describe the general characteristics of an automated radio telemetry system designed to operate for prolonged periods on a single frequency. Each transmitter sends a unique coded signal to a receiving system that decodes and records only the appropriate, pre-programmed codes. A record of the time of each reception is stored on diskettes in a micro-computer. This system enables continuous monitoring of infrequent signals (e.g. one per minute or one per hour), thus extending operation life or allowing size reduction of the transmitter, compared to conventional wildlife telemetry. Furthermore, when using unique codes transmitted on a single frequency, biologists can monitor many individuals without exceeding the radio frequency allocations for wildlife.

  15. FLOWTRAN-TF code description

    SciTech Connect

    Flach, G.P.

    1990-12-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss of Coolant Accident (LOCA). This report provides a brief description of the physical models in the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit. This document is viewed as an interim report and should ultimately be superseded by a comprehensive user/programmer manual. In general, only high level discussions of governing equations and constitutive laws are presented. Numerical implementation of these models, code architecture and user information are not generally covered. A companion document describing code benchmarking is available.

  16. FLOWTRAN-TF code description

    SciTech Connect

    Flach, G.P.

    1991-09-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss of Coolant Accident (LOCA). This report provides a brief description of the physical models in the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit. This document is viewed as an interim report and should ultimately be superseded by a comprehensive user/programmer manual. In general, only high level discussions of governing equations and constitutive laws are presented. Numerical implementation of these models, code architecture and user information are not generally covered. A companion document describing code benchmarking is available.

  17. ASME Code Efforts Supporting HTGRs

    SciTech Connect

    D.K. Morton

    2011-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  18. ASME Code Efforts Supporting HTGRs

    SciTech Connect

    D.K. Morton

    2010-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  19. The stellar atmosphere simulation code Bifrost. Code description and validation

    NASA Astrophysics Data System (ADS)

    Gudiksen, B. V.; Carlsson, M.; Hansteen, V. H.; Hayek, W.; Leenaarts, J.; Martínez-Sykora, J.

    2011-07-01

    Context. Numerical simulations of stellar convection and photospheres have been developed to the point where detailed shapes of observed spectral lines can be explained. Stellar atmospheres are very complex, and very different physical regimes are present in the convection zone, photosphere, chromosphere, transition region and corona. To understand the details of the atmosphere it is necessary to simulate the whole atmosphere since the different layers interact strongly. These physical regimes are very diverse and it takes a highly efficient massively parallel numerical code to solve the associated equations. Aims: The design, implementation and validation of the massively parallel numerical code Bifrost for simulating stellar atmospheres from the convection zone to the corona. Methods: The code is subjected to a number of validation tests, among them the Sod shock tube test, the Orszag-Tang colliding shock test, boundary condition tests and tests of how the code treats magnetic field advection, chromospheric radiation, radiative transfer in an isothermal scattering atmosphere, hydrogen ionization and thermal conduction. Results: Bifrost completes the tests with good results and shows near linear efficiency scaling to thousands of computing cores.

  20. Simulation of Code Spectrum and Code Flow of Cultured Neuronal Networks.

    PubMed

    Tamura, Shinichi; Nishitani, Yoshi; Hosokawa, Chie; Miyoshi, Tomomitsu; Sawai, Hajime

    2016-01-01

    It has been shown that, in cultured neuronal networks on a multielectrode, pseudorandom-like sequences (codes) are detected, and they flow with some spatial decay constant. Each cultured neuronal network is characterized by a specific spectrum curve. That is, we may consider the spectrum curve as a "signature" of its associated neuronal network that is dependent on the characteristics of neurons and network configuration, including the weight distribution. In the present study, we used an integrate-and-fire model of neurons with intrinsic and instantaneous fluctuations of characteristics for performing a simulation of a code spectrum from multielectrodes on a 2D mesh neural network. We showed that it is possible to estimate the characteristics of neurons such as the distribution of number of neurons around each electrode and their refractory periods. Although this process is a reverse problem and theoretically the solutions are not sufficiently guaranteed, the parameters seem to be consistent with those of neurons. That is, the proposed neural network model may adequately reflect the behavior of a cultured neuronal network. Furthermore, we discuss the prospect that code analysis will provide a basis for communication within a neural network, which may in turn form a basis for natural intelligence.
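
    For orientation, a bare-bones leaky integrate-and-fire neuron of the general kind used in such simulations can be written in a few lines. The parameters and noisy drive below are arbitrary placeholders; the paper's model additionally includes intrinsic parameter fluctuations, a 2D mesh network and multielectrode readout.

```python
# Minimal leaky integrate-and-fire neuron (schematic; parameter values are
# placeholders, not those of the cultured-network model in the paper).
import numpy as np

dt, tau = 0.1, 10.0                 # time step and membrane time constant (ms)
v_rest, v_th, v_reset = 0.0, 1.0, 0.0
rng = np.random.default_rng(0)

v, spike_times = v_rest, []
for i in range(5000):
    drive = 0.12 + 0.05 * rng.standard_normal()   # noisy input current
    v += dt * (-(v - v_rest) / tau + drive)       # membrane integration
    if v >= v_th:                                 # threshold crossing -> spike
        spike_times.append(i * dt)
        v = v_reset                               # reset after firing
print(f"{len(spike_times)} spikes in {5000 * dt:.0f} ms")
```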

  1. Hybrid codes: Methods and applications

    SciTech Connect

    Winske, D. ); Omidi, N. )

    1991-01-01

    In this chapter we discuss "hybrid" algorithms used in the study of low frequency electromagnetic phenomena, where one or more ion species are treated kinetically via standard PIC methods used in particle codes and the electrons are treated as a single charge neutralizing massless fluid. Other types of hybrid models are possible, as discussed in Winske and Quest, but hybrid codes with particle ions and massless fluid electrons have become the most common for simulating space plasma physics phenomena in the last decade, as we discuss in this paper.

  2. Sensor Authentication: Embedded Processor Code

    SciTech Connect

    Svoboda, John

    2012-09-25

    Described is the C code running on the embedded Microchip 32-bit PIC32MX575F256H located on the INL-developed noise analysis circuit board. The code performs the following functions: controls the noise analysis circuit board preamplifier voltage gains of 1, 10, 100, 000; initializes the analog-to-digital conversion hardware, input channel selection, Fast Fourier Transform (FFT) function, USB communications interface, and internal memory allocations; initiates high-resolution 4096-point, 200 kHz data acquisition; computes the complex 2048-point FFT and FFT magnitude; services the Host command set; transfers raw data to the Host; transfers the FFT result to the Host; and performs communication error checking.

  3. Guidelines for Coding FORTRAN Programs.

    DTIC Science & Technology

    1982-07-01

    [Abstract not available; the record contains only report-documentation-page and table-of-contents fragments. Recoverable headings include: 1. Introduction (Scope); 2. Coding FORTRAN Statements (General Remarks and Suggestions; FORTRAN Statements); 5. Assignment Statements (Masking Assignment; Multiple Assignment); 6. Control Statements (GO TO Statement; Unconditional GO TO Statement). Prepared by the Naval Ocean Research and Development Activity, Code 320, July 1982, under Program Element 63759N.]

  4. Radio Losses for Concatenated Codes

    NASA Astrophysics Data System (ADS)

    Shambayati, S.

    2002-07-01

    The advent of higher powered spacecraft amplifiers and better ground receivers capable of tracking spacecraft carrier signals with narrower loop bandwidths requires better understanding of the carrier tracking loss (radio loss) mechanism of the concatenated codes used for deep-space missions. In this article, we present results of simulations performed for a (7,1/2), Reed-Solomon (255,223), interleaver depth-5 concatenated code in order to shed some light on this issue. Through these simulations, we obtained the performance of this code over an additive white Gaussian noise (AWGN) channel (the baseline performance) in terms of both its frame-error rate (FER) and its bit-error rate at the output of the Reed-Solomon decoder (RS-BER). After obtaining these results, we curve fitted the baseline performance curves for FER and RS-BER and calculated the high-rate radio losses for this code for an FER of 10^(-4) and its corresponding baseline RS-BER of 2.1 x 10^(-6) for a carrier loop signal-to-noise ratio (SNR) of 14.8 dB. This calculation revealed that even though over the AWGN channel the FER value and the RS-BER value correspond to each other (i.e., these values are obtained by the same bit SNR value), the RS-BER value has higher high-rate losses than does the FER value. Furthermore, this calculation contradicted the previous assumption that at high data rates concatenated codes have the same radio losses as their constituent convolutional codes. Our results showed much higher losses for the FER and the RS-BER (by as much as 2 dB) than for the corresponding baseline BER of the convolutional code. Further simulations were performed to investigate the effects of changes in the data rate on the code's radio losses. It was observed that as the data rate increased the radio losses for both the FER and the RS-BER approached their respective calculated high-rate values. Furthermore, these simulations showed that a simple two-parameter function could model the increase in the

  5. Benchmark of the IMPACT Code for High Intensity Beam DynamicsSimulation

    SciTech Connect

    Qiang, J.; Ryne, R.D.

    2006-11-16

    The IMPACT (Integrated Map and Particle Accelerator Tracking) code was first developed under the Computational Grand Challenge project in the mid 1990s [1]. It started as a three-dimensional (3D) data parallel particle-in-cell (PIC) code written in High Performance Fortran. The code used a split-operator based method to solve the Hamiltonian equations of motion. It contained linear transfer maps for drifts, quadrupole magnets and rf cavities. The space-charge forces were calculated using an FFT-based method with 3D open boundary conditions and longitudinal periodic boundary conditions. This code was completely rewritten in the late 1990s based on a message passing parallel programming paradigm using Fortran 90 and MPI following an object-oriented software design. This improved the code's scalability on large parallel computer systems and also gave the code better software maintainability and extensibility [2]. In the following years, under the SciDAC-1 accelerator project, the code was extended to include more accelerating and focusing elements such as DTL, CCL, superconducting linac, solenoid, dipole, multipoles, and others. Besides the original split-operator based integrator, a direct integration of Lorentz equations of motion using a leap-frog algorithm was also added to the IMPACT code to handle arbitrary external nonlinear fields. This integrator can read in 3D electromagnetic fields in a Cartesian grid or in a cylindrical coordinate system. Using the Lorentz integrator, we also extended the original code to handle multiple charge-state beams. The space-charge solvers were also extended to include conducting wall effects for round and rectangular pipes with longitudinal open and periodic boundary conditions. Recently, it has also been extended to handle short-range wake fields (longitudinal monopole and transverse dipole) and longitudinal coherent synchrotron radiation wake fields. Besides the parallel macroparticle tracking code, an rf linac lattice design code
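
    As a hedged illustration of the leap-frog idea only (the IMPACT integrator additionally handles magnetic fields, space charge, multiple charge states and lattice elements), a reduced single-particle push in a prescribed electric field looks like this:

```python
# Schematic leap-frog push of one charged particle in a prescribed electric
# field E(x); greatly simplified relative to the IMPACT Lorentz integrator.
import numpy as np

q_over_m, dt = 1.0, 0.01
E = lambda x: np.array([1.0, 0.0, 0.0])   # toy uniform field

x = np.zeros(3)
v = np.array([0.0, 0.1, 0.0])
v -= 0.5 * dt * q_over_m * E(x)           # stagger velocity by half a step

for _ in range(1000):
    v += dt * q_over_m * E(x)             # velocity lives at half-integer steps
    x += dt * v                           # position lives at integer steps

print("final position:", x)               # ~[50, 1, 0] for a = 1, t = 10
```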

  6. Theory and Implementation of Nuclear Safety System Codes - Part II: System Code Closure Relations, Validation, and Limitations

    SciTech Connect

    Glenn A Roth; Fatih Aydogan

    2014-09-01

    This is Part II of two articles describing the details of thermal-hydraulic system codes. In this second part of the article series, the system code closure relationships (used to model thermal and mechanical non-equilibrium and the coupling of the phases) for the governing equations are discussed and evaluated. These include several thermal and hydraulic models, such as heat transfer coefficients for various flow regimes, two phase pressure correlations, two phase friction correlations, drag coefficients and interfacial models between the fields. These models are often developed from experimental data. The experiment conditions should be understood to evaluate the efficacy of the closure models. Code verification and validation, including Separate Effects Tests (SETs) and Integral Effects Tests (IETs), is also assessed. It can be shown from the assessments that the test cases cover a significant section of the system code capabilities, but some of the more advanced reactor designs will push the limits of validation for the codes. Lastly, the limitations of the codes are discussed by considering next generation power plants, such as Small Modular Reactors (SMRs), analyzing not only existing nuclear power plants, but also next generation nuclear power plants. The nuclear industry is developing new, innovative reactor designs, such as Small Modular Reactors (SMRs), High-Temperature Gas-cooled Reactors (HTGRs) and others. Sub-types of these reactor designs utilize pebbles, prismatic graphite moderators, helical steam generators, innovative fuel types, and many other design features that may not be fully analyzed by current system codes. This second part completes the series on the comparison and evaluation of the selected reactor system codes by discussing the closure relations, validation and limitations. These two articles indicate areas where the models can be improved to adequately address issues with new reactor design and development.

  7. Gauge Integration

    DTIC Science & Technology

    2002-09-01

    Lebesgue developed his theory of measure and integration to address the shortcomings of the Riemann integral; his integral is more powerful in terms of its convergence theorems. The gauge integral, a relatively recent construction, possesses the intuitive description of the Riemann integral with the power of the Lebesgue integral. The purpose of this...

  8. A large scale code resolution service network in the Internet of Things.

    PubMed

    Yu, Haining; Zhang, Hongli; Fang, Binxing; Yu, Xiangzhan

    2012-11-07

    In the Internet of Things a code resolution service provides a discovery mechanism for a requester to obtain the information resources associated with a particular product code immediately. In large scale application scenarios a code resolution service faces some serious issues involving heterogeneity, big data and data ownership. A code resolution service network is required to address these issues. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services and a SkipNet based code resolution service named SkipNet-OCRS, which not only inherits DHT’s advantages, but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. Analysis shows that integrating SkipNet-OCRS into our resolution service network can meet the proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS.

  9. A Large Scale Code Resolution Service Network in the Internet of Things

    PubMed Central

    Yu, Haining; Zhang, Hongli; Fang, Binxing; Yu, Xiangzhan

    2012-01-01

    In the Internet of Things a code resolution service provides a discovery mechanism for a requester to obtain the information resources associated with a particular product code immediately. In large scale application scenarios a code resolution service faces some serious issues involving heterogeneity, big data and data ownership. A code resolution service network is required to address these issues. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services and a SkipNet based code resolution service named SkipNet-OCRS, which not only inherits DHT's advantages, but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. Analysis shows that integrating SkipNet-OCRS into our resolution service network can meet the proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS. PMID:23202207

  10. Coding productivity in Sydney public hospitals.

    PubMed

    Dimitropoulos, Vera; Bennett, Adam; McIntosh, Jean

    2002-01-01

    The aims of this study were to compare Sydney public hospitals regarding medical record coding times, to compare observed coding times with coding times necessary to avoid backlog, and to evaluate the impact on coding time of casemix complexity, coder age, experience, job satisfaction, employment status, and salary. Coding time (in minutes) for each medical record over a two-week period was documented by 61 coders employed in 13 hospitals: six principal referral (PR), six major metropolitan (MM), and one paediatric specialist (PS) hospitals. The mean coding time for each coder was estimated by averaging across coding times for all records during the two-week period. In order to compare hospital mean coding times, the hospitals were grouped into PR and MM/PS groups. The mean coding time necessary to avoid coding backlog (expected coding time) for each hospital group was based on the total number of annual separations and filled full-time equivalent coding positions. The observed mean coding time was longer in the PR group than in the MM/PS group (p = 0.019); however, the observed coding time was within the expected coding time limit in both the PR and MM/PS groups. Casemix complexity tended to influence coding time, but neither age, experience, job satisfaction, employment status nor salary had any impact. In conclusion, the expected coding times, if reliable, indicate that coders in the two hospital groups were keeping coding up-to-date. Thus, the variation between hospital groups in coding time is of little importance, given that the main objective in coding productivity is to maintain the coding workload.

  11. Multiple component codes based generalized LDPC codes for high-speed optical transport.

    PubMed

    Djordjevic, Ivan B; Wang, Ting

    2014-07-14

    A class of generalized low-density parity-check (GLDPC) codes suitable for optical communications is proposed, which consists of multiple local codes. It is shown that Hamming, BCH, and Reed-Muller codes can be used as local codes, and that the maximum a posteriori probability (MAP) decoding of these local codes by Ashikhmin-Lytsin algorithm is feasible in terms of complexity and performance. We demonstrate that record coding gains can be obtained from properly designed GLDPC codes, derived from multiple component codes. We then show that several recently proposed classes of LDPC codes such as convolutional and spatially-coupled codes can be described using the concept of GLDPC coding, which indicates that the GLDPC coding can be used as a unified platform for advanced FEC enabling ultra-high speed optical transport. The proposed class of GLDPC codes is also suitable for code-rate adaption, to adjust the error correction strength depending on the optical channel conditions.

  12. NUFACE: An interface code for the calculation of nuclear responses

    SciTech Connect

    Henderson, D.L. ); Gomes, I.C. )

    1990-01-01

    The NUFACE interface code computes nuclear responses for use in the nuclear analysis of a given tokamak reactor design. The NUFACE code operates on the neutron and gamma fluxes provided by the one-dimensional neutral-particle transport code ONEDANT. Zonewise and zone-boundary responses are computed to obtain both zone-integrated values and maximum surface values. Information on each material mixture within a zone and on each element or isotope constituent of each material is computed. This feature allows for a detailed analysis of the reactor whereby one can easily identify the fractional contribution to the response of interest from each material and each element or isotope. 4 refs., 4 figs., 3 tabs.

  13. Recent SCDAP/RELAP5 code applications and improvements

    SciTech Connect

    Harvego, E.A.; Ghan, L.S.; Knudson, D.L.; Siefken, L.J.

    1998-03-01

    This paper summarizes (1) a recent application of the severe accident analysis code SCDAP/RELAP5/MOD3.1, and (2) development and assessment activities associated with the release of SCDAP/RELAP5/MOD3.2. The Nuclear Regulatory Commission (NRC) has been evaluating the integrity of steam generator tubes during severe accidents. MOD3.1 has been used to support that evaluation. Studies indicate that the pressurizer surge line will fail before any steam generator tubes are damaged. Thus, core decay energy would be released as steam through the surge line and the tube wall would be spared from exposure to prolonged flow of high temperature steam. The latest code version, MOD3.2, contains several improvements to models that address both the early phase and late phase of a severe accident. The impact of these improvements to the overall code capabilities has been assessed. Results of the assessment are summarized in this paper.

  14. An Euler code calculation of blade-vortex interaction noise

    NASA Technical Reports Server (NTRS)

    Hardin, J. C.; Lamkin, S. L.

    1987-01-01

    An Euler code has been developed for calculation of noise radiation due to the interaction of a distributed vortex with a Joukowski airfoil. The time-dependent incompressible flow field is first determined and then integrated to yield the resulting sound production through use of the elegant low-frequency Green's function approach. This code has several interesting numerical features involved in the vortex motion and in continuous satisfaction of the Kutta condition. In addition, it removes the limitations on Reynolds number and is much more efficient than an earlier Navier-Stokes code. Results indicate that the noise production is due to the deceleration and subsequent acceleration of the vortex as it approaches and passes the airfoil. Predicted acoustic levels and frequencies agree with measured data although a precise comparison would require the strength, size, and position of the incoming vortex to be known.

  15. Design of wavefront coding optical system with annular aperture

    NASA Astrophysics Data System (ADS)

    Chen, Xinhua; Zhou, Jiankang; Shen, Weimin

    2016-10-01

    Wavefront coding can extend the depth of field of a traditional optical system by inserting a phase mask into the pupil plane. In this paper, the point spread function (PSF) of a wavefront coding system with an annular aperture is analyzed. The stationary phase method and the fast Fourier transform (FFT) method are used, respectively, to compute the diffraction integral. The OTF invariance is analyzed for the annular aperture with a cubic phase mask under different obscuration ratios. Based on these analysis results, a wavefront coding system using a Maksutov-Cassegrain configuration is designed. It is an F/8.21 catadioptric system with an annular aperture, and its focal length is 821 mm. The strength of the cubic phase mask is optimized with a user-defined operand in Zemax. The Wiener filtering algorithm is used to restore the images, and the numerical simulation proves the validity of the design.
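
    To make the FFT route concrete, a schematic computation of the PSF for an annular pupil with a cubic phase mask is sketched below; the grid size, obscuration ratio and mask strength are placeholder values and are not the designed system's parameters.

```python
# Schematic FFT-based PSF of an annular pupil with a cubic phase mask.
# All parameter values are illustrative placeholders.
import numpy as np

N = 512
x = np.linspace(-1.0, 1.0, N)
X, Y = np.meshgrid(x, x)
R = np.hypot(X, Y)

eps = 0.3                                   # assumed central obscuration ratio
pupil = (R <= 1.0) & (R >= eps)             # annular aperture

alpha = 30.0                                # assumed cubic mask strength (waves)
phase = alpha * (X**3 + Y**3)               # cubic phase profile

field = pupil * np.exp(2j * np.pi * phase)  # pupil function with phase mask
psf = np.abs(np.fft.fftshift(np.fft.fft2(field)))**2
psf /= psf.sum()                            # normalized incoherent PSF
print("peak of normalized PSF:", psf.max())
```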

  16. Data transmission via erasure type channels protected by linear codes

    NASA Astrophysics Data System (ADS)

    Wacker, H. D.; Pendli, P.; Börcsök, J.

    2012-05-01

    The paper is concerned with data transmission via channels composed of a memoryless binary symmetric channel and the erasure channel of Peter Elias. Channels of this type play an important role in modelling different types of networks especially wireless networks, and have been investigated using, amongst others, the theory of Markov chains. Channel Capacities and network flows have been determined. The authors focus their interest on some aspects of coding theory. They assume the data transmission to be protected by a linear code, a CRC for example, and determine the probability of undetected error of the code. They then consider redundant transmission via two or more channels with bit inversion, and calculate the probability of undetected error. They prove some inequalities that are useful instruments to estimate the rate of transmission errors and to determine safety integrity levels according to the standards. Finally the authors apply their results to Bluetooth channels suffering from different types of noise.
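
    For the quantity at the heart of the paper, the probability of undetected error of a linear block code on a binary symmetric channel follows from the code's weight distribution as P_ud(p) = sum over i >= 1 of A_i p^i (1-p)^(n-i). The sketch below evaluates this for the (7,4) Hamming code as a stand-in, since the specific CRC polynomials considered in the paper are not reproduced here.

```python
# Undetected-error probability of a linear block code on a BSC with
# crossover probability p:  P_ud(p) = sum_{i>=1} A_i * p**i * (1-p)**(n-i).
# The (7,4) Hamming weight distribution is used as a stand-in example.

def p_undetected(weights, p):
    n = len(weights) - 1                     # block length
    return sum(a * p**i * (1 - p)**(n - i)
               for i, a in enumerate(weights) if i > 0)

hamming_7_4 = [1, 0, 0, 7, 7, 0, 0, 1]       # A_0 .. A_7
for p in (1e-3, 1e-2, 1e-1):
    print(f"p = {p:g}:  P_ud = {p_undetected(hamming_7_4, p):.3e}")
```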

  17. Software Model Checking of ARINC-653 Flight Code with MCP

    NASA Technical Reports Server (NTRS)

    Thompson, Sarah J.; Brat, Guillaume; Venet, Arnaud

    2010-01-01

    The ARINC-653 standard defines a common interface for Integrated Modular Avionics (IMA) code. In particular, ARINC-653 Part 1 specifies a process- and partition-management API that is analogous to POSIX threads, but with certain extensions and restrictions intended to support the implementation of high reliability flight code. MCP is a software model checker, developed at NASA Ames, that provides capabilities for model checking C and C++ source code. In this paper, we present recent work aimed at implementing extensions to MCP that support ARINC-653, and we discuss the challenges and opportunities that consequentially arise. Providing support for ARINC-653's time and space partitioning is nontrivial, though there are implicit benefits for partial order reduction possible as a consequence of the API's strict interprocess communication policy.

  18. SPQR: a Monte Carlo reactor kinetics code. [LMFBR

    SciTech Connect

    Cramer, S.N.; Dodds, H.L.

    1980-02-01

    The SPQR Monte Carlo code has been developed to analyze fast reactor core accident problems where conventional methods are considered inadequate. The code is based on the adiabatic approximation of the quasi-static method. This initial version contains no automatic material motion or feedback. An existing Monte Carlo code is used to calculate the shape functions and the integral quantities needed in the kinetics module. Several sample problems have been devised and analyzed. Due to the large statistical uncertainty associated with the calculation of reactivity in accident simulations, the results, especially at later times, differ greatly from deterministic methods. It was also found that in large uncoupled systems, the Monte Carlo method has difficulty in handling asymmetric perturbations.

  19. Edge-relevant plasma simulations with the continuum code COGENT

    NASA Astrophysics Data System (ADS)

    Dorf, M.; Dorr, M.; Ghosh, D.; Hittinger, J.; Rognlien, T.; Cohen, R.; Lee, W.; Schwartz, P.

    2016-10-01

    We describe recent advances in cross-separatrix and other edge-relevant plasma simulations with COGENT, a continuum gyro-kinetic code being developed by the Edge Simulation Laboratory (ESL) collaboration. The distinguishing feature of the COGENT code is its high-order finite-volume discretization methods, which employ arbitrary mapped multiblock grid technology (nearly field-aligned on blocks) to handle the complexity of tokamak divertor geometry with high accuracy. This paper discusses the 4D (axisymmetric) electrostatic version of the code, and the presented topics include: (a) initial simulations with kinetic electrons and development of reduced fluid models; (b) development and application of implicit-explicit (IMEX) time integration schemes; and (c) conservative modeling of drift-waves and the universal instability. Work performed for USDOE, at LLNL under contract DE-AC52-07NA27344 and at LBNL under contract DE-AC02-05CH11231.

  20. 49 CFR 178.702 - IBC codes.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 2 2010-10-01 2010-10-01 false IBC codes. 178.702 Section 178.702 Transportation...-Oriented Standards § 178.702 IBC codes. (a) Intermediate bulk container code designations consist of: two... the category of intermediate bulk container. (1) IBC code number designations are as follows: Type...

  1. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 4 2010-07-01 2010-07-01 true Offense codes. 635.19 Section 635.19 National... INVESTIGATIONS LAW ENFORCEMENT REPORTING Offense Reporting § 635.19 Offense codes. (a) The offense code describes, as nearly as possible, the complaint or offense by using an alphanumeric code. Appendix C of AR...

  2. 7 CFR 201.24 - Code designation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Code designation. 201.24 Section 201.24 Agriculture... REGULATIONS Labeling Agricultural Seeds § 201.24 Code designation. The code designation used in lieu of the... as may be designated by him for the purpose. When used, the code designation shall appear on...

  3. 7 CFR 201.28 - Code designation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Code designation. 201.28 Section 201.28 Agriculture... REGULATIONS Labeling Vegetable Seeds § 201.28 Code designation. The code designation used in lieu of the full... as may be designated by him for the purpose. When used, the code designation shall appear on...

  4. An Interactive Concatenated Turbo Coding System

    NASA Technical Reports Server (NTRS)

    Liu, Ye; Tang, Heng; Lin, Shu; Fossorier, Marc

    1999-01-01

    This paper presents a concatenated turbo coding system in which a Reed-Solomon outer code is concatenated with a binary turbo inner code. In the proposed system, the outer code decoder and the inner turbo code decoder interact to achieve both good bit error and frame error performances. The outer code decoder helps the inner turbo code decoder to terminate its decoding iteration while the inner turbo code decoder provides soft-output information to the outer code decoder to carry out a reliability-based soft-decision decoding. In the case that the outer code decoding fails, the outer code decoder instructs the inner code decoder to continue its decoding iterations until the outer code decoding is successful or a preset maximum number of decoding iterations is reached. This interaction between outer and inner code decoders reduces decoding delay. Also presented in the paper are an effective criterion for stopping the iteration process of the inner code decoder and a new reliability-based decoding algorithm for nonbinary codes.
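
    The outer/inner decoder interaction described above can be summarized as a short control loop. The turbo_iteration and rs_decode routines below are hypothetical placeholders standing in for the paper's soft-output turbo decoder and reliability-based Reed-Solomon decoder.

```python
# Schematic control loop for the interactive concatenated decoder described
# in the abstract; turbo_iteration() and rs_decode() are hypothetical
# placeholders, not the paper's actual algorithms.

def decode_frame(channel_llrs, turbo_iteration, rs_decode, max_iters=8):
    soft_info = None
    for it in range(1, max_iters + 1):
        soft_info = turbo_iteration(channel_llrs, soft_info)  # inner decoder pass
        ok, message = rs_decode(soft_info)                    # outer soft decode
        if ok:
            return message, it        # outer decode succeeded: stop iterating
    return None, max_iters            # declare a frame failure after max_iters

# Dummy stand-ins just to exercise the control flow:
demo_turbo = lambda llrs, prev: (prev or 0) + 1
demo_rs = lambda soft: (soft >= 3, "decoded" if soft >= 3 else None)
print(decode_frame(None, demo_turbo, demo_rs))   # -> ('decoded', 3)
```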

  5. Integrated Means Integrity

    ERIC Educational Resources Information Center

    Odegard, John D.

    1978-01-01

    Describes the operation of the Cessna Pilot Center (CPC) flight training systems. The program is based on a series of integrated activities involving stimulus, response, reinforcement and association components. Results show that the program can significantly reduce in-flight training time. (CP)

  6. Establishment and assessment of code scaling capability

    NASA Astrophysics Data System (ADS)

    Lim, Jaehyok

    In this thesis, a method for using RELAP5/MOD3.3 (Patch03) code models is described to establish and assess the code scaling capability and to corroborate the scaling methodology that has been used in the design of the Purdue University Multi-Dimensional Integral Test Assembly for ESBWR applications (PUMA-E) facility. It was sponsored by the United States Nuclear Regulatory Commission (USNRC) under the program "PUMA ESBWR Tests". PUMA-E facility was built for the USNRC to obtain data on the performance of the passive safety systems of the General Electric (GE) Nuclear Energy Economic Simplified Boiling Water Reactor (ESBWR). Similarities between the prototype plant and the scaled-down test facility were investigated for a Gravity-Driven Cooling System (GDCS) Drain Line Break (GDLB). This thesis presents the results of the GDLB test, i.e., the GDLB test with one Isolation Condenser System (ICS) unit disabled. The test is a hypothetical multi-failure small break loss of coolant (SB LOCA) accident scenario in the ESBWR. The test results indicated that the blow-down phase, Automatic Depressurization System (ADS) actuation, and GDCS injection processes occurred as expected. The GDCS as an emergency core cooling system provided adequate supply of water to keep the Reactor Pressure Vessel (RPV) coolant level well above the Top of Active Fuel (TAF) during the entire GDLB transient. The long-term cooling phase, which is governed by the Passive Containment Cooling System (PCCS) condensation, kept the reactor containment system that is composed of Drywell (DW) and Wetwell (WW) below the design pressure of 414 kPa (60 psia). In addition, the ICS continued participating in heat removal during the long-term cooling phase. A general Code Scaling, Applicability, and Uncertainty (CSAU) evaluation approach was discussed in detail relative to safety analyses of Light Water Reactor (LWR). The major components of the CSAU methodology that were highlighted particularly focused on the

  7. Fast-neutron, coded-aperture imager

    NASA Astrophysics Data System (ADS)

    Woolf, Richard S.; Phlips, Bernard F.; Hutcheson, Anthony L.; Wulf, Eric A.

    2015-06-01

    This work discusses a large-scale, coded-aperture imager for fast neutrons, building off a proof-of-concept instrument developed at the U.S. Naval Research Laboratory (NRL). The Space Science Division at the NRL has a heritage of developing large-scale, mobile systems, using coded-aperture imaging, for long-range γ-ray detection and localization. The fast-neutron, coded-aperture imaging instrument, designed for a mobile unit (20 ft. ISO container), consists of a 32-element array of 15 cm×15 cm×15 cm liquid scintillation detectors (EJ-309) mounted behind a 12×12 pseudorandom coded aperture. The elements of the aperture are composed of 15 cm×15 cm×10 cm blocks of high-density polyethylene (HDPE). The arrangement of the aperture elements produces a shadow pattern on the detector array behind the mask. By measuring the number of neutron counts per masked and unmasked detector, and with knowledge of the mask pattern, a source image can be deconvolved to obtain a 2-d location. The number of neutrons per detector was obtained by processing the fast signal from each PMT in flash digitizing electronics. Digital pulse shape discrimination (PSD) was performed to filter out the fast-neutron signal from the γ background. The prototype instrument was tested at an indoor facility at the NRL with a 1.8-μCi and 13-μCi 252Cf neutron/γ source at three standoff distances of 9, 15 and 26 m (maximum allowed in the facility) over a 15-min integration time. The imaging and detection capabilities of the instrument were tested by moving the source in half- and one-pixel increments across the image plane. We show a representative sample of the results obtained at one-pixel increments for a standoff distance of 9 m. The 1.8-μCi source was not detected at the 26-m standoff. In order to increase the sensitivity of the instrument, we reduced the fast-neutron background by shielding the top, sides and back of the detector array with 10-cm-thick HDPE. This shielding configuration led
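
    As a loose illustration of the decoding step (not the NRL instrument's actual processing chain), counts recorded behind a pseudorandom mask can be reconstructed by cyclic cross-correlation with a balanced decoding array; the mask, array size and source position below are toy values.

```python
# Schematic correlation decoding of a coded-aperture measurement.
# Mask, geometry and source position are toy values, not the NRL system's.
import numpy as np

rng = np.random.default_rng(1)
mask = rng.integers(0, 2, size=(12, 12)).astype(float)   # pseudorandom aperture
decoder = 2.0 * mask - 1.0                                # balanced decoding array

source = np.zeros((12, 12))
source[3, 7] = 100.0                                      # assumed point source

# Detector counts: cyclic convolution of source with mask, plus Poisson noise
counts = np.real(np.fft.ifft2(np.fft.fft2(source) * np.fft.fft2(mask)))
counts = rng.poisson(np.clip(counts, 0, None)).astype(float)

# Reconstruction: cyclic cross-correlation of counts with the decoding array
image = np.real(np.fft.ifft2(np.fft.fft2(counts) * np.conj(np.fft.fft2(decoder))))
print("reconstructed peak at", np.unravel_index(image.argmax(), image.shape))
```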

  8. Development of probabilistic internal dosimetry computer code

    NASA Astrophysics Data System (ADS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-02-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, we developed a probabilistic internal-dose-assessment system that employs Bayesian and Monte Carlo methods. Based on this system, we implemented a probabilistic internal-dose-assessment code in MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. In cases of
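
    A toy Monte Carlo propagation in the spirit described above is sketched below; the distribution shapes and parameter values are invented placeholders, not entries from the developed code's uncertainty database.

```python
# Toy Monte Carlo propagation of internal-dose uncertainty; every distribution
# and parameter here is an illustrative placeholder.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

measured_activity = rng.lognormal(np.log(500.0), 0.3, n)   # bioassay result (Bq)
retained_fraction = rng.lognormal(np.log(0.05), 0.4, n)    # retention at measurement time
dose_coefficient = rng.lognormal(np.log(2e-8), 0.5, n)     # committed dose per unit intake (Sv/Bq)

intake = measured_activity / retained_fraction              # back-calculated intake (Bq)
dose = intake * dose_coefficient                            # committed dose (Sv)

for q in (2.5, 5, 50, 95, 97.5):
    print(f"{q:5.1f}th percentile: {np.percentile(dose, q):.2e} Sv")
```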

  9. MARS-KS code validation activity through the atlas domestic standard problem

    SciTech Connect

    Choi, K. Y.; Kim, Y. S.; Kang, K. H.; Park, H. S.; Cho, S.

    2012-07-01

    The 2nd Domestic Standard Problem (DSP-02) exercise using the ATLAS integral effect test data was executed to transfer the integral effect test data to domestic nuclear industries and to contribute to improving the safety analysis methodology for PWRs. A small-break loss-of-coolant accident with a 6-inch break at the cold leg was selected as the target scenario, considering its technical importance and the interests of the participants. Ten calculation results using the MARS-KS code were collected; the major prediction results were described qualitatively, and code prediction accuracy was assessed quantitatively using the FFTBM. In addition, special code assessment activities were carried out to identify areas where model improvement is required in the MARS-KS code. The lessons from DSP-02 and recommendations to the code developers are described in this paper. (authors)
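
    The FFTBM mentioned here quantifies code-to-data accuracy in the frequency domain. As a rough illustration only (the record does not give the exact formulation used, and the time traces below are synthetic placeholders rather than ATLAS or MARS-KS data), one common form of the FFTBM "average amplitude" figure of merit can be computed as follows:

        # Hedged sketch of one common FFT-based method (FFTBM) accuracy metric;
        # the experimental and calculated traces are synthetic placeholders.
        import numpy as np

        t = np.linspace(0.0, 500.0, 1024)                 # s, arbitrary transient window
        experiment = 15.5e6 * np.exp(-t / 200.0)          # hypothetical measured pressure (Pa)
        calculation = 15.5e6 * np.exp(-t / 185.0)         # hypothetical code prediction (Pa)

        error = calculation - experiment
        aa = np.sum(np.abs(np.fft.rfft(error))) / np.sum(np.abs(np.fft.rfft(experiment)))
        print(f"average amplitude AA = {aa:.3f}  (smaller means better agreement)")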

  10. Convolutional coding combined with continuous phase modulation

    NASA Technical Reports Server (NTRS)

    Pizzi, S. V.; Wilson, S. G.

    1985-01-01

    Background theory and specific coding designs for combined coding/modulation schemes utilizing convolutional codes and continuous-phase modulation (CPM) are presented. In this paper the case of r = 1/2 coding onto a 4-ary CPM is emphasized, with short-constraint-length codes presented for continuous-phase FSK, double-raised-cosine, and triple-raised-cosine modulation. Coding provides several decibels of gain over the Gaussian channel, with an attendant increase in bandwidth. Performance comparisons in the power-bandwidth tradeoff with other approaches are made.
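
    As a generic illustration of the r = 1/2 outer coding discussed here (this is a textbook constraint-length-3 code with generators 7 and 5 octal, not one of the specific code/CPM pairings designed in the paper), a rate-1/2 convolutional encoder can be sketched as:

        # Illustrative rate-1/2 convolutional encoder (constraint length 3,
        # generators 7 and 5 octal); a generic textbook code for illustration.
        def conv_encode(bits, g1=0b111, g2=0b101, k=3):
            """Return the coded bit stream: two output bits per input bit."""
            state = 0
            out = []
            for b in bits:
                state = ((state << 1) | b) & ((1 << k) - 1)   # k-bit shift register
                out.append(bin(state & g1).count("1") % 2)    # parity for generator g1
                out.append(bin(state & g2).count("1") % 2)    # parity for generator g2
            return out

        print(conv_encode([1, 0, 1, 1, 0, 0]))   # 12 coded bits for 6 input bits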

  11. Multichannel error correction code decoder

    NASA Technical Reports Server (NTRS)

    Wagner, Paul K.; Ivancic, William D.

    1993-01-01

    A brief overview of a processing satellite for a mesh very-small-aperture-terminal (VSAT) communications network is provided. The multichannel error correction code (ECC) decoder system, the uplink signal generation and link simulation equipment, and the time-shared decoder are described. The testing is discussed. Applications of the time-shared decoder are recommended.

  12. Tri-Coding of Information.

    ERIC Educational Resources Information Center

    Simpson, Timothy J.

    Paivio's Dual Coding Theory has received widespread recognition for its connection between visual and aural channels of internal information processing. The use of only two channels, however, cannot satisfactorily explain the effects witnessed every day. This paper presents a study suggesting the presence of a third, kinesthetic channel, currently…

  13. Three-dimensional stellarator codes

    PubMed Central

    Garabedian, P. R.

    2002-01-01

    Three-dimensional computer codes have been used to develop quasisymmetric stellarators with modular coils that are promising candidates for a magnetic fusion reactor. The mathematics of plasma confinement raises serious questions about the numerical calculations. Convergence studies have been performed to assess the best configurations. Comparisons with recent data from large stellarator experiments serve to validate the theory. PMID:12140367

  14. Overview of CODE V development

    NASA Astrophysics Data System (ADS)

    Harris, Thomas I.

    1991-01-01

    This paper is part of a session aimed at briefly describing some of today's optical design software packages, with emphasis on each program's philosophy and technology. CODE V is the ongoing result of a development process that began in the 1960s; it is now the result of many people's efforts. This paper summarizes the roots of the program, some of its history, the dominant philosophies and technologies that have contributed to its usefulness, and some that drive its continued development. ROOTS OF CODE V: Conceived in the early 60s, at a time when there was skepticism that "automatic design" could design lenses equal to or better than "hand" methods. The concepts underlying CODE V and its predecessors were based on ten years of experience and exposure to the problems of a group of lens designers in a design-for-manufacture environment. The basic challenge was to show that lens design could be done better, easier, and faster with high-quality computer-assisted design tools. The earliest development was for our own use as an engineering services organization: an in-house tool for custom design. As a tool it had to make us efficient in providing lens design and engineering services as a self-sustaining business. PHILOSOPHY OF OPTIMIZATION IN CODE V: Error function formation. Based on experience as a designer, we felt very strongly that there should be a clear separation of

  15. Reusable State Machine Code Generator

    NASA Astrophysics Data System (ADS)

    Hoffstadt, A. A.; Reyes, C.; Sommer, H.; Andolfato, L.

    2010-12-01

    The State Machine model is frequently used to represent the behaviour of a system, allowing one to express and execute this behaviour in a deterministic way. A graphical representation such as a UML State Chart diagram tames the complexity of the system, thus facilitating changes to the model and communication between developers and domain experts. We present a reusable state machine code generator, developed by the Universidad Técnica Federico Santa María and the European Southern Observatory. The generator itself is based on the open source project architecture, and uses UML State Chart models as input. This allows for a modular design and a clean separation between generator and generated code. The generated state machine code has well-defined interfaces that are independent of implementation artefacts such as the middle-ware. This allows the generator to be used in the substantially different observatory software of the Atacama Large Millimeter Array and the ESO Very Large Telescope. A project-specific mapping layer for event and transition notification connects the state machine code to its environment, which can be the Common Software of these projects, or any other project. This approach even makes it possible to automatically create tests for a generated state machine, using software-testing techniques such as path coverage.
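
    For illustration only, the snippet below sketches the kind of table-driven state machine such a generator might emit, with a pluggable notifier standing in for the project-specific mapping layer. The names and structure are hypothetical and are not the actual output of the ESO/UTFSM generator.

        # Hypothetical sketch of generator-style state machine code with a
        # notification hook decoupling it from any particular middleware.
        class StateMachine:
            def __init__(self, transitions, initial, notifier=None):
                self.transitions = transitions          # {(state, event): next_state}
                self.state = initial
                self.notifier = notifier                # mapping-layer hook (may be None)

            def handle(self, event):
                key = (self.state, event)
                if key not in self.transitions:
                    return self.state                   # ignore events with no transition
                new_state = self.transitions[key]
                if self.notifier:
                    self.notifier(self.state, event, new_state)
                self.state = new_state
                return new_state

        # Usage: a trivial OFFLINE/ONLINE model driven by "start"/"stop" events.
        sm = StateMachine({("OFFLINE", "start"): "ONLINE", ("ONLINE", "stop"): "OFFLINE"},
                          initial="OFFLINE",
                          notifier=lambda s, e, n: print(f"{s} --{e}--> {n}"))
        sm.handle("start")
        sm.handle("stop")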

  16. Changing Postal ZIP Code Boundaries

    DTIC Science & Technology

    2006-06-23

    for distribution to a specific delivery post office, identified by the fourth and fifth digits. For example, the ZIP Code for Alturas, the county seat...distribution point for some California post offices such as Alturas, Cedarville (96104), Fort Bidwell (96112), and Likely (96116), distinguished by the

  17. Coded continuous wave meteor radar

    NASA Astrophysics Data System (ADS)

    Vierinen, Juha; Chau, Jorge L.; Pfeffer, Nico; Clahsen, Matthias; Stober, Gunter

    2016-03-01

    The concept of a coded continuous wave specular meteor radar (SMR) is described. The radar uses a continuously transmitted pseudorandom phase-modulated waveform, which has several advantages compared to conventional pulsed SMRs. The coding avoids range and Doppler aliasing, which are in some cases problematic with pulsed radars. Continuous transmissions maximize pulse compression gain, allowing operation at lower peak power than a pulsed system. With continuous coding, the temporal and spectral resolution do not depend on the transmit waveform and can be changed fairly flexibly after a measurement has been performed. The low signal-to-noise ratio before pulse compression, combined with independent pseudorandom transmit waveforms, allows multiple geographically separated transmitters to be used in the same frequency band simultaneously without significantly interfering with each other. Because the same frequency band can be shared by multiple transmitters, the same interferometric receiver antennas can be used to receive multiple transmitters at the same time. The principles of the signal processing are discussed, along with several practical ways to increase computation speed and how to optimally detect meteor echoes. Measurements from a campaign performed with a coded continuous wave SMR are shown and compared with two standard pulsed SMR measurements. The type of meteor radar described in this paper would be suited for use in a large-scale multi-static network of meteor radar transmitters and receivers. Such a system would be useful for increasing the number of meteor detections to obtain improved meteor radar data products.
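
    The pulse-compression step at the heart of this approach can be illustrated with a toy example; the code length, echo delay, echo amplitude, and noise level below are arbitrary placeholders, and real SMR processing additionally handles Doppler estimation and interferometry.

        # Illustrative pulse compression for a coded continuous wave radar: a
        # pseudorandom +/-1 phase code is transmitted and the echo delay is
        # recovered by cross-correlation. All parameters are placeholders.
        import numpy as np

        rng = np.random.default_rng(1)
        code = rng.choice([-1.0, 1.0], size=10_000)        # pseudorandom BPSK sequence

        true_delay = 237                                   # samples (stand-in for range)
        echo = 0.05 * np.roll(code, true_delay)            # weak, delayed meteor echo
        rx = echo + rng.normal(scale=1.0, size=code.size)  # SNR << 1 before compression

        # Matched filtering via circular cross-correlation with the known transmit code
        spectrum = np.fft.fft(rx) * np.conj(np.fft.fft(code))
        compressed = np.real(np.fft.ifft(spectrum))
        print("estimated delay:", int(np.argmax(np.abs(compressed))), "samples")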

  18. QR Codes: Taking Collections Further

    ERIC Educational Resources Information Center

    Ahearn, Caitlin

    2014-01-01

    With some thought and direction, QR (quick response) codes are a great tool to use in school libraries to enhance access to information. From March through April 2013, Caitlin Ahearn interned at Sanborn Regional High School (SRHS) under the supervision of Pam Harland. As a result of Harland's un-Deweying of the nonfiction collection at SRHS,…

  19. Generating Constant Weight Binary Codes

    ERIC Educational Resources Information Center

    Knight, D.G.

    2008-01-01

    The determination of bounds for A(n, d, w), the maximum possible number of binary vectors of length n, weight w, and pairwise Hamming distance no less than d, is a classic problem in coding theory. Such sets of vectors have many applications. A description is given of how the problem can be used in a first-year undergraduate computational…
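
    As a minimal sketch of the kind of first-year computation the record describes (the parameters n = 8, d = 4, w = 4 are an arbitrary small example, and a greedy scan only yields a lower bound on A(n, d, w)):

        # Greedy lower bound on A(n, d, w): scan all weight-w binary vectors of
        # length n and keep those at Hamming distance >= d from every kept vector.
        from itertools import combinations

        def greedy_constant_weight(n, d, w):
            kept = []
            for ones in combinations(range(n), w):            # all weight-w vectors
                vec = set(ones)
                # equal-weight vectors are at Hamming distance 2 * (w - overlap)
                if all(2 * (w - len(vec & other)) >= d for other in kept):
                    kept.append(vec)
            return kept

        codewords = greedy_constant_weight(8, 4, 4)
        print(f"greedy lower bound on A(8, 4, 4): {len(codewords)} codewords")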

  20. Dress Codes and Gang Activity.

    ERIC Educational Resources Information Center

    Gluckman, Ivan B.

    1996-01-01

    Concern with school violence and efforts to reduce gang visibility at school have led to controversy about students' constitutional rights to freedom of expression. This document outlines legal precedents and offers guidelines for developing a sound school policy on dress codes. It answers the following questions: (1) Are gang clothing and symbols…