Science.gov

Sample records for non-blood product codes

  1. Production code control system for hydrodynamics simulations

    SciTech Connect

    Slone, D.M.

    1997-08-18

We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent applications into the PCCS framework without modifying any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, an SMP, a server, or a desktop workstation. We also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to plug it in and make it usable to the hydrocode. PCCS has made it easier to link disparate codes together, since using Perl removes the need to learn the idiosyncrasies of system or RPC programming. Perl's text handling makes it easy to teach PCCS about new codes, or about changes to existing codes.
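The backplane idea described above can be sketched with a small dispatch table. This is a hypothetical illustration in Python (the actual PCCS is written in Perl, and the tool names here are invented):

```python
# Minimal sketch of a tool "backplane" in the spirit of PCCS: tools are
# registered by name and chained in a pipeline, so a new tool needs only
# a name and a callable to become usable alongside the hydrocode.

class Backplane:
    def __init__(self):
        self.tools = {}  # registered tools, keyed by name

    def register(self, name, func):
        """Teach the backplane about a new tool."""
        self.tools[name] = func

    def run(self, pipeline, data):
        """Pass data through a named sequence of tools."""
        for name in pipeline:
            data = self.tools[name](data)
        return data

bp = Backplane()
bp.register("preprocess",  lambda d: {**d, "mesh": "generated"})
bp.register("hydrocode",   lambda d: {**d, "result": 42})
bp.register("postprocess", lambda d: {**d, "report": f"result={d['result']}"})

out = bp.run(["preprocess", "hydrocode", "postprocess"], {"input": "deck"})
```

Each stage only sees the shared data dictionary, which is one way the framework can stay ignorant of each tool's internals.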

  2. Number of minimum-weight code words in a product code

    NASA Technical Reports Server (NTRS)

    Miller, R. L.

    1978-01-01

    Consideration is given to the number of minimum-weight code words in a product code. The code is considered as a tensor product of linear codes over a finite field. Complete theorems and proofs are presented.
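The quantity in question can be checked by brute force on a small example. The sketch below (mine, not from the paper) forms the tensor product of the binary [3,2,2] even-weight code with itself via a Kronecker product of generator matrices, then enumerates all codewords to find the minimum weight and its multiplicity:

```python
# Brute-force count of minimum-weight codewords in a small tensor
# product code: C (x) C for the binary [3,2,2] even-weight code.
import itertools
import numpy as np

G = np.array([[1, 0, 1],
              [0, 1, 1]])          # generator of the [3,2,2] parity code
Gprod = np.kron(G, G) % 2          # generator of the tensor product code

weights = []
for m in itertools.product([0, 1], repeat=Gprod.shape[0]):
    cw = np.dot(m, Gprod) % 2      # encode one 4-bit message
    w = int(cw.sum())
    if w > 0:                      # skip the zero codeword
        weights.append(w)

d_min = min(weights)               # minimum weight of the product code
n_min = weights.count(d_min)       # number of minimum-weight codewords
```

For this toy code the enumeration gives minimum weight 4 = 2 x 2 with 9 = 3 x 3 minimum-weight words, i.e. here every minimum-weight word is a tensor product of minimum-weight words from the component codes.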

  3. Distributed Turbo Product Codes with Multiple Vertical Parities

    NASA Astrophysics Data System (ADS)

    Obiedat, Esam A.; Chen, Guotai; Cao, Lei

    2009-12-01

We propose a Multiple Vertical Parities Distributed Turbo Product Code (MVP-DTPC) over a cooperative network, using block Bose-Chaudhuri-Hocquenghem (BCH) codes as component codes. The source broadcasts extended BCH coded frames to the destination and nearby relays. After decoding the received sequences, each relay constructs a product code by arranging the corrected bit sequences in rows and re-encoding them vertically, using BCH as component codes, to obtain Incremental Redundancy (IR) for the source's data. To obtain independent vertical parities from each relay in the same code space, we propose a new circular interleaver for the source's data; different circular interleavers are used to interleave the BCH rows before vertical re-encoding. Maximum A posteriori Probability (MAP) decoding is achieved by applying maximum transfer of extrinsic information between the multiple decoding stages; this is employed in the modified turbo product decoder, which is proposed to cope with multiple parities. The a posteriori output from a vertical decoding stage is used to derive soft extrinsic information, which is used as a priori input for the next horizontal decoding stage. Simulation results in an Additive White Gaussian Noise (AWGN) channel using network scenarios show a 0.3-0.5 dB gain in Bit Error Rate (BER) performance over non-cooperative Turbo Product Codes (TPC).
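The relay-side construction can be illustrated schematically. In this sketch (mine, not the authors'), a single parity bit per column stands in for the vertical BCH component code, and a per-row circular shift stands in for the proposed circular interleaver; both are gross simplifications:

```python
# Toy illustration of relays appending independent vertical parities to the
# same decoded data: each relay interleaves the rows differently before
# computing column-wise parity (even parity stands in for BCH encoding).
import numpy as np

def vertical_parity(rows, shift=0):
    """Circularly shift row i by shift*i, then append a parity row."""
    interleaved = np.stack([np.roll(row, (shift * i) % rows.shape[1])
                            for i, row in enumerate(rows)])
    parity = interleaved.sum(axis=0) % 2   # column-wise even parity
    return np.vstack([interleaved, parity])

data = np.array([[1, 0, 1, 1],
                 [0, 1, 1, 0],
                 [1, 1, 0, 0]])

# Two relays use different interleaver shifts, so their parities differ.
relay1 = vertical_parity(data, shift=0)
relay2 = vertical_parity(data, shift=1)
```

The point of the differing interleavers is visible even here: both relays satisfy their own column checks, yet they contribute distinct parity rows for the destination to combine.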

  4. The First Generation NIMROD Production Code

    NASA Astrophysics Data System (ADS)

    Sovinec, C. R.

    1997-11-01

The NIMROD code (Non-Ideal, MHD with Rotation, an Open Discussion project) is a numerical tool for studying nonlinear, three-dimensional, low-frequency plasma behavior. It is written in modular Fortran 90 to maximize geometric and mathematical flexibility. The spatial representation consists of 2D finite elements for the poloidal plane and a Fourier series for the perpendicular direction, which is assumed to be periodic (either toroidal or linear). The plane of finite elements is decomposed into an unstructured collection of blocks, each containing either structured quadrilateral elements (for efficiency) or unstructured triangular elements (for geometric flexibility). The block decomposition also permits computing on parallel machines with message-passing architectures. The present implementation is a time-split, semi-implicit algorithm based on generalized single-fluid equations that are analytically equivalent to a two-fluid system. Neoclassical effects in tokamaks are computed with an analytic form of the parallel viscous stress for the Braginskii and plateau regimes (S. P. Hirshman and D. J. Sigmar, Nucl. Fusion 21, 1079 (1981)). The code has been validated against MHD instabilities as well as MHD and whistler waves. Nonlinear benchmarking includes comparison of mode saturation for tokamak and reversed-field pinch (RFP) safety factor profiles.

  5. HIDUTYDRV Code, A Fuel Product Margin Tool

    SciTech Connect

    Krammen, Michael A.; Karoutas, Zeses E.; Grill, Steven F.; Sutharshan, Balendra

    2007-07-01

HIDUTYDRV is a computer code currently used in core design to model the best-estimate steady-state fuel rod corrosion performance for Westinghouse's CE-design 14x14 and 16x16 fuel. The fuel rod oxide thickness, sub-cooled nucleate boiling (referred to as mass evaporation or steaming), and fuel duty indices can be predicted for individual rods, or for up to every fuel rod in the quarter core, at every nuclear fuel management depletion time-step as a function of axial elevation within the core. Fuel components whose performance depends on the local power and thermal hydraulic conditions are candidates for best-estimate operating-margin analysis with HIDUTYDRV. HIDUTYDRV development will focus on fuel component parameters associated with known leakers, addressing INPO goals to eliminate fuel leakers by 2010. (authors)

  6. Benchmarking of Neutron Production of Heavy-Ion Transport Codes

    SciTech Connect

    Remec, Igor; Ronningen, Reginald M.; Heilbronn, Lawrence

    2012-01-01

    Accurate prediction of radiation fields generated by heavy ion interactions is important in medical applications, space missions, and in design and operation of rare isotope research facilities. In recent years, several well-established computer codes in widespread use for particle and radiation transport calculations have been equipped with the capability to simulate heavy ion transport and interactions. To assess and validate these capabilities, we performed simulations of a series of benchmark-quality heavy ion experiments with the computer codes FLUKA, MARS15, MCNPX, and PHITS. We focus on the comparisons of secondary neutron production. Results are encouraging; however, further improvements in models and codes and additional benchmarking are required.

  7. Phonological Codes Constrain Output of Orthographic Codes via Sublexical and Lexical Routes in Chinese Written Production

    PubMed Central

    Wang, Cheng; Zhang, Qingfang

    2015-01-01

To what extent do phonological codes constrain orthographic output in handwritten production? We investigated how phonological codes constrain the selection of orthographic codes via sublexical and lexical routes in Chinese written production. Participants wrote down picture names in a picture-naming task in Experiment 1 or response words in a symbol-word associative writing task in Experiment 2. A sublexical phonological property of picture names (phonetic regularity: regular vs. irregular) in Experiment 1 and a lexical phonological property of response words (homophone density: dense vs. sparse) in Experiment 2, as well as the word frequency of the targets in both experiments, were manipulated. A facilitatory effect of word frequency was found in both experiments, in which high-frequency words were produced faster than low-frequency ones. More importantly, we observed an inhibitory phonetic regularity effect, in which low-frequency picture names with regular first characters were slower to write than those with irregular ones, and an inhibitory homophone density effect, in which characters with dense homophone density were produced more slowly than those with sparse homophone density. The results suggest that phonological codes constrain handwritten production via both lexical and sublexical routes. PMID:25879662

  8. 50 CFR Table 1c to Part 679 - Product Type Codes

    Code of Federal Regulations, 2010 CFR

    2010-10-01

... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Product Type Codes 1c Table 1c to Part 679 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION... Table 1c to Part 679—Product Type Codes Description Code Ancillary product. A product, such as...

  9. 76 FR 4113 - Federal Procurement Data System Product Service Code Manual Update

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-24

    ... ADMINISTRATION Federal Procurement Data System Product Service Code Manual Update AGENCY: Office of... the Products and Services Code (PSC) Manual, which provides codes to describe products, services, and... maintains the PSC Manual, is in the process of updating the manual. DATES: Effective date: January 24,...

  10. Basic Business and Economics: Understanding the Uses of the Universal Product Code

    ERIC Educational Resources Information Center

    Blockhus, Wanda

    1977-01-01

    Describes the Universal Product Code (UPC), the two-part food labeling and packaging code which is both human- and electronic scanner-readable. Discusses how it affects both consumer and business, and suggests how to teach the UPC code to business education students. (HD)

  11. 78 FR 21612 - Medical Device Classification Product Codes; Guidance for Industry and Food and Drug...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-11

...The Food and Drug Administration (FDA) is announcing the availability of the guidance entitled ``Medical Device Classification Product Codes.'' This document describes how device product codes are used in a variety of FDA program areas to regulate and track medical devices regulated by the Center for Devices and Radiological Health (CDRH) and the Center for Biologics Evaluation and Research...

  12. 50 CFR Table 3b to Part 680 - Crab Disposition or Product Codes

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 50 Wildlife and Fisheries 11 2011-10-01 2011-10-01 false Crab Disposition or Product Codes 3b Table 3b to Part 680 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND... ZONE OFF ALASKA Pt. 680, Table 3b Table 3b to Part 680—Crab Disposition or Product Codes...

  13. 50 CFR Table 3b to Part 680 - Crab Disposition or Product Codes

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Crab Disposition or Product Codes 3b Table 3b to Part 680 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND... ZONE OFF ALASKA Pt. 680, Table 3b Table 3b to Part 680—Crab Disposition or Product Codes...

  14. 50 CFR Table 3b to Part 680 - Crab Disposition or Product Codes

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 50 Wildlife and Fisheries 13 2014-10-01 2014-10-01 false Crab Disposition or Product Codes 3b Table 3b to Part 680 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND... ZONE OFF ALASKA Pt. 680, Table 3b Table 3b to Part 680—Crab Disposition or Product Codes...

  15. 50 CFR Table 3b to Part 680 - Crab Disposition or Product Codes

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 50 Wildlife and Fisheries 13 2012-10-01 2012-10-01 false Crab Disposition or Product Codes 3b Table 3b to Part 680 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND... ZONE OFF ALASKA Pt. 680, Table 3b Table 3b to Part 680—Crab Disposition or Product Codes...

  16. 50 CFR Table 3b to Part 680 - Crab Disposition or Product Codes

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 50 Wildlife and Fisheries 13 2013-10-01 2013-10-01 false Crab Disposition or Product Codes 3b Table 3b to Part 680 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND... ZONE OFF ALASKA Pt. 680, Table 3b Table 3b to Part 680—Crab Disposition or Product Codes...

  17. 50 CFR Table 3c to Part 680 - Crab Product Codes for Economic Data Reports

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 50 Wildlife and Fisheries 11 2011-10-01 2011-10-01 false Crab Product Codes for Economic Data Reports 3c Table 3c to Part 680 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL... EXCLUSIVE ECONOMIC ZONE OFF ALASKA Pt. 680, Table 3c Table 3c to Part 680—Crab Product Codes for...

  18. 50 CFR Table 3c to Part 680 - Crab Product Codes for Economic Data Reports

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 50 Wildlife and Fisheries 13 2012-10-01 2012-10-01 false Crab Product Codes for Economic Data Reports 3c Table 3c to Part 680 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL... EXCLUSIVE ECONOMIC ZONE OFF ALASKA Pt. 680, Table 3c Table 3c to Part 680—Crab Product Codes for...

  19. 50 CFR Table 3c to Part 680 - Crab Product Codes for Economic Data Reports

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 50 Wildlife and Fisheries 9 2010-10-01 2010-10-01 false Crab Product Codes for Economic Data Reports 3c Table 3c to Part 680 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL... EXCLUSIVE ECONOMIC ZONE OFF ALASKA Pt. 680, Table 3c Table 3c to Part 680—Crab Product Codes for...

  20. Product code optimization for determinate state LDPC decoding in robust image transmission.

    PubMed

    Thomos, Nikolaos; Boulgouris, Nikolaos V; Strintzis, Michael G

    2006-08-01

    We propose a novel scheme for error-resilient image transmission. The proposed scheme employs a product coder consisting of low-density parity check (LDPC) codes and Reed-Solomon codes in order to deal effectively with bit errors. The efficiency of the proposed scheme is based on the exploitation of determinate symbols in Tanner graph decoding of LDPC codes and a novel product code optimization technique based on error estimation. Experimental evaluation demonstrates the superiority of the proposed system in comparison to recent state-of-the-art techniques for image transmission. PMID:16900669

  1. ACFAC: a cash flow analysis code for estimating product price from an industrial operation

    SciTech Connect

    Delene, J.G.

    1980-04-01

A computer code is presented which uses a discounted cash flow methodology to obtain an average product price for an industrial process. The general discounted cash flow method is discussed. Special code options include multiple treatments of interest during construction and other preoperational costs, investment tax credits, and different methods for tax depreciation of capital assets. Two options for allocating the cost of plant decommissioning are available. The FORTRAN code listing and the computer output for a sample problem are included.
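The core discounted-cash-flow calculation can be sketched as follows. This is a simplified illustration of the general idea (ACFAC itself is FORTRAN and additionally treats taxes, depreciation, and decommissioning, none of which are modeled here): find the constant unit price whose discounted revenues recover the discounted costs.

```python
# Levelized product price from a simple discounted cash flow:
# solve for p in  sum_t (p*output - cost) / (1+r)^t  =  capital.

def levelized_price(capital, annual_cost, annual_output, years, rate):
    """Constant unit price that makes the project's NPV zero."""
    disc = [(1 + rate) ** -t for t in range(1, years + 1)]
    pv_output = sum(annual_output * d for d in disc)   # discounted units sold
    pv_cost = sum(annual_cost * d for d in disc)       # discounted O&M costs
    return (capital + pv_cost) / pv_output

price = levelized_price(capital=1_000_000, annual_cost=50_000,
                        annual_output=10_000, years=20, rate=0.08)

# Sanity check: at this price the net present value is ~zero.
npv = -1_000_000 + sum((price * 10_000 - 50_000) * 1.08 ** -t
                       for t in range(1, 21))
```

The closed form works because price multiplies output linearly in every term, so the NPV equation can be solved directly rather than iteratively.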

  2. Efficient Bar-Code Watermark System to Protect Agricultural Product Information and Copyright

    NASA Astrophysics Data System (ADS)

    Deng, Lin; Wen, Xiaoming

In order to protect agricultural product information and copyright, this paper proposes an efficient bar-code watermark system with digital signatures. The proposed system adopts digital signatures to prevent a buyer from making unauthorized copies and to prevent a seller from forging unauthorized copies. The system encodes the signature as a bar-code and embeds the bar-code image into the original image. As long as the similarity of the watermark extracted from a damaged image exceeds a threshold, the signature can be fully recovered. Bringing the bar-code concept into a watermark system to protect agricultural product information and copyright is a novel idea. Detailed simulation results show that the proposed system achieves much better results than an error-correcting-code scheme, and demonstrate that it can protect agricultural product information and copyright effectively.

  3. Assessment of Codes and Standards Applicable to a Hydrogen Production Plant Coupled to a Nuclear Reactor

    SciTech Connect

    M. J. Russell

    2006-06-01

    This is an assessment of codes and standards applicable to a hydrogen production plant to be coupled to a nuclear reactor. The result of the assessment is a list of codes and standards that are expected to be applicable to the plant during its design and construction.

  4. Annual Coded Wire Tag Program; Washington Missing Production Groups, 1996 Annual Report.

    SciTech Connect

    Byrne, James; Fuss, Howard J.; Ashbrook, Charmane

    1997-10-01

    The intent of the funding is to coded-wire tag at least one production group of each species at each Columbia Basin hatchery to provide a holistic assessment of survival and catch distribution over time.

  5. Annual Coded Wire Tag Program; Washington Missing Production Groups, 1998 Annual Report.

    SciTech Connect

    Byrne, James; Fuss, Howard J.

    1999-10-01

    The Bonneville Power Administration (BPA) funds the ``Annual Coded Wire Tag Program--Missing Production Groups for Columbia River Hatcheries'' project. The WDFW project has three main objectives: (1) coded-wire tag at least one production group of each species at each Columbia Basin hatchery to enable evaluation of survival and catch distribution over time, (2) recover coded-wire tags from the snouts of fish tagged under objective 1 and estimate survival, contribution, and stray rates for each group, and (3) report the findings under objective 2 for all broods of chinook, and coho released from WDFW Columbia Basin hatcheries.

  6. Signature Product Code for Predicting Protein-Protein Interactions

    SciTech Connect

    Martin, Shawn B.; Brown, William M.

    2004-09-25

    The SigProdV1.0 software consists of four programs which together allow the prediction of protein-protein interactions using only amino acid sequences and experimental data. The software is based on the use of tensor products of amino acid trimers coupled with classifiers known as support vector machines. Essentially the program looks for amino acid trimer pairs which occur more frequently in protein pairs which are known to interact. These trimer pairs are then used to make predictions about unknown protein pairs. A detailed description of the method can be found in the paper: S. Martin, D. Roe, J.L. Faulon. "Predicting protein-protein interactions using signature products," Bioinformatics, available online from Advance Access, Aug. 19, 2004.
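The trimer-pair feature construction can be illustrated roughly as below. This is a simplified reading of the signature-product idea, not the SigProdV1.0 implementation; the sequences are arbitrary, and the downstream SVM classification step is omitted:

```python
# Sketch of "signature product" features: count amino acid trimers in each
# protein, then take pairwise products of the counts so that trimer *pairs*
# become features a classifier (an SVM in the paper) could weight.
from collections import Counter
from itertools import product

def trimer_counts(seq):
    """Count overlapping amino acid trimers in a sequence."""
    return Counter(seq[i:i + 3] for i in range(len(seq) - 2))

def signature_product(seq_a, seq_b):
    """Feature value for each trimer pair = product of per-protein counts."""
    ca, cb = trimer_counts(seq_a), trimer_counts(seq_b)
    return {(ta, tb): ca[ta] * cb[tb] for ta, tb in product(ca, cb)}

feats = signature_product("MKTAYIAKQR", "MKTAYI")
```

Trimer pairs that co-occur often across many known interacting protein pairs would then receive large classifier weights, which is the intuition the abstract describes.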

  7. Signature Product Code for Predicting Protein-Protein Interactions

    Energy Science and Technology Software Center (ESTSC)

    2004-09-25

The SigProdV1.0 software consists of four programs which together allow the prediction of protein-protein interactions using only amino acid sequences and experimental data. The software is based on the use of tensor products of amino acid trimers coupled with classifiers known as support vector machines. Essentially the program looks for amino acid trimer pairs which occur more frequently in protein pairs which are known to interact. These trimer pairs are then used to make predictions about unknown protein pairs. A detailed description of the method can be found in the paper: S. Martin, D. Roe, J.L. Faulon. "Predicting protein-protein interactions using signature products," Bioinformatics, available online from Advance Access, Aug. 19, 2004.

  8. Neutron Activation Analysis and Product Isotope Inventory Code System.

    Energy Science and Technology Software Center (ESTSC)

    1990-10-31

Version 00 NAC was designed to predict the neutron-induced gamma-ray radioactivity for a wide variety of composite materials. The NAC output includes the input data, a list of all reactions for each constituent element, and the end-of-irradiation disintegration rates for each reaction. NAC also compiles a product isotope inventory containing the isotope name, the disintegration rate, the gamma-ray source strength, and the absorbed dose rate at 1 meter from an unshielded point source. The induced activity is calculated as a function of irradiation and decay times; the effect of cyclic irradiation can also be calculated.
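The single-reaction bookkeeping underlying such activation calculations is the standard saturation formula A = N sigma phi (1 - exp(-lambda t_irr)) exp(-lambda t_decay). The sketch below illustrates only this formula; NAC itself handles full reaction lists, dose rates, and cyclic irradiation.

```python
# Induced activity for one activation reaction as a function of
# irradiation time and decay time (standard saturation formula).
import math

def activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr, t_decay):
    """Disintegration rate (per second) after irradiation and decay."""
    lam = math.log(2) / half_life_s
    saturation = 1.0 - math.exp(-lam * t_irr)   # build-up during irradiation
    return n_atoms * sigma_cm2 * flux * saturation * math.exp(-lam * t_decay)

# A very long irradiation drives the saturation factor toward 1, so the
# end-of-irradiation activity approaches N * sigma * phi.
a_sat = activity(1e20, 1e-24, 1e12, half_life_s=3600.0,
                 t_irr=1e9, t_decay=0.0)
```

Decaying for one further half-life halves this rate, which is the kind of irradiation/decay time dependence the abstract mentions.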

  9. Turbo product codes and their application in the fourth-generation mobile communication system

    NASA Astrophysics Data System (ADS)

    He, Yejun; Zhu, Guangxi; Liu, Ying Zhuang; Liu, Jian

    2004-04-01

In this paper, we first present turbo product codes (TPCs) for forward error correction (FEC) coding, covering the TPC encoding process and decoding principle, and compare TPCs with the turbo convolutional code (TCC) error coding solution. The performance of TPCs is shown to be closer to the Shannon limit than that of TCCs. Secondly, we introduce the application of TPCs in the 4th generation (4G) mobile communication system currently being developed in our country. The concept of the TPC-OFDM system, which promises a higher code rate than conventional OFDM, is then modified. Finally, simulation results show that the simplified 4G uplink systems offer a Bit Error Rate of nearly zero over the IMT-2000 channel at Eb/N0 > 15 dB.

  10. 50 CFR Table 1c to Part 679 - Product Type Codes

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 50 Wildlife and Fisheries 13 2013-10-01 2013-10-01 false Product Type Codes 1c Table 1c to Part 679 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE (CONTINUED) FISHERIES OF THE EXCLUSIVE ECONOMIC ZONE OFF ALASKA Pt. 679, Table 1c Table 1c to Part 679—Product...

  11. 76 FR 66235 - Bar Code Technologies for Drugs and Biological Products; Retrospective Review Under Executive...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-26

    ... Rule) that would require certain human drug product labels and biological product labels to have a...: Bar Code Label Requirements (Question 12 Update)'' (75 FR 54347 September 2010; Docket No. FDA-2010-D...'' (76 FR 3821). One of the provisions in the new Executive order is the affirmation of...

  12. Modeling Code Is Helping Cleveland Develop New Products

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Master Builders, Inc., is a 350-person company in Cleveland, Ohio, that develops and markets specialty chemicals for the construction industry. Developing new products involves creating many potential samples and running numerous tests to characterize the samples' performance. Company engineers enlisted NASA's help to replace cumbersome physical testing with computer modeling of the samples' behavior. Since the NASA Lewis Research Center's Structures Division develops mathematical models and associated computation tools to analyze the deformation and failure of composite materials, its researchers began a two-phase effort to modify Lewis' Integrated Composite Analyzer (ICAN) software for Master Builders' use. Phase I has been completed, and Master Builders is pleased with the results. The company is now working to begin implementation of Phase II.

  13. Development of a Model and Computer Code to Describe Solar Grade Silicon Production Processes

    NASA Technical Reports Server (NTRS)

    Srivastava, R.; Gould, R. K.

    1979-01-01

    Mathematical models and computer codes based on these models, which allow prediction of the product distribution in chemical reactors for converting gaseous silicon compounds to condensed-phase silicon were developed. The following tasks were accomplished: (1) formulation of a model for silicon vapor separation/collection from the developing turbulent flow stream within reactors of the Westinghouse (2) modification of an available general parabolic code to achieve solutions to the governing partial differential equations (boundary layer type) which describe migration of the vapor to the reactor walls, (3) a parametric study using the boundary layer code to optimize the performance characteristics of the Westinghouse reactor, (4) calculations relating to the collection efficiency of the new AeroChem reactor, and (5) final testing of the modified LAPP code for use as a method of predicting Si(1) droplet sizes in these reactors.

  14. Annual Coded Wire Tag Program; Washington Missing Production Groups, 2000 Annual Report.

    SciTech Connect

    Dammers, Wolf; Mills, Robin D.

    2002-02-01

    The Bonneville Power Administration (BPA) funds the ''Annual Coded-wire Tag Program - Missing Production Groups for Columbia River Hatcheries'' project. The Washington Department of Fish and Wildlife (WDFW), Oregon Department of Fish and Wildlife (ODFW) and the United States Fish and Wildlife Service (USFWS) all operate salmon and steelhead rearing programs in the Columbia River basin. The intent of the funding is to coded-wire tag at least one production group of each species at each Columbia Basin hatchery to provide a holistic assessment of survival and catch distribution over time and to meet various measures of the Northwest Power Planning Council's (NWPPC) Fish and Wildlife Program. The WDFW project has three main objectives: (1) coded-wire tag at least one production group of each species at each Columbia Basin hatchery to enable evaluation of survival and catch distribution over time, (2) recover coded-wire tags from the snouts of fish tagged under objective 1 and estimate survival, contribution, and stray rates for each group, and (3) report the findings under objective 2 for all broods of chinook, and coho released from WDFW Columbia Basin hatcheries. Objective 1 for FY-00 was met with few modifications to the original FY-00 proposal. Under Objective 2, snouts containing coded-wire tags that were recovered during FY-00 were decoded. Under Objective 3, this report summarizes available recovery information through 2000 and includes detailed information for brood years 1989 to 1994 for chinook and 1995 to 1997 for coho.

  15. Annual Coded-Wire Tag Program : Washington : Missing Production Groups Annual Report for 2000.

    SciTech Connect

    Mills, Robin D.

    2002-02-01

    The Bonneville Power Administration (BPA) funds the 'Annual Coded-wire Tag Program - Missing Production Groups for Columbia River Hatcheries' project. The Washington Department of Fish and Wildlife (WDFW), Oregon Department of Fish and Wildlife (ODFW) and the United States Fish and Wildlife Service (USFWS) all operate salmon and steelhead rearing programs in the Columbia River basin. The intent of the funding is to coded-wire tag at least one production group of each species at each Columbia Basin hatchery to provide a holistic assessment of survival and catch distribution over time and to meet various measures of the Northwest Power Planning Council's (NWPPC) Fish and Wildlife Program. The WDFW project has three main objectives: (1) coded-wire tag at least one production group of each species at each Columbia Basin hatchery to enable evaluation of survival and catch distribution over time, (2) recover coded-wire tags from the snouts of fish tagged under objective 1 and estimate survival, contribution, and stray rates for each group, and (3) report the findings under objective 2 for all broods of chinook, and coho released from WDFW Columbia Basin hatcheries. Objective 1 for FY-00 was met with few modifications to the original FY-00 proposal. Under Objective 2, snouts containing coded-wire tags that were recovered during FY-00 were decoded. Under Objective 3, this report summarizes available recovery information through 2000 and includes detailed information for brood years 1989 to 1994 for chinook and 1995 to 1997 for coho.

  16. Material report in support of RCC-MRx code 2010 stainless steel parts and products

    SciTech Connect

    Ancelet, Olivier; Lebarbe, Thierry; Dubiez-Le Goff, Sophie; Bonne, Dominique; Gelineau, Odile

    2012-07-01

This paper presents the Material Report dedicated to stainless steel parts and products issued by AFCEN (Association Francaise pour les regles de Conception et de Construction des Materiels des Chaudieres Electro-Nucleaires) in support of the RCC-MRx 2010 Code. The RCC-MRx Code is the result of the merger of the RCC-MX 2008, developed in the context of the research reactor Jules Horowitz Reactor project, into the RCC-MR 2007, which set up rules applicable to the design of components operating at high temperature and to the Vacuum Vessel of ITER (a presentation of the RCC-MRx 2010 Code is the subject of another paper proposed in this Congress, which explains in particular the status of this Code). This Material Report is part of a set of Criteria of RCC-MRx (this set of Criteria is under construction). The Criteria aim at explaining the design and construction rules of the Code. They cover analysis rules as well as part procurement, welding, methods of tests and examination, and fabrication rules. The Material Report particularly provides justifications and explanations on requirements and features dealing with parts and products proposed in the Code. The Material Report contains the following information:
    - Introduction of the grade(s): codes and standards and Reference Procurement Specifications covering parts and products, applications and experience gained
    - Physical properties
    - Mechanical properties used for design calculations (base metal and welds): basic mechanical properties, creep mechanical properties, irradiated mechanical properties
    - Fabrication: experience gained, metallurgy
    - Welding: weldability, experience gained during welding and repair procedure qualifications
    - Non-destructive examination
    - In-service behaviour
    In this article, examples of the data supplied in the Material Report dedicated to stainless steels are presented. (authors)

  17. 50 CFR Table 1a to Part 679 - Delivery Condition* and Product Codes

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 50 Wildlife and Fisheries 13 2012-10-01 2012-10-01 false Delivery Condition* and Product Codes 1a Table 1a to Part 679 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE (CONTINUED) FISHERIES OF THE EXCLUSIVE ECONOMIC ZONE...

  18. 50 CFR Table 1c to Part 679 - Product Type Codes

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 50 Wildlife and Fisheries 13 2012-10-01 2012-10-01 false Product Type Codes 1c Table 1c to Part 679 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE (CONTINUED) FISHERIES OF THE EXCLUSIVE ECONOMIC ZONE OFF ALASKA Pt....

  19. 50 CFR Table 1a to Part 679 - Delivery Condition* and Product Codes

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 50 Wildlife and Fisheries 11 2011-10-01 2011-10-01 false Delivery Condition* and Product Codes 1a Table 1a to Part 679 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE (CONTINUED) FISHERIES OF THE EXCLUSIVE ECONOMIC ZONE...

  20. 50 CFR Table 1a to Part 679 - Delivery Condition* and Product Codes

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 50 Wildlife and Fisheries 13 2013-10-01 2013-10-01 false Delivery Condition* and Product Codes 1a Table 1a to Part 679 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE (CONTINUED) FISHERIES OF THE EXCLUSIVE ECONOMIC ZONE OFF ALASKA Pt. 679, Table 1a Table 1a...

  1. 50 CFR Table 1c to Part 679 - Product Type Codes

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 50 Wildlife and Fisheries 11 2011-10-01 2011-10-01 false Product Type Codes 1c Table 1c to Part 679 Wildlife and Fisheries FISHERY CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE (CONTINUED) FISHERIES OF THE EXCLUSIVE ECONOMIC ZONE OFF ALASKA Pt....

  2. Cues, Codes, Complexity, and Confusion: Lessons from Complexity regarding Productivity and Resilience

    ERIC Educational Resources Information Center

    Lissack, Michael R.

    2007-01-01

    The very notion of productivity improvement involves measurement against a context. The success of computers and other quantitative approaches during the past half century has led to an ideational context wherein transmitters of information often assume that the content of their message is like code--ascertainable to the recipient by means of a…

  3. 21 CFR 20.115 - Product codes for manufacturing or sales dates.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Product codes for manufacturing or sales dates. 20... for manufacturing or sales dates. Data or information in Food and Drug Administration files which provide a means for deciphering or decoding a manufacturing date or sales date or use date contained...

  4. 21 CFR 20.115 - Product codes for manufacturing or sales dates.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Product codes for manufacturing or sales dates. 20... for manufacturing or sales dates. Data or information in Food and Drug Administration files which provide a means for deciphering or decoding a manufacturing date or sales date or use date contained...

  5. 21 CFR 20.115 - Product codes for manufacturing or sales dates.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 1 2011-04-01 2011-04-01 false Product codes for manufacturing or sales dates. 20... for manufacturing or sales dates. Data or information in Food and Drug Administration files which provide a means for deciphering or decoding a manufacturing date or sales date or use date contained...

  6. 21 CFR 20.115 - Product codes for manufacturing or sales dates.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 1 2014-04-01 2014-04-01 false Product codes for manufacturing or sales dates. 20... for manufacturing or sales dates. Data or information in Food and Drug Administration files which provide a means for deciphering or decoding a manufacturing date or sales date or use date contained...

  7. 21 CFR 20.115 - Product codes for manufacturing or sales dates.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Product codes for manufacturing or sales dates. 20... for manufacturing or sales dates. Data or information in Food and Drug Administration files which provide a means for deciphering or decoding a manufacturing date or sales date or use date contained...

  8. "ON ALGEBRAIC DECODING OF Q-ARY REED-MULLER AND PRODUCT REED-SOLOMON CODES"

    SciTech Connect

    SANTHI, NANDAKISHORE

    2007-01-22

    We consider a list decoding algorithm recently proposed by Pellikaan-Wu for q-ary Reed-Muller codes RM_q(ℓ, m, n) of length n ≤ q^m when ℓ ≤ q. A simple and easily accessible correctness proof is given which shows that this algorithm achieves a relative error-correction radius of τ ≤ 1 − √(ℓ q^(m−1)/n). This is an improvement over the proof using the one-point algebraic-geometric decoding method given previously. The described algorithm can be adapted to decode product Reed-Solomon codes. We then propose a new low-complexity recursive algebraic decoding algorithm for product Reed-Solomon codes and Reed-Muller codes. This algorithm achieves a relative error-correction radius of τ ≤ ∏_{i=1}^{m} (1 − √(k_i/q)). This algorithm is then proved to outperform the Pellikaan-Wu algorithm in both complexity and error-correction radius over a wide range of code rates.
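    As an illustration of the second bound above, a small numeric check of τ = ∏_{i=1}^{m} (1 − √(k_i/q)) for a product Reed-Solomon code. The code parameters here are hypothetical, chosen only to make the arithmetic easy to follow:

    ```python
    import math

    def product_rs_radius(ks, q):
        """Relative error-correction radius of a product Reed-Solomon code
        with component dimensions ks over GF(q), per the bound above."""
        tau = 1.0
        for k in ks:
            tau *= 1.0 - math.sqrt(k / q)
        return tau

    # Hypothetical example: a 2-D product of dimension-4 RS codes over GF(16).
    tau = product_rs_radius([4, 4], q=16)
    # Each factor is 1 - sqrt(4/16) = 0.5, so tau = 0.25.
    ```

    Each additional product dimension multiplies in another factor below one, so the radius shrinks as the product code grows deeper.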

  9. Production Level CFD Code Acceleration for Hybrid Many-Core Architectures

    NASA Technical Reports Server (NTRS)

    Duffy, Austen C.; Hammond, Dana P.; Nielsen, Eric J.

    2012-01-01

    In this work, a novel graphics processing unit (GPU) distributed sharing model for hybrid many-core architectures is introduced and employed in the acceleration of a production-level computational fluid dynamics (CFD) code. The latest generation of graphics hardware allows multiple processor cores to simultaneously share a single GPU through concurrent kernel execution. This feature has allowed the NASA FUN3D code to be accelerated in parallel with up to four processor cores sharing a single GPU. To scale and fully use resources on these and next-generation machines, codes will need to employ some type of GPU sharing model, as presented in this work. Findings include the effects of GPU sharing on overall performance. A discussion of the inherent challenges that parallel unstructured CFD codes face in accelerator-based computing environments is included, with considerations for future-generation architectures. This work was completed by the author in August 2010 and reflects the analysis and results of that time.

  10. Advances and future needs in particle production and transport code developments

    SciTech Connect

    Mokhov, N.V.; /Fermilab

    2009-12-01

    The next generation of accelerators and the ever-expanding needs of existing accelerators demand new developments and additions to Monte-Carlo codes, with an emphasis on enhanced modeling of elementary particle and heavy-ion interactions and transport. Challenges arise from extremely high beam energies and beam power, the increasing complexity of accelerators and experimental setups, as well as design, engineering and performance constraints. All of these place unprecedented requirements on the accuracy of particle production predictions, on the capability and reliability of the codes used in planning new accelerator facilities and experiments, on the design of machine, target and collimation systems, detectors and radiation shielding, and on the minimization of their impact on the environment. Recent advances in widely used general-purpose all-particle codes are described for the most critical modules, such as particle production event generators, elementary particle and heavy ion transport in an energy range which spans up to 17 decades, nuclide inventory and macroscopic impact on materials, and the treatment of the complex geometry of accelerator and detector structures. Future requirements for developing physics models and Monte-Carlo codes are discussed.

  11. The POPOP4 library and codes for preparing secondary gamma-ray production cross sections

    NASA Technical Reports Server (NTRS)

    Ford, W. E., III

    1972-01-01

    The POPOP4 code for converting secondary gamma ray yield data to multigroup secondary gamma ray production cross sections and the POPOP4 library of secondary gamma ray yield data are described. Recent results of the testing of uranium and iron data sets from the POPOP4 library are given. The data sets were tested by comparing calculated secondary gamma ray pulse height spectra with spectra measured at the ORNL TSR-II reactor.

  12. Development of a model and computer code to describe solar grade silicon production processes

    NASA Technical Reports Server (NTRS)

    Srivastava, R.; Gould, R. K.

    1979-01-01

    Mathematical models, and computer codes based on these models, were developed which allow prediction of the product distribution in chemical reactors in which gaseous silicon compounds are converted to condensed-phase silicon. The reactors to be modeled are flow reactors in which silane or one of the halogenated silanes is thermally decomposed or reacted with an alkali metal, H2, or H atoms. Because the product of interest is particulate silicon, the processes which must be modeled, in addition to the mixing and reaction of gas-phase reactants, include the nucleation and growth of condensed Si via coagulation, condensation, and heterogeneous reaction.

  13. FPTRAN: A Volatile Fission Products and Structural Materials Transport Code for SCDAP/RELAP5

    SciTech Connect

    Honaiser, Eduardo; Anghaie, Samim

    2004-07-01

    The behavior of fission products in reactor coolant systems (RCS) comprises the fission products' release from the fuel, their transport through the piping system, and the chemistry of the several materials present in a LWR. The transport poses significant implementation difficulty, owing to the complexity of treating the system of equations generated for the solution, as well as difficulties in modeling certain phenomena. This paper presents the FPTRAN code, which was incorporated into SCDAP/RELAP5 and satisfactorily passed initial testing. Given a source of released material (fission products and structural materials) to the piping system, FPTRAN calculates the transport of fission products in the RCS, estimating the amount of material deposited on the pipes and the amount released to the containment. (authors)

  14. Development of a Model and Computer Code to Describe Solar Grade Silicon Production Processes

    NASA Technical Reports Server (NTRS)

    Srivastava, R.; Gould, R. K.

    1979-01-01

    The program aims at developing mathematical models, and computer codes based on these models, which allow prediction of the product distribution in chemical reactors for converting gaseous silicon compounds to condensed-phase silicon. The major interest is in collecting silicon as a liquid on the reactor walls and other collection surfaces. Two reactor systems are of major interest: a SiCl4/Na reactor in which Si(l) is collected on the flow-tube reactor walls, and a reactor in which Si(l) droplets formed by the SiCl4/Na reaction are collected by a jet impingement method. During this quarter the following tasks were accomplished: (1) particle deposition routines were added to the boundary layer code; and (2) Si droplet sizes in SiCl4/Na reactors at temperatures below the dew point of Si are being calculated.

  15. OSCAR-Na: A New Code for Simulating Corrosion Product Contamination in SFR

    NASA Astrophysics Data System (ADS)

    Génin, J.-B.; Brissonneau, L.; Gilardi, T.

    2016-07-01

    A code named OSCAR-Na has been developed to calculate the mass transfer of corrosion products in the primary circuit of sodium fast reactors (SFR). It is based on a solution/precipitation model, including diffusion in the steel (enhanced under irradiation), diffusion through the sodium boundary layer, the equilibrium concentration of each element, and the velocity of the interface (bulk corrosion or deposition). The code uses a numerical method for solving the diffusion equation in the steel and the complete mass balance in sodium for all elements. Corrosion and deposition rates are mainly determined by the iron equilibrium concentration in sodium and its oxygen-enhanced dissolution rate. All parameters of the model were assessed from a literature review, but the iron solubility had to be adjusted. A simplified primary system description of the French PHENIX SFR reproduced the correct amounts and profiles of contamination on heat-exchanger surfaces for the main radionuclides.
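    The core bookkeeping the abstract describes — per-region dissolution or deposition driven by the gap between the local equilibrium concentration and the bulk sodium concentration, closed by a mass balance on the sodium inventory — can be sketched as follows. All region data, coefficients, and units below are invented placeholders for illustration, not OSCAR-Na values, and the geometry is deliberately reduced to three lumped regions:

    ```python
    # Illustrative circuit regions (assumed data):
    # (name, wetted area m^2, boundary-layer mass-transfer coeff. m/s, local equilibrium conc.)
    regions = [
        ("core",           100.0, 2e-6, 8.0),   # hot: high solubility -> dissolution
        ("hot leg",         50.0, 1e-6, 6.0),
        ("heat exchanger", 200.0, 1e-6, 2.0),   # cold: low solubility -> deposition
    ]
    c_bulk = 4.0      # bulk iron concentration in sodium (arbitrary units)
    V = 1000.0        # sodium inventory volume (m^3)
    dt = 3600.0       # time step (s)

    deposit = {name: 0.0 for name, *_ in regions}
    for _ in range(24):                        # march one day in hourly steps
        net = 0.0
        for name, area, k, c_eq in regions:
            flux = k * (c_eq - c_bulk)         # >0 corrosion releases mass, <0 deposition
            net += flux * area
            if flux < 0.0:
                deposit[name] += -flux * area * dt
        c_bulk += net * dt / V                 # complete mass balance on the sodium
    ```

    Even in this toy form the qualitative behavior matches the abstract: mass dissolves in the hot regions where solubility is high and accumulates on the cold heat-exchanger surfaces.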

  16. Syntactic Alignment and Shared Word Order in Code-Switched Sentence Production: Evidence from Bilingual Monologue and Dialogue

    ERIC Educational Resources Information Center

    Kootstra, Gerrit Jan; van Hell, Janet G.; Dijkstra, Ton

    2010-01-01

    In four experiments, we investigated the role of shared word order and alignment with a dialogue partner in the production of code-switched sentences. In Experiments 1 and 2, Dutch-English bilinguals code-switched in describing pictures while being cued with word orders that are either shared or not shared between Dutch and English. In Experiments…

  17. An optimal unequal error protection scheme with turbo product codes for wavelet compression of ultraspectral sounder data

    NASA Astrophysics Data System (ADS)

    Huang, Bormin; Sriraja, Y.; Ahuja, Alok; Goldberg, Mitchell D.

    2006-08-01

    Most source coding techniques generate bitstreams in which different regions have unequal influence on data reconstruction. An uncorrected error in a more influential region can cause more error propagation in the reconstructed data. Given a limited bandwidth, unequal error protection (UEP) via channel coding, with different code rates for different regions of the bitstream, may yield much less error contamination than equal error protection (EEP). We propose an optimal UEP scheme that minimizes error contamination after channel and source decoding. We use JPEG2000 for source coding and a turbo product code (TPC) for channel coding as an example to demonstrate this technique with ultraspectral sounder data. Wavelet compression yields unequal significance in different wavelet resolutions. In the proposed UEP scheme, the statistics of erroneous pixels after TPC and JPEG2000 decoding are used to determine the optimal channel code rate for each wavelet resolution. The proposed UEP scheme significantly reduces the number of pixel errors when compared to its EEP counterpart. In practice, with a predefined set of implementation parameters (available channel codes, desired code rate, noise level, etc.), the optimal code rate allocation for UEP needs to be determined only once and can be done offline.
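    The offline allocation step can be sketched as a brute-force search over a discrete set of candidate code rates, subject to the overall rate budget. The per-resolution error statistics below are invented placeholders standing in for the measured post-decoding pixel-error counts the paper uses; they are not the paper's data:

    ```python
    import itertools

    # Hypothetical statistics: err[rate][i] = expected pixel errors in wavelet
    # resolution i when that resolution is protected by a TPC of the given rate.
    # Resolution 0 (coarsest) propagates errors the most, so it pays to protect it.
    err = {
        0.4: [1, 1, 1],      # strongest protection (most redundancy)
        0.6: [25, 6, 2],
        0.8: [150, 30, 8],   # weakest protection
    }
    rates = sorted(err)      # candidate code rates
    target = 0.6             # overall rate budget (the EEP baseline rate)

    best = None
    for combo in itertools.product(rates, repeat=3):
        if sum(combo) / 3 < target - 1e-9:   # too much redundancy for the budget
            continue
        total = sum(err[r][i] for i, r in enumerate(combo))
        if best is None or total < best[0]:
            best = (total, combo)

    eep_errors = sum(err[target])            # equal error protection baseline
    ```

    With these placeholder numbers the search assigns the strongest protection to the most influential (coarsest) resolution and the weakest to the finest, and the resulting UEP error count comes in below the EEP baseline, mirroring the paper's qualitative conclusion.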

  18. 76 FR 53912 - FDA's Public Database of Products With Orphan-Drug Designation: Replacing Non-Informative Code...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-30

    ... HUMAN SERVICES Food and Drug Administration FDA's Public Database of Products With Orphan-Drug... its public database of products that have received orphan-drug designation. The Orphan Drug Act... received orphan designation were published on our public database with non-informative code names....

  19. Global Transcriptional Dynamics of Diapause Induction in Non-Blood-Fed and Blood-Fed Aedes albopictus

    PubMed Central

    Huang, Xin; Poelchau, Monica F.; Armbruster, Peter A.

    2015-01-01

    Background Aedes albopictus is a vector of increasing public health concern due to its rapid global range expansion and ability to transmit Dengue virus, Chikungunya virus and a wide range of additional arboviruses. Traditional vector control strategies have been largely ineffective against Ae. albopictus and novel approaches are urgently needed. Photoperiodic diapause is a crucial ecological adaptation in a wide range of temperate insects. Therefore, targeting the molecular regulation of photoperiodic diapause or diapause-associated physiological processes could provide the basis of novel approaches to vector control. Methodology/Principal Findings We investigated the global transcriptional profiles of diapause induction in Ae. albopictus by performing paired-end RNA-Seq of biologically replicated libraries. We sequenced RNA from whole bodies of adult females reared under diapause-inducing and non-diapause-inducing photoperiods either with or without a blood meal. We constructed a comprehensive transcriptome assembly that incorporated previous assemblies and represents over 14,000 annotated dipteran gene models. Mapping of sequence reads to the transcriptome identified differential expression of 2,251 genes in response to diapause-inducing short-day photoperiods. In non-blood-fed females, potential regulatory elements of diapause induction were transcriptionally up-regulated, including two of the canonical circadian clock genes, timeless and cryptochrome 1. In blood-fed females, genes in metabolic pathways related to energy production and offspring provisioning were differentially expressed under diapause-inducing conditions, including the oxidative phosphorylation pathway and lipid metabolism genes. Conclusions/Significance This study is the first to utilize powerful RNA-Seq technologies to elucidate the transcriptional basis of diapause induction in any insect. 
We identified candidate genes and pathways regulating diapause induction, including a conserved set of

  20. Numerical simulation of X-ray fluorescence production using MCNPX code

    NASA Astrophysics Data System (ADS)

    Kim, Kyeong Ja; Park, Junghun

    Numerical simulation of the production of X-ray fluorescence by an active X-ray spectrometer was accomplished with the MCNPX (Monte Carlo N-Particle eXtended) code. The purpose of this study is to cross-check the simulation result against the actual measurement, to validate the numerical simulation for prospective use in the various measurement cases that are not easily accessible in a laboratory environment. This study was initiated as part of the Phase A study for the SELENE-2 science payload proposed in 2011. The active X-ray spectrometer includes an X-ray spectrometer and an X-ray generator based on a pyroelectric crystal. For the Phase A study, we used both the XRS and the XRG available from the commercial company Amptek Inc. Numerical simulation is important for optimizing both the instrument design and the measurement geometry so as to obtain the best output from a planned experiment. The main purpose of this study is to understand the production of X-ray fluorescence by an active X-ray spectrometer that could be carried on future planetary spacecraft. For the numerical simulation, we used the lunar simulant FSJ-1 composition, and the input parameters for X-ray flux and energy distribution were taken from the specifications of the X-ray generator, Cool-X. The geometry parameters were set to match the experimental setting used for the actual measurement. It was found that the spectrum from the numerical simulation compares well with the actual measurement in the laboratory setting with respect to the number of elements, peak counts, and energy spectrum. To find the optimal distance and geometry settings for the production of X-ray fluorescence, multiple simulations at various geometry settings are currently under investigation.

  1. Knowledge, Attitudes and Perceptions Among Non-Blood Donor Female Health Care Professionals

    PubMed Central

    Bilal, Muhammad; Haseeb, Abdul; Zahid, Ibrahim; Lashkerwala, Sehan Siraj; Saeeduddin, Fawad; Saad, Muhammad; Arshad, Mohammad Hussham; Moorpani, Manpreet; Khan, Midhat Zafar; Tariq, Ahsan; Habib, Haya; Islam, Tehrema; Advani, Rohan

    2016-01-01

    Introduction: Blood donation is necessary in order to maintain an adequate supply of blood to patients who are suffering from any kind of disease or trauma which requires them to have a blood transfusion. The number of female blood donors is generally low. Therefore, this research was carried out to assess the main reasons behind the lack of blood donations made by females, and their knowledge, attitudes and perceptions towards voluntary blood donation. Methodology: A cross-sectional study was conducted on 664 female health professionals, who were selected by non-probability convenience sampling from two tertiary care hospitals. A pretested questionnaire was presented to the sample population, and the data were entered and analyzed in SPSS (v17). Results: 94.6% were aware that blood is screened for AIDS and Hepatitis B and C before transfusion. Moreover, 83.7% said that they would only donate blood if a family member, relative or friend needed it, and similarly 83.4% suggested that they would donate blood if blood donation camps were arranged in hospital premises. 81.8% thought that blood donors can contract Hepatitis B after donation, whereas only 29.5% did not donate blood because of blood loss during the menstrual cycle. Conclusion: The participants had adequate knowledge about the benefits of blood donation. The most important reason identified for not donating blood is the lack of facilities within the workplace or the lack of approach by responsible authorities. The results of the study may help in minimizing the misconceptions of the participants about blood transfusion, which would increase their contribution towards blood donation. PMID:26573048

  2. Application to MISR Land Products of an RPV Model Inversion Package Using Adjoint and Hessian Codes

    NASA Astrophysics Data System (ADS)

    Lavergne, T.; Kaminski, T.; Pinty, B.; Taberner, M.; Gobron, N.; Verstraete, M. M.; Vossbeck, M.; Widlowski, J.-L.; Giering, R.

    The capability of the non-linear Rahman-Pinty-Verstraete (RPV) model to (1) accurately fit a large variety of Bidirectional Reflectance Factor (BRF) fields and (2) return parameter values of interest for land-surface applications motivates the development of a computer-efficient inversion package. The present paper describes such a package, based on the 3- and 4-parameter versions of the RPV model. This software environment implements the adjoint code, generated using automatic differentiation techniques, of the cost function. This cost function itself balances two main contributions reflecting (1) the a priori knowledge of the model parameter values and (2) the BRF uncertainties, together with the requirement to minimize the mismatch between the measurements and the RPV simulations. The individual weights of these contributions are specified notably via covariance matrices of the uncertainties in the a priori knowledge of the model parameters and in the observations. This package also reports the probability density functions of the retrieved model parameter values, which permit the user to evaluate the a posteriori uncertainties of these retrievals. This is achieved by evaluating the Hessian of the cost function at its minimum. Results from a variety of tests are shown in order to document and analyze software performance against complex synthetic BRF fields simulated by radiation transfer models as well as against actual MISR-derived surface BRF products.
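    The cost function described above — a prior term and an observation-mismatch term, each weighted by an inverse covariance, with the Hessian at the minimum supplying posterior uncertainties — can be sketched with a toy forward model. Everything here (the two-parameter model, the covariances, the synthetic observations) is an illustrative stand-in, not the RPV model or the package's numerics:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy 2-parameter forward model standing in for the RPV model (hypothetical).
    def model(p):
        return np.array([p[0] + p[1], p[0] * p[1], p[0] - p[1]])

    y_obs = model(np.array([1.5, 0.5]))   # synthetic "BRF" observations
    R_inv = np.eye(3)                     # inverse observation-error covariance
    x_prior = np.array([1.0, 1.0])        # a priori parameter values
    B_inv = 0.1 * np.eye(2)               # inverse prior covariance (weak prior)

    def cost(p):
        r = model(p) - y_obs              # measurement-mismatch contribution
        d = p - x_prior                   # a priori knowledge contribution
        return 0.5 * r @ R_inv @ r + 0.5 * d @ B_inv @ d

    res = minimize(cost, x_prior, method="BFGS")
    # res.x is the retrieval; res.hess_inv approximates the posterior covariance,
    # mirroring the package's evaluation of the Hessian at the cost minimum.
    ```

    The weak prior pulls the retrieval slightly toward x_prior, which is exactly the balance between the two contributions that the covariance weights control.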

  3. Annual Coded Wire Tag Program; Oregon Missing Production Groups, 1998 Annual Report.

    SciTech Connect

    Lewis, Mark A.; Mallette, Christine; Murray, William M.

    1999-03-01

    This annual report is in fulfillment of contract obligations with Bonneville Power Administration which is the funding source for the Oregon Department of Fish and Wildlife's Annual Coded Wire Tag Program - Oregon Missing Production Groups Project. Tule stock fall chinook were caught primarily in British Columbia and Washington ocean, and Columbia Basin fisheries. Up-river bright stock fall chinook contributed primarily to Alaska and British Columbia ocean commercial, Columbia Basin gillnet and other freshwater fisheries. Contribution of Rogue stock fall chinook released in the lower Columbia River occurred primarily in Oregon ocean commercial, Columbia Basin gillnet and other freshwater fisheries. Willamette stock spring chinook contributed primarily to Alaska and British Columbia ocean, and Columbia Basin fisheries. Willamette stock spring chinook released by CEDC contributed to similar ocean fisheries, but had much higher catch in Columbia Basin gillnet fisheries than the same stocks released in the Willamette Basin. Up-river stocks of spring chinook contributed almost exclusively to Columbia Basin fisheries. The up-river stocks of Columbia River summer steelhead contributed almost exclusively to the Columbia Basin gillnet and other freshwater fisheries. Coho ocean fisheries from Washington to California were closed or very limited from 1994 through 1998 (1991 through 1995 broods). This has resulted in a lower percent of catch in Washington, Oregon and California ocean fisheries, and a higher percent of catch in Alaska and British Columbia ocean and Columbia Basin freshwater fisheries. Coho stocks released by ODFW below Bonneville Dam were caught mainly in Oregon and Washington ocean, Columbia Gillnet and other freshwater fisheries. Coho stocks released in the Klaskanine River and Youngs Bay area had similar ocean catch distributions, but a much higher percent catch in gillnet fisheries than the other coho releases. Ocean catch distribution of coho stocks

  4. Annual Coded Wire Tag Program; Oregon Missing Production Groups, 1997 Annual Report.

    SciTech Connect

    Lewis, Mark A.; Mallette, Christine; Murray, William M.

    1998-03-01

    This annual report is in fulfillment of contract obligations with Bonneville Power Administration which is the funding source for the Oregon Department of Fish and Wildlife's Annual Coded Wire Tag Program - Oregon Missing Production Groups Project. Tule stock fall chinook were caught primarily in British Columbia and Washington ocean, and Oregon freshwater fisheries. Up-river bright stock fall chinook contributed primarily to Alaska and British Columbia ocean commercial, and Columbia River gillnet and other freshwater fisheries. Contribution of Rogue stock fall chinook released in the lower Columbia River occurred primarily in Oregon ocean commercial and Columbia river gillnet fisheries. Willamette stock spring chinook contributed primarily to Alaska and British Columbia ocean commercial, Oregon freshwater sport and Columbia River gillnet fisheries. Willamette stock spring chinook released by CEDC contributed to similar ocean fisheries, but had much higher catch in gillnet fisheries than the same stocks released in the Willamette system. Up-river stocks of spring chinook contributed almost exclusively to Columbia River sport fisheries and other freshwater recovery areas. The up-river stocks of Columbia River summer steelhead contributed primarily to the Columbia River gillnet and other freshwater fisheries. Coho ocean fisheries from Washington to California were closed or very limited from 1994 through 1997 (1991 through 1994 broods). This has resulted in a greater average percent of catch for other fishery areas. Coho stocks released by ODFW below Bonneville Dam contributed mainly to Oregon and Washington ocean, Columbia Gillnet and other freshwater fisheries. Coho stocks released in the Klaskanine River and Youngs Bay area had similar ocean catch, but much higher contribution to gillnet fisheries than the other coho releases. Coho stocks released above Bonneville Dam had similar contribution to ocean fisheries as other coho releases. However, they contributed

  5. Inter-language interference in VOT production by L2-dominant bilinguals: Asymmetries in phonetic code-switching

    PubMed Central

    Antoniou, Mark; Best, Catherine T.; Tyler, Michael D.; Kroos, Christian

    2011-01-01

    Speech production research has demonstrated that the first language (L1) often interferes with production in bilinguals’ second language (L2), but it has been suggested that bilinguals who are L2-dominant are the most likely to suppress this L1-interference. While prolonged contextual changes in bilinguals’ language use (e.g., stays overseas) are known to result in L1 and L2 phonetic shifts, code-switching provides the unique opportunity of observing the immediate phonetic effects of L1-L2 interaction. We measured the voice onset times (VOTs) of Greek–English bilinguals’ productions of /b, d, p, t/ in initial and medial contexts, first in either a Greek or English unilingual mode, and in a later session when they produced the same target pseudowords as a code-switch from the opposing language. Compared to a unilingual mode, all English stops produced as code-switches from Greek, regardless of context, had more Greek-like VOTs. In contrast, Greek stops showed no shift toward English VOTs, with the exception of medial voiced stops. Under the specifically interlanguage condition of code-switching we have demonstrated a pervasive influence of the L1 even in L2-dominant individuals. PMID:22787285

  6. Development of a model and computer code to describe solar grade silicon production processes

    NASA Technical Reports Server (NTRS)

    Gould, R. K.; Srivastava, R.

    1979-01-01

    Two computer codes were developed for describing flow reactors in which high-purity, solar-grade silicon is produced via reduction of gaseous silicon halides. The first is the CHEMPART code, an axisymmetric marching code which treats two-phase flows with models describing detailed gas-phase chemical kinetics, particle formation, and particle growth. It can be used to describe flow reactors in which reactants mix, react, and form a particulate phase. Detailed radial gas-phase composition, temperature, velocity, and particle size distribution profiles are computed. Also, deposition of heat, momentum, and mass (either particulate or vapor) on reactor walls is described. The second code is a modified version of the GENMIX boundary layer code, which is used to compute rates of heat, momentum, and mass transfer to the reactor walls. This code lacks the detailed chemical kinetics and particle handling features of the CHEMPART code but has the virtue of running much more rapidly than CHEMPART, while treating the phenomena occurring in the boundary layer in more detail.

  7. Annual Coded Wire Tag Program; Oregon Missing Production Groups, 1995 Annual Report.

    SciTech Connect

    Garrison, Robert L.; Mallette, Christine; Lewis, Mark A.

    1995-12-01

    Bonneville Power Administration is the funding source for the Oregon Department of Fish and Wildlife's Annual Coded Wire Tag Program - Oregon Missing Production Groups Project. Tule brood fall chinook were caught primarily in the British Columbia, Washington and northern Oregon ocean commercial fisheries. The up-river bright fall chinook contributed primarily to the Alaska and British Columbia ocean commercial fisheries and the Columbia River gillnet fishery. Contribution of Rogue fall chinook released in the lower Columbia River system occurred primarily in the Oregon ocean commercial and Columbia River gillnet fisheries. Willamette spring chinook salmon contributed primarily to the Alaska and British Columbia ocean commercial, Oregon freshwater sport and Columbia River gillnet fisheries. Restricted ocean sport and commercial fisheries limited the contribution of the Columbia coho released in the Umatilla River, which survived at an average rate of 1.05% and contributed primarily to the Washington, Oregon and California ocean sport and commercial fisheries and the Columbia River gillnet fishery. The 1987 to 1991 brood years of coho released in the Yakima River survived at an average rate of 0.64% and contributed primarily to the Washington, Oregon and California ocean sport and commercial fisheries and the Columbia River gillnet fishery. Survival rates of salmon and steelhead are influenced not only by factors in the hatchery (disease, density, diet, and size and time of release) but also by environmental factors in the river and ocean. These environmental factors are controlled by large-scale weather patterns, such as El Niño, over which man has no influence. Man could have some influence over river flow conditions, but political and economic pressures generally outweigh the biological needs of the fish.

  8. FITPULS: a code for obtaining analytic fits to aggregate fission-product decay-energy spectra. [In FORTRAN

    SciTech Connect

    LaBauve, R.J.; George, D.C.; England, T.R.

    1980-03-01

    The operation and input to the FITPULS code, recently updated to utilize interactive graphics, are described. The code is designed to retrieve data from a library containing aggregate fine-group spectra (150 energy groups) from fission products, collapse the data to few groups (up to 25), and fit the resulting spectra along the cooling-time axis with a linear combination of exponential functions. Also given in this report are useful results for aggregate gamma and beta spectra from the decay of fission products released from ²³⁵U irradiated with a pulse (10⁻⁴ s irradiation time) of thermal neutrons. These fits are given in 22 energy groups that are the first 22 groups of the LASL 25-group decay-energy group structure, and the data are expressed both as MeV per fission second and particles per fission second; these pulse functions are readily folded into finite fission histories. 65 figures, 11 tables.
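    The fitting step — representing a decay curve along the cooling-time axis as a linear combination of exponentials — can be sketched with a standard least-squares fit. The decay constants, amplitudes, and noise level below are invented for illustration and are not FITPULS library data:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Synthetic aggregate decay-energy curve vs. cooling time (assumed constants).
    t = np.linspace(0.0, 10.0, 200)
    true = 3.0 * np.exp(-0.5 * t) + 1.0 * np.exp(-0.05 * t)
    rng = np.random.default_rng(0)
    data = true * (1.0 + 0.01 * rng.standard_normal(t.size))

    # Linear combination of exponentials: the functional form fitted
    # along the cooling-time axis.
    def two_exp(t, a1, l1, a2, l2):
        return a1 * np.exp(-l1 * t) + a2 * np.exp(-l2 * t)

    popt, pcov = curve_fit(two_exp, t, data, p0=[2.0, 1.0, 1.0, 0.1])
    fitted = two_exp(t, *popt)
    ```

    Once the amplitudes and decay constants are in hand, folding the pulse response into a finite irradiation history reduces to convolving this analytic form with the fission-rate history, which is the use the report describes for the fitted functions.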

  9. Production version of the extended NASA-Langley vortex lattice FORTRAN computer program. Volume 2: Source code

    NASA Technical Reports Server (NTRS)

    Herbert, H. E.; Lamar, J. E.

    1982-01-01

    The source code for the latest production version, MARK IV, of the NASA-Langley Vortex Lattice Computer Program is presented. All viable subcritical aerodynamic features of previous versions were retained. This version extends the previously documented program capabilities to four planforms, 400 panels, and enables the user to obtain vortex-flow aerodynamics on cambered planforms, flowfield properties off the configuration in attached flow, and planform longitudinal load distributions.

  10. SPARC-90: A code for calculating fission product capture in suppression pools

    SciTech Connect

    Owczarski, P.C.; Burk, K.W.

    1991-10-01

    This report describes the technical bases and use of two updated versions, SPARC-87 and SPARC-90, of a computer code initially developed as a tool for calculating aerosol particle retention in boiling water reactor (BWR) pressure suppression pools during severe accidents. The most recent version is SPARC-90. The initial or prototype version (Owczarski, Postma, and Schreck 1985) was improved to include the following: rigorous treatment of local particle deposition velocities on the surface of oblate spherical bubbles, new correlations for the hydrodynamic behavior of bubble swarms, models for aerosol particle growth, both mechanistic and empirical models for vent exit region scrubbing, specific models for the hydrodynamics of bubble breakup at various vent types, and models for capture of vapor iodine species. A complete user's guide is provided for SPARC-90 (along with SPARC-87). A code description, code operating instructions, a partial code listing, examples of the use of SPARC-90, and summaries of experimental data comparison studies also support the use of SPARC-90. 29 refs., 4 figs., 11 tabs.

  11. A Multigroup Reaction Cross-Section Collapsing Code and Library of 154-Group Fission-Product Cross Sections.

    Energy Science and Technology Software Center (ESTSC)

    1983-03-23

    Version 01/02. The code reads multigroup cross sections from a compatible data file and collapses user-selected reaction cross sections to any few-group structure using one of a variety of user neutron flux spectrum options:
    Option 1: built-in function including Maxwellian, fission, fusion and slowing-down regions, requiring user-specified parameters and energy-region boundaries.
    Option 2: set of log-log flux-energy interpolation points read from the input cross-section data file.
    Option 3: set of log-log flux-energy interpolation points read from user-supplied card input.
    Options 4-6: histogram flux values read from user-supplied card input in an arbitrary group structure, in units of flux per unit energy, flux per unit lethargy, or integral group flux.
    LAFPX-E may be used to collapse any set of multigroup reaction cross sections furnished in the required format. However, the code was developed for, and is furnished with, a library of 154-group fission-product cross sections processed from ENDF/B-IV with a typical light water reactor (LWR) flux spectrum and temperature. Four-group radiative capture cross sections produced for LWR calculations are tabulated in the code documentation and are incorporated in the EPRI-CINDER data library, RSIC Code Package CCC-309.
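    The few-group collapse such a code performs is, at its core, a flux-weighted average of the fine-group cross sections over each coarse group. A toy sketch of that weighting (the function name and data are invented for illustration):

```python
import numpy as np

def collapse(sigma_fine, flux_fine, group_bounds):
    """Flux-weighted collapse of fine-group cross sections to few groups.

    group_bounds gives the fine-group indices delimiting each coarse group,
    e.g. [0, 50, 100, 154] maps 154 fine groups onto 3 coarse groups.
    Each coarse value is sum(phi_g * sigma_g) / sum(phi_g) over its groups.
    """
    coarse = []
    for lo, hi in zip(group_bounds[:-1], group_bounds[1:]):
        phi = flux_fine[lo:hi]
        coarse.append(np.sum(phi * sigma_fine[lo:hi]) / np.sum(phi))
    return np.array(coarse)

# Toy data: 6 fine groups collapsed to 2 coarse groups.
sigma = np.array([1.0, 1.0, 1.0, 4.0, 4.0, 4.0])
flux = np.array([2.0, 1.0, 1.0, 1.0, 1.0, 2.0])
few = collapse(sigma, flux, [0, 3, 6])
```

The weighting preserves reaction rates: a constant cross section within a coarse group collapses to itself regardless of the flux shape.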

  12. Annual Coded Wire Tag Program; Oregon Missing Production Groups, 1999 Annual Report.

    SciTech Connect

    Lewis, Mark A.; Mallette, Christine; Murray, William M.

    2000-03-01

    This annual report is in fulfillment of contract obligations with Bonneville Power Administration which is the funding source for the Oregon Department of Fish and Wildlife's Annual Coded Wire Tag Program - Oregon Missing Production Groups Project. Tule stock fall chinook were caught primarily in British Columbia and Washington ocean, and Columbia Basin fisheries. Up-river bright stock fall chinook contributed primarily to Alaska and British Columbia ocean commercial, Columbia Basin gillnet and other freshwater fisheries. Contribution of Rogue stock fall chinook released in the lower Columbia River occurred primarily in Oregon ocean commercial, Columbia Basin gillnet and other freshwater fisheries. Willamette stock spring chinook contributed primarily to Alaska and British Columbia ocean, and Columbia Basin non-gillnet fisheries. Willamette stock spring chinook released by CEDC contributed to similar ocean fisheries, but had much higher catch in Columbia Basin gillnet fisheries than the same stocks released in the Willamette Basin. Up-river stocks of spring chinook contributed almost exclusively to Columbia Basin fisheries. The up-river stocks of Columbia River summer steelhead contributed almost exclusively to the Columbia Basin gillnet and other freshwater fisheries. Coho ocean fisheries from Washington to California were closed or very limited from 1994 through 1999 (1991 through 1996 broods). This has resulted in a lower percent of catch in Washington, Oregon and California ocean fisheries, and a higher percent of catch in Alaska and British Columbia ocean and Columbia Basin freshwater fisheries. Coho stocks released by ODFW below Bonneville Dam were caught mainly in Oregon, Washington, and British Columbia ocean, Columbia gillnet and other freshwater fisheries. Coho stocks released in the Klaskanine River and Youngs Bay area had similar ocean catch distributions, but a much higher percent catch in gillnet fisheries than the other coho releases. Ocean catch

  13. Annual Coded Wire Tag Program; Oregon Missing Production Groups, 1996 Annual Report.

    SciTech Connect

    Lewis, Mark A.; Mallette, Christine; Murray, William M.

    1998-03-01

    This annual report is in fulfillment of contract obligations with Bonneville Power Administration which is the funding source for the Oregon Department of Fish and Wildlife's Annual Coded Wire Tag Program - Oregon Missing Production Groups Project. Tule stock fall chinook were caught primarily in British Columbia and Washington ocean, and Oregon freshwater fisheries. Up-river bright stock fall chinook contributed primarily to Alaska and British Columbia ocean commercial, and Columbia River gillnet and other freshwater fisheries. Contribution of Rogue stock fall chinook released in the lower Columbia River occurred primarily in Oregon ocean commercial and Columbia River gillnet fisheries. Willamette stock spring chinook contributed primarily to Alaska and British Columbia ocean commercial, Oregon freshwater sport and Columbia River gillnet fisheries. Willamette stock spring chinook released by CEDC contributed to similar fisheries as the same stocks released in the Willamette system. Up-river stocks of spring chinook contributed almost exclusively to Columbia River sport fisheries and other freshwater recovery areas. The up-river stocks of Columbia River summer steelhead contributed primarily to the Columbia River gillnet and other freshwater fisheries. Coho ocean fisheries from Washington to California were closed or very limited in 1994 and 1995 (1991 and 1992 broods). This has resulted in a greater average percent of catch for other fishery areas. Coho stocks released by ODFW below Bonneville Dam contributed mainly to Oregon and Washington ocean, Columbia gillnet and other freshwater fisheries. Coho stocks released in the Klaskanine River and Youngs Bay area had much higher contribution to gillnet fisheries than the other coho releases. Coho stocks released above Bonneville Dam contributed to the same fisheries as those released below Bonneville Dam.
Survival rates of salmon and steelhead are influenced, not only by factors in the hatchery (disease, density, diet

  14. kspectrum: an open-source code for high-resolution molecular absorption spectra production

    NASA Astrophysics Data System (ADS)

    Eymet, V.; Coustet, C.; Piaud, B.

    2016-01-01

    We present kspectrum, a scientific code that produces high-resolution synthetic absorption spectra from public molecular transition parameter databases. This code was originally required by the atmospheric and astrophysics communities, and its evolution is now driven by new scientific projects among the user community. Since it was designed without any optimization specific to a particular application field, its use can also be extended to other domains. kspectrum produces spectral data that can subsequently be used either for high-resolution radiative transfer simulations or for producing statistical spectral model parameters using additional tools. This is an open project that aims to provide an up-to-date tool that takes advantage of modern computational hardware and recent parallelization libraries. It is currently provided by Méso-Star (http://www.meso-star.com) under the CeCILL license, and benefits from regular updates and improvements.

  15. Annual Coded Wire Tag Program; Oregon Missing Production Groups, 1994 Annual Report.

    SciTech Connect

    Garrison, Robert L.; Isaac, Dennis L.; Lewis, Mark A.

    1994-12-01

    The goal of this program is to develop the ability to estimate hatchery production survival values and evaluate the effectiveness of Oregon hatcheries. To accomplish this goal, we are tagging missing production groups within hatcheries to ensure each production group is identifiable, allowing future evaluation upon recovery of tag data.

  16. Assessment of the production of medical isotopes using the Monte Carlo code FLUKA: Simulations against experimental measurements

    NASA Astrophysics Data System (ADS)

    Infantino, Angelo; Oehlke, Elisabeth; Mostacci, Domiziano; Schaffer, Paul; Trinczek, Michael; Hoehr, Cornelia

    2016-01-01

    The Monte Carlo code FLUKA is used to simulate the production of a number of positron emitting radionuclides, 18F, 13N, 94Tc, 44Sc, 68Ga, 86Y, 89Zr, 52Mn, 61Cu and 55Co, on a small medical cyclotron with a proton beam energy of 13 MeV. Experimental data collected at the TR13 cyclotron at TRIUMF agree within a factor of 0.6 ± 0.4 with the directly simulated data, except for the production of 55Co, where the simulation underestimates the experiment by a factor of 3.4 ± 0.4. The experimental data also agree within a factor of 0.8 ± 0.6 with the convolution of simulated proton fluence and cross sections from literature. Overall, this confirms the applicability of FLUKA to simulate radionuclide production at 13 MeV proton beam energy.
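    The "convolution of simulated proton fluence and cross sections" used for the second comparison amounts to integrating fluence(E) × sigma(E) over energy to obtain a production rate per target nucleus. A toy numerical sketch with invented flat spectra (not TRIUMF or FLUKA data):

```python
import numpy as np

# Invented illustration values: a flat proton fluence and a flat cross
# section over a 5-13 MeV window, roughly mimicking a 13 MeV beam degrading
# in a thick target.
E = np.linspace(5.0, 13.0, 9)        # proton energy grid, MeV
fluence = np.full_like(E, 2.0)       # protons / (cm^2 s MeV), assumed
sigma = np.full_like(E, 0.5)         # cross section, barn, assumed

# Production rate per target nucleus: trapezoidal integral of
# fluence(E) * sigma(E) over energy, with barn converted to cm^2.
integrand = fluence * sigma
rate_per_nucleus = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(E)) * 1e-24
```

Multiplying by the number of target nuclei and the irradiation time (with decay during bombardment folded in) would give the activity to compare against measurement.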

  17. Automatic detection of blood versus non-blood regions on intravascular ultrasound (IVUS) images using wavelet packet signatures

    NASA Astrophysics Data System (ADS)

    Katouzian, Amin; Baseri, Babak; Konofagou, Elisa E.; Laine, Andrew F.

    2008-03-01

    Intravascular ultrasound (IVUS) has been proven a reliable imaging modality that is widely employed in cardiac interventional procedures. It can provide morphologic as well as pathologic information on the occluded plaques in the coronary arteries. In this paper, we present a new technique using wavelet packet analysis that differentiates between blood and non-blood regions in IVUS images. We utilized a multi-channel texture segmentation algorithm based on discrete wavelet packet frames (DWPF). A k-means clustering algorithm was deployed to partition the extracted textural features into blood and non-blood in an unsupervised fashion. Finally, the geometric and statistical information of the segmented regions was used to estimate the set of pixels closest to the lumen border, and a spline curve was fitted to the set. The presented algorithm may be helpful in delineating the lumen border automatically and more reliably prior to the process of plaque characterization, especially with 40 MHz transducers, where the appearance of red blood cells renders border detection more challenging, even manually. Experimental results are shown and quantitatively compared with borders manually traced by an expert. It is concluded that our two-dimensional (2-D) algorithm, which is independent of cardiac and catheter motions, performs well in both in-vivo and in-vitro cases.
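    The pipeline the abstract outlines, texture features from a wavelet decomposition followed by unsupervised two-class clustering, can be sketched in a few lines. The sketch below substitutes a single-level Haar-style detail energy for the full DWPF feature set and a minimal hand-rolled k-means; all data are synthetic, not IVUS images:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image": a smooth (blood-like) region beside a speckled (non-blood)
# region, side by side.
img = np.concatenate([rng.normal(0.0, 0.05, (32, 32)),
                      rng.normal(0.0, 1.0, (32, 32))], axis=1)

# One-level Haar-style detail energy per 2x2 block as a crude texture feature.
h = img[:, 1::2] - img[:, 0::2]                  # horizontal differences
v = img[1::2, :] - img[0::2, :]                  # vertical differences
energy = h[0::2, :]**2 + h[1::2, :]**2 + v[:, 0::2]**2 + v[:, 1::2]**2
x = np.log(energy).reshape(-1, 1)                # log-energy separates textures

# Minimal 2-means clustering on the scalar feature.
centers = np.array([x.min(), x.max()])
for _ in range(20):
    labels = (np.abs(x - centers[0]) > np.abs(x - centers[1])).astype(int).ravel()
    centers = np.array([x[labels == k].mean() for k in (0, 1)])

block_labels = labels.reshape(16, 32)            # one label per 2x2 image block
```

With the two textures this well separated, the low-energy cluster (label 0) covers the smooth half and the high-energy cluster the speckled half; the paper's method would then post-process such a label map geometrically to trace the lumen border.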

  18. Performance of RELAP/SCDAPSIM Code on Fission Products Transport Prediction

    SciTech Connect

    Honaiser, Eduardo

    2006-07-01

    Fission product transport in the piping system of primary circuits is an important area of study in the field of severe accidents. Fission product transport comprises all phenomena occurring from the nuclear core to the containment release site. Once released into the flow channels, fission products can condense on the piping walls and nucleate aerosols, which can agglomerate and/or deposit on the piping walls. This phenomenology occurs in a steam-hydrogen convective environment. A model (FPTRAN) was developed for the program RELAP/SCDAPSIM that calculates all phenomena related to fission product transport through the piping system. The model solves a set of differential equations whose coefficients represent the rates at which material transfers among the several states. The processes considered were vapor adsorption and condensation on the piping walls, aerosol formation and growth (condensation and agglomeration), and aerosol deposition. The model also tracks the aerosol particle size distribution. The PHEBUS experiments comprise the most complete experimental program ever conducted for the understanding of fission product behavior in the reactor cooling system and containment. It employs a reactor to generate fission products, which are transported through a scaled piping system simulating the primary circuit of a pressurized water reactor (PWR). Along the piping system, several instruments are installed to measure the amount of fission products deposited and their states. This paper describes the modeling of the experiment Phebus FPT-01 using RELAP/SCDAPSIM and compares simulation and experimental results to assess the performance of the FPTRAN module on fission product transport prediction. The results can be considered satisfactory, except for iodine; this discrepancy is probably due to an incorrect chemical form assumed for iodine. (author)
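    The structure of such a transport model, a set of coupled first-order rate equations moving mass among vapor, airborne-aerosol, and wall-deposit states, can be illustrated with a toy explicit-Euler integration. The rate constants below are invented for the sketch, not FPTRAN values:

```python
import numpy as np

# Assumed first-order rates (1/s): vapor condensation onto walls, aerosol
# nucleation from vapor, and aerosol deposition onto walls.
k_cond, k_nucl, k_depo = 0.05, 0.10, 0.02

def step(state, dt):
    """One explicit-Euler step of the coupled rate equations."""
    vapor, aerosol, wall = state
    d_vapor = -(k_cond + k_nucl) * vapor
    d_aerosol = k_nucl * vapor - k_depo * aerosol
    d_wall = k_cond * vapor + k_depo * aerosol
    return state + dt * np.array([d_vapor, d_aerosol, d_wall])

state = np.array([1.0, 0.0, 0.0])   # all mass starts as vapor
for _ in range(10000):              # integrate 100 s with dt = 0.01 s
    state = step(state, 0.01)
```

Because every sink of one state is a source of another, total mass is conserved at each step; after long times the vapor is exhausted and the wall deposit dominates, which is the qualitative behavior the deposition measurements in PHEBUS probe.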

  19. Numerical simulation of gamma ray and neutron production on lunar surface using MCNPX code

    NASA Astrophysics Data System (ADS)

    Kim, Kyeong Ja

    The production of gamma rays and neutrons on various planetary surfaces has been investigated for a few decades using remote sensing techniques. These radiations are produced by nuclear reactions between incoming cosmic rays (i.e., galactic cosmic rays and solar cosmic rays) and planetary materials. In the case of the Moon, unlike Mars, returned samples from previous lunar missions provide much more realistic parameters for understanding lunar surface composition, although that composition varies considerably. These constraints restrict the parameter settings for model calculations aimed at understanding the radiation environment. The recent SELENE (KAGUYA) gamma ray spectrometer provides elemental lunar surface maps using major gamma ray lines, such as those of Fe, Si, Al, U, Th, and Ti, toward understanding the evolution of the lunar surface. The effect of size or sub-layered soil deposits on the production of gamma rays and neutrons can be effectively understood using model calculations. Our Monte Carlo N-Particle eXtended (MCNPX) model provides the gamma ray and neutron production for various soil settings. Our study demonstrates gamma ray and neutron production on lunar surfaces for various soil settings using numerical simulation with MCNPX.

  20. Spanish and English Word-Initial Voiceless Stop Production in Code-Switched vs. Monolingual Structures

    ERIC Educational Resources Information Center

    Gonzalez Lopez, Veronica

    2012-01-01

    The present study examines the production outcomes of late second language (L2) learners in order to determine whether the mechanisms that allow the creation of phonetic categories remain available across the lifespan, as the Speech Learning Model (SLM) claims. In addition, the study focuses on the type of interaction that exists between the first…

  1. 50 CFR Table 1a to Part 679 - Delivery Condition and Product Codes

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... (includes offsite production).DO NOT RECORD ON PTR. 42 Bones (if meal, report as 32) (ancillary only). 39... body behind head and in front of tail. 21 Fish meal. Meal from whole fish or fish parts; includes bone..., Western cut.Head removed just in front of the collar bone, and viscera removed. 07 Headed and...

  2. Visualizing how Seismic Waves Propagate Across Seismic Arrays using the IRIS DMS Ground Motion Visualization (GMV) Products and Codes

    NASA Astrophysics Data System (ADS)

    Taber, J.; Bahavar, M.; Bravo, T. K.; Butler, R. F.; Kilb, D. L.; Trabant, C.; Woodward, R.; Ammon, C. J.

    2011-12-01

    Data from dense seismic arrays can be used to visualize the propagation of seismic waves, resulting in animations effective for teaching both general and advanced audiences. One of the first visualizations of this type was developed using Objective C code and EarthScope/USArray data, which was then modified and ported to the Matlab platform and has now been standardized and automated as an IRIS Data Management System (IRIS-DMS) data product. These iterative code developments and improvements were completed by C. Ammon, R. Woodward and M. Bahavar, respectively. Currently, an automated script creates Ground Motion Visualizations (GMVs) for all global earthquakes over magnitude 6 recorded by EarthScope's USArray Transportable Array (USArray TA) network. The USArray TA network is a rolling array of 400 broadband stations deployed on a uniform 70-km grid. These near real-time GMV visualizations are typically available for download within 4 hours or less of their occurrence (see: www.iris.edu/dms/products/usarraygmv/). The IRIS-DMS group has recently added a feature that allows users to highlight key elements within the GMVs, by providing an online tool for creating customized GMVs. This new interface allows users to select the stations, channels, and time window of interest, adjust the mapped areal extent of the view, and specify high and low pass filters. An online tutorial available from the IRIS Education and Public Outreach (IRIS-EPO) website, listed below, steps through a teaching sequence that can be used to explain the basic features of the GMVs. For example, they can be used to demonstrate simple concepts such as relative P, S and surface wave velocities and corresponding wavelengths for middle-school students, or more advanced concepts such as the influence of focal mechanism on waveforms, or how seismic waves converge at an earthquake's antipode. For those who desire a greater level of customization, including the ability to use the GMV framework with data

  3. QR Codes 101

    ERIC Educational Resources Information Center

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  4. LINE: a code which simulates spectral line shapes for fusion reaction products generated by various speed distributions

    SciTech Connect

    Slaughter, D.

    1985-03-01

    A computer code is described which estimates the energy spectrum or ''line-shape'' for the charged particles and gamma-rays produced by the fusion of low-Z ions in a hot plasma. The simulation has several ''built-in'' ion velocity distributions characteristic of heated plasmas, and it also accepts arbitrary speed and angular distributions, although they must all be symmetric about the z-axis. An energy spectrum of one of the reaction products (ion, neutron, or gamma-ray) is calculated at one angle with respect to the symmetry axis. The results are presented in tabular form and plotted graphically, and the moments of the spectrum up to order ten are calculated both with respect to the origin and with respect to the mean.

  5. Detonation-product behavior at large expansion: the underwater detonation of nitromethane. [KOEOP code

    SciTech Connect

    Helm, F.H. Jr.; Chambers, E.S.; Lee, E.; Finger, M.; McGuire, R.R.; Mahler, J.P.; Cheung, H.; Cramer, J.L.

    1980-12-01

    The expanding product gases of explosives used in rock fracturing, cratering, and air or underwater explosions do work on the surroundings even at relatively low pressures. To characterize explosives for these applications it is necessary to obtain an almost complete expansion history. An underwater test which was used to measure the shock-wave travel and gas bubble resulting from the detonation of 2 kg of nitromethane is described. The bubble expansion was photographically measured to a volume of about 80 times the initial volume of the shell. These experimental measurements compared well with those calculated by a one-dimensional hydrodynamic program, KOELAS. KOELAS also provided data to assist in calculating the apparent position of the shell containing the explosive, using refractive index gradients and the bubble position as a function of time. A method for estimating optical ray paths penetrating the expanding shock front, and a modification to a streaking camera that permits stable operation at low rotor speeds, are described. The expansion exhibited a minimum pressure of 2 MPa (20 bars), a value in the range of rock blasting applications, underwater explosions, and rock fracturing or cratering.

  6. FEWZ 2.0: A code for hadronic Z production at next-to-next-to-leading order

    NASA Astrophysics Data System (ADS)

    Gavin, Ryan; Li, Ye; Petriello, Frank; Quackenbush, Seth

    2011-11-01

    We introduce an improved version of the simulation code FEWZ (Fully Exclusive W and Z Production) for hadron collider production of lepton pairs through the Drell-Yan process at next-to-next-to-leading order (NNLO) in the strong coupling constant. The program is fully differential in the phase space of leptons and additional hadronic radiation. The new version offers users significantly more options for customization. FEWZ now bins multiple, user-selectable histograms during a single run, and produces parton distribution function (PDF) errors automatically. It also features a significantly improved integration routine, and can take advantage of multiple processor cores locally or on the Condor distributed computing system. We illustrate the new features of FEWZ by presenting numerous phenomenological results for LHC physics. We compare NNLO QCD with initial ATLAS and CMS results, and discuss in detail the effects of detector acceptance on the measurement of angular quantities associated with Z-boson production. We address the issue of technical precision in the presence of severe phase-space cuts.
    Program summary
    Program title: FEWZ
    Catalogue identifier: AEJP_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJP_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 6 280 771
    No. of bytes in distributed program, including test data, etc.: 173 027 645
    Distribution format: tar.gz
    Programming language: Fortran 77, C++, Python
    Computer: Mac, PC
    Operating system: Mac OSX, Unix/Linux
    Has the code been vectorized or parallelized?: Yes. User-selectable, 1 to 2^19
    RAM: 200 Mbytes for common parton distribution functions
    Classification: 11.1
    External routines: CUBA numerical integration library, numerous parton distribution sets (see text); these are provided with the code

  7. Accuracy of Single Frequency GPS Observations Processing In Near Real-time With Use of Code Predicted Products

    NASA Astrophysics Data System (ADS)

    Wielgosz, P. A.

    This year, a system of active geodetic GPS permanent stations is to be established in Poland. The system should provide GPS observations for a wide spectrum of users and will be an especially great opportunity for surveyors. Many surveyors still use cheaper, single frequency receivers. This paper focuses on the processing of single frequency GPS observations only. During the processing of such observations the ionosphere plays an important role, so we concentrated on the influence of the ionosphere on the positional coordinates. Twenty consecutive days of GPS data from 2001 were processed to analyze the accuracy of a derived three-dimensional relative vector position between GPS stations. Observations from two Polish EPN/IGS stations, BOGO and JOZE, were used. In addition, a new test station, IGIK, was created. In this paper, the results of processing single frequency GPS observations in near real-time are presented. Baselines of 15, 27 and 42 kilometers and sessions of 1, 2, 3, 4, and 6 hours were processed. In the processing we used CODE (Centre for Orbit Determination in Europe, Bern, Switzerland) predicted products: orbits and ionosphere information. These products are available in real time and enable near real-time processing. The Bernese v. 4.2 software for Linux in BPE (Bernese Processing Engine) mode was used. The results are shown with reference to the dual frequency weekly solution (the best solution). The accuracy obtained is presented as a function of session length and baseline length for single frequency GPS observations.

  8. [Changes in intraoperative and postoperative colloid oncotic pressure after open heart surgery especially in relation to non-blood priming or blood priming].

    PubMed

    Kanazawa, M; Fujimura, Y; Kawamura, T; Takaki, Y; Okada, H; Nishi, K; Tsuboi, H; Esato, K

    1995-01-01

    In order to evaluate whether the colloid oncotic pressure (COP) is a useful index of hemodynamic and respiratory recovery after open heart surgery, cardiac index (CI), pulmonary capillary wedge pressure (PCWP), (A-a)DO2 and COP were measured in 34 patients during the 48 hours after cardiopulmonary bypass (CPB). The patients were divided into a non-blood priming group (11 patients) and a blood priming group (23 patients). In addition, the blood priming group was divided into two subgroups: one requiring dopamine (more than 15 micrograms/kg/min), epinephrine or intraaortic balloon pumping after open heart surgery (severe blood subgroup, n = 6) and the other without these treatments (slight blood subgroup, n = 17). The COP levels in the non-blood priming group were significantly higher than those in the blood priming group from aortic cross-clamp to 10 minutes after aortic declamping (p < 0.01). From 1 to 48 hours after CPB, COP in the non-blood priming group and the slight blood subgroup was significantly higher than in the severe blood subgroup (p < 0.05). CI and COP-PCWP levels were significantly higher in the non-blood priming group and the slight blood subgroup than in the severe blood subgroup (p < 0.05). It is concluded that COP is a useful index of hemodynamic and respiratory recovery after open heart surgery and that our non-blood priming system is effective in eliminating blood transfusion. PMID:7869637

  9. KUGEL: a thermal, hydraulic, fuel performance, and gaseous fission product release code for pebble bed reactor core analysis

    SciTech Connect

    Shamasundar, B.I.; Fehrenbach, M.E.

    1981-05-01

    The KUGEL computer code is designed to perform thermal/hydraulic analysis and coated-fuel particle performance calculations for axisymmetric pebble bed reactor (PBR) cores. This computer code was developed as part of a Department of Energy (DOE)-funded study designed to verify the published core performance data on PBRs. The KUGEL code is designed to interface directly with the 2DB code, a two-dimensional neutron diffusion code, to obtain distributions of thermal power, fission rate, fuel burnup, and fast neutron fluence, which are needed for thermal/hydraulic and fuel performance calculations. The code is variably dimensioned so that problem size can be easily varied. An interpolation routine allows variable mesh size to be used between the 2DB output and the two-dimensional thermal/hydraulic calculations.

  10. Clinical coding. Code breakers.

    PubMed

    Mathieson, Steve

    2005-02-24

    --The advent of payment by results has seen the role of the clinical coder pushed to the fore in England. --Examinations for a clinical coding qualification began in 1999. In 2004, approximately 200 people took the qualification. --Trusts are attracting people to the role by offering training from scratch or through modern apprenticeships. PMID:15768716

  11. Production of secondary particles and nuclei in cosmic rays collisions with the interstellar gas using the FLUKA code

    NASA Astrophysics Data System (ADS)

    Mazziotta, M. N.; Cerutti, F.; Ferrari, A.; Gaggero, D.; Loparco, F.; Sala, P. R.

    2016-08-01

    The measured fluxes of secondary particles produced by the interactions of Cosmic Rays (CRs) with the astronomical environment play a crucial role in understanding the physics of CR transport. In this work we present a comprehensive calculation of the secondary hadron, lepton, gamma-ray and neutrino yields produced by the inelastic interactions between several species of stable or long-lived cosmic-ray projectiles (p, D, T, 3He, 4He, 6Li, 7Li, 9Be, 10Be, 10B, 11B, 12C, 13C, 14C, 14N, 15N, 16O, 17O, 18O, 20Ne, 24Mg and 28Si) and different target gas nuclei (p, 4He, 12C, 14N, 16O, 20Ne, 24Mg, 28Si and 40Ar). The yields are calculated using FLUKA, a simulation package designed to compute the energy distributions of secondary products with high accuracy over a wide energy range. The present results provide, for the first time, a complete and self-consistent set of all the relevant inclusive cross sections covering the whole spectrum of secondary products in nuclear collisions. We cover, for the projectiles, a kinetic energy range extending from 0.1 GeV/n up to 100 TeV/n in the lab frame. In order to show the importance of our results for multi-messenger studies of the physics of CR propagation, we evaluate the propagated spectra of Galactic secondary nuclei, leptons, and gamma rays produced by the interactions of CRs with the interstellar gas, exploiting the numerical codes DRAGON and GammaSky. We show that, adopting our cross section database, we are able to provide a good fit of a complete sample of CR observables, including: leptonic and hadronic spectra measured at Earth, the local interstellar spectra measured by Voyager, and the gamma-ray emissivities from the Fermi-LAT collaboration. We also show a set of gamma-ray and neutrino full-sky maps and spectra.

  12. Development of Tritium Permeation Analysis Code and Tritium Transport in a High Temperature Gas-Cooled Reactor Coupled with Hydrogen Production System

    SciTech Connect

    Chang H. Oh; Eung S. Kim; Mike Patterson

    2010-06-01

    A tritium permeation analysis code (TPAC) was developed by Idaho National Laboratory for the purpose of analyzing tritium distributions in very high temperature reactor (VHTR) systems, including integrated hydrogen production systems. The MATLAB SIMULINK software package was used in developing the code. The TPAC is based on the mass balance equations of tritium-containing species and various forms of hydrogen, coupled with a variety of tritium sources, sinks, and permeation models. In the TPAC, ternary fission and neutron reactions with 6Li, 7Li, 10B, and 3He were taken into consideration as tritium sources. Purification and leakage models were implemented as the main tritium sinks. Permeation of tritium and H2 through pipes, vessels, and heat exchangers was considered the main tritium transport path. In addition, electrolyzer and isotope exchange models were developed for analyzing hydrogen production systems, including high temperature electrolysis and sulfur-iodine processes.
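    The kind of mass balance such a code solves can be illustrated with a single-species toy equation: a constant tritium source balanced against radioactive decay and first-order purification and permeation sinks. All rates below except the tritium half-life are invented for the sketch:

```python
import math

# Toy balance: dN/dt = S - (lam + k_purif + k_perm) * N
S = 1.0e10                    # tritium production rate, atoms/s (assumed)
half_life_s = 12.32 * 365.25 * 24 * 3600   # tritium half-life, ~12.32 years
lam = math.log(2) / half_life_s            # decay constant, 1/s
k_purif = 1.0e-6              # purification removal rate, 1/s (assumed)
k_perm = 2.0e-7               # permeation loss rate, 1/s (assumed)

# Setting dN/dt = 0 gives the steady-state inventory.
N_ss = S / (lam + k_purif + k_perm)
```

With these rates the engineered sinks dominate decay by nearly three orders of magnitude, which is the kind of comparison a tritium transport analysis makes when sizing purification systems against permeation losses.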

  13. 21 CFR 607.35 - Notification of registrant; blood product establishment registration number and NDC Labeler Code.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 7 2013-04-01 2013-04-01 false Notification of registrant; blood product... PRODUCT LISTING FOR MANUFACTURERS OF HUMAN BLOOD AND BLOOD PRODUCTS Procedures for Domestic Blood Product Establishments § 607.35 Notification of registrant; blood product establishment registration number and...

  14. 21 CFR 607.35 - Notification of registrant; blood product establishment registration number and NDC Labeler Code.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Notification of registrant; blood product... PRODUCT LISTING FOR MANUFACTURERS OF HUMAN BLOOD AND BLOOD PRODUCTS Procedures for Domestic Blood Product Establishments § 607.35 Notification of registrant; blood product establishment registration number and...

  15. 21 CFR 607.35 - Notification of registrant; blood product establishment registration number and NDC Labeler Code.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 7 2014-04-01 2014-04-01 false Notification of registrant; blood product... PRODUCT LISTING FOR MANUFACTURERS OF HUMAN BLOOD AND BLOOD PRODUCTS Procedures for Domestic Blood Product Establishments § 607.35 Notification of registrant; blood product establishment registration number and...

  16. 21 CFR 607.35 - Notification of registrant; blood product establishment registration number and NDC Labeler Code.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Notification of registrant; blood product... PRODUCT LISTING FOR MANUFACTURERS OF HUMAN BLOOD AND BLOOD PRODUCTS Procedures for Domestic Blood Product Establishments § 607.35 Notification of registrant; blood product establishment registration number and...

  17. 21 CFR 607.35 - Notification of registrant; blood product establishment registration number and NDC Labeler Code.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 7 2012-04-01 2012-04-01 false Notification of registrant; blood product... PRODUCT LISTING FOR MANUFACTURERS OF HUMAN BLOOD AND BLOOD PRODUCTS Procedures for Domestic Blood Product Establishments § 607.35 Notification of registrant; blood product establishment registration number and...

  18. A comparison of the nutritional quality of food products advertised in grocery store circulars of high- versus low-income New York City zip codes.

    PubMed

    Ethan, Danna; Basch, Corey H; Rajan, Sonali; Samuel, Lalitha; Hammond, Rodney N

    2014-01-01

    Grocery stores can be an important resource for health and nutrition, given the variety and economic value of the foods they offer. Weekly circulars are a means of promoting foods at a sale price. To date, little is known about the extent to which nutritious foods are advertised and prominently placed in circulars. This study's aim was to compare the nutritional quality of products advertised on the front page of online circulars from grocery stores in high- versus low-income neighborhoods in New York City (NYC). Circulars from grocery stores in the five highest and five lowest median-household-income NYC zip codes were analyzed. Nutrition information for food products was collected over a two-month period, with a total of 805 products coded. The study found no significant difference in the nutritional quality of products advertised by the two groups. In both groups, almost two-thirds of the products advertised were processed, one-quarter were high in carbohydrates, and few to no products were low-sodium, high-fiber, or reduced-, low-, or zero-fat. Through innovative partnerships with health professionals, grocery stores are increasingly implementing in-store and online health promotion strategies. Weekly circulars can be used as a means to regularly advertise and prominently place more healthful and seasonal foods at an affordable price, particularly for populations at higher risk for nutrition-related chronic disease. PMID:24384775

  19. An overview of (124)I production at a medical cyclotron by ALICE/ASH, EMPIRE-3.2.2 and TALYS-1.6 codes.

    PubMed

    Azizakram, Hamid; Sadeghi, Mahdi; Ashtari, Parviz; Zolfagharpour, Farhad

    2016-06-01

    Excitation functions of proton-, deuteron- and alpha-induced nuclear reactions on enriched tellurium and antimony isotopes, as well as natural antimony, were calculated with the ALICE/ASH, EMPIRE-3.2.2 and TALYS-1.6 nuclear codes. The production yield of the (124)I nuclide and the impurities formed were then calculated from the evaluation results. The calculated cross sections were compared with experimental values available in the literature. According to the results, the (124)Te(p,n)(124)I reaction is the method of choice, owing to the formation of a higher amount of (124)I and low levels of (125)I, a major impurity concern in (124)I production. PMID:27060194

  20. Speech coding

    NASA Astrophysics Data System (ADS)

    Gersho, Allen

    1990-05-01

    Recent advances in algorithms and techniques for speech coding now permit high-quality voice reproduction at remarkably low bit rates. The advent of powerful single-chip signal processors has made it cost-effective to implement these new and sophisticated speech coding algorithms for many important applications in voice communication and storage. Some of the main ideas underlying the algorithms of major interest today are reviewed. The concept of removing redundancy by linear prediction is reviewed, first in the context of predictive quantization or DPCM. Then linear predictive coding, adaptive predictive coding, and vector quantization are discussed. The concepts of excitation coding via analysis-by-synthesis, vector-sum excitation codebooks, and adaptive postfiltering are explained. The main ideas of vector excitation coding (VXC), or code-excited linear prediction (CELP), are presented. Finally, low-delay VXC coding and phonetic segmentation for VXC are described.
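
    The idea of removing redundancy by linear prediction can be illustrated with a minimal first-order closed-loop DPCM sketch; the predictor coefficient and quantizer step size are arbitrary illustrative choices, not values from this survey:

```python
import math

def dpcm_encode(samples, a=0.9, step=0.1):
    """First-order DPCM: quantize the prediction residual, not the sample itself.
    The encoder tracks the decoder's reconstruction so errors do not accumulate."""
    pred = 0.0
    codes = []
    for x in samples:
        residual = x - a * pred
        q = round(residual / step)      # uniform scalar quantizer index
        codes.append(q)
        pred = a * pred + q * step      # decoder-matched reconstruction
    return codes

def dpcm_decode(codes, a=0.9, step=0.1):
    pred = 0.0
    out = []
    for q in codes:
        pred = a * pred + q * step      # same recursion as the encoder's local decoder
        out.append(pred)
    return out

signal = [math.sin(0.1 * n) for n in range(50)]
rec = dpcm_decode(dpcm_encode(signal))
err = max(abs(x, ) if False else abs(x - y) for x, y in zip(signal, rec))
```

    Because the quantizer acts on the (small) residual rather than the raw sample, the per-sample reconstruction error is bounded by half the quantizer step.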

  1. Theory of epigenetic coding.

    PubMed

    Elder, D

    1984-06-01

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code: the combinatorial to coding the identity of units, the non-combinatorial to coding the production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward. PMID:6748695

  2. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Pollara, Fabrizio; Hamkins, Jon; Dolinar, Sam; Andrews, Ken; Divsalar, Dariush

    2006-01-01

    This viewgraph presentation reviews uplink coding. The purpose and goals of the briefing are to (1) show a plan for using uplink coding and describe its benefits; (2) define possible solutions and their applicability to different types of uplink, including emergency uplink; (3) concur with our conclusions so we can embark on a plan to use the proposed uplink system; (4) identify the need for the development of appropriate technology and its infusion in the DSN; and (5) gain advocacy to implement uplink coding in flight projects. Action Item EMB04-1-14 -- Show a plan for using uplink coding, including showing where it is useful or not (include discussion of emergency uplink coding).

  3. Raising the standard: changes to the Australian Code of Good Manufacturing Practice (cGMP) for human blood and blood components, human tissues and human cellular therapy products.

    PubMed

    Wright, Craig; Velickovic, Zlatibor; Brown, Ross; Larsen, Stephen; Macpherson, Janet L; Gibson, John; Rasko, John E J

    2014-04-01

    In Australia, manufacture of blood, tissues and biologicals must comply with the federal laws and meet the requirements of the Therapeutic Goods Administration (TGA) Manufacturing Principles as outlined in the current Code of Good Manufacturing Practice (cGMP). The Therapeutic Goods Order (TGO) No. 88 was announced concurrently with the new cGMP, as a new standard for therapeutic goods. This order constitutes a minimum standard for human blood, tissues and cellular therapeutic goods aimed at minimising the risk of infectious disease transmission. The order sets out specific requirements relating to donor selection, donor testing and minimisation of infectious disease transmission from collection and manufacture of these products. The Therapeutic Goods Manufacturing Principles Determination No. 1 of 2013 references the human blood and blood components, human tissues and human cellular therapy products 2013 (2013 cGMP). The name change for the 2013 cGMP has allowed a broadening of the scope of products to include human cellular therapy products. It is difficult to directly compare versions of the code as deletion of some clauses has not changed the requirements to be met, as they are found elsewhere amongst the various guidelines provided. Many sections that were specific for blood and blood components are now less prescriptive and apply to a wider range of cellular therapies, but the general overall intent remains the same. Use of 'should' throughout the document instead of 'must' allows flexibility for alternative processes, but these systems will still require justification by relevant logical argument and validation data to be acceptable to TGA. The cGMP has seemingly evolved so that specific issues identified at audit over the last decade have now been formalised in the new version. There is a notable risk management approach applied to most areas that refer to process justification and decision making. These requirements commenced on 31 May 2013 and a 12 month

  4. Computer Code

    NASA Technical Reports Server (NTRS)

    1985-01-01

    COSMIC MINIVER, a computer code developed by NASA for analyzing aerodynamic heating and heat transfer on the Space Shuttle, has been used by Marquardt Company to analyze heat transfer on Navy/Air Force missile bodies. The code analyzes heat transfer by four different methods which can be compared for accuracy. MINIVER saved Marquardt three months in computer time and $15,000.

  5. DNA codes

    SciTech Connect

    Torney, D. C.

    2001-01-01

    We have begun to characterize a variety of codes, motivated by potential implementation as (quaternary) DNA n-sequences, with letters denoted A, C, G, and T. The first codes we studied are the most reminiscent of conventional group codes. For these codes, Hamming similarity was generalized so that the score for matched letters takes more than one value, depending upon which letters are matched [2]. These codes consist of n-sequences satisfying an upper bound on the similarities, summed over the letter positions, of distinct codewords. We chose similarity 2 for matches of the letters A and T and 3 for matches of the letters C and G, providing a rough approximation to double-strand bond energies in DNA. An inherent novelty of DNA codes is 'reverse complementation'. The latter may be defined, as follows, not only for alphabets of size four but, more generally, for any even-size alphabet. All that is required is a matching of the letters of the alphabet: a partition into pairs. Then, the reverse complement of a codeword is obtained by reversing the order of its letters and replacing each letter by its match. For DNA, the matching is AT/CG because these are the Watson-Crick bonding pairs. Reversal arises because two DNA sequences form a double strand with opposite relative orientations. Thus, as will be described in detail, because in vitro decoding involves the formation of double-stranded DNA from two codewords, it is reasonable to assume - for universal applicability - that the reverse complement of any codeword is also a codeword. In particular, self-reverse complementary codewords are expressly forbidden in reverse-complement codes. Thus, an appropriate distance between all pairs of codewords must, when large, effectively prohibit binding between the respective codewords to form a double strand. Only reverse-complement pairs of codewords should be able to bind. For most applications, a DNA code is to be bi-partitioned, such that the reverse-complementary pairs are separated
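
    The reverse-complement and weighted-similarity operations defined in this abstract are easy to make concrete; a minimal sketch using the stated weights (2 for A/T matches, 3 for C/G matches), with all function names hypothetical:

```python
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}   # Watson-Crick matching AT/CG
WEIGHT = {"A": 2, "T": 2, "C": 3, "G": 3}         # rough bond-energy weights from the abstract

def reverse_complement(word):
    """Reverse the letter order and replace each letter by its match."""
    return "".join(PAIR[ch] for ch in reversed(word))

def similarity(u, v):
    """Generalized Hamming similarity: matched letters score 2 (A/T) or 3 (C/G)."""
    return sum(WEIGHT[a] for a, b in zip(u, v) if a == b)

def is_self_reverse_complementary(word):
    """Such codewords are expressly forbidden in reverse-complement codes."""
    return word == reverse_complement(word)
```

    For example, ACGT is its own reverse complement and would be excluded, while a code would also bound similarity(u, v) over all distinct codeword pairs.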

  6. An audit of health products and services marketed on chiropractic websites in Alberta and consideration of these practices in the context of chiropractic codes of conduct and ethics

    PubMed Central

    Page, Stacey A.

    2007-01-01

    Background: Chiropractic's success as a health care profession is evidenced in part by the rising number of practitioners. Paradoxically, this success may start to cost the profession, as the number of consumers may not be increasing proportionally. Fewer patients mean less income for practitioners. Some chiropractors are responding to these pressures by marketing health products and services. Objectives: To describe the extent to which Alberta chiropractors with websites sold health products and the extent to which fee discounts/service inducements were advertised, and to consider these practices in the context of chiropractic codes of conduct and ethics. Methods: Chiropractic websites in the province of Alberta were identified using the online Telus Business Finder and cross-referenced with the Yellow Pages print directories. The websites were searched and an inventory of the health products for sale was recorded. Fee discounts and service inducements were also recorded. Results: 56 websites were identified and reviewed. Just under two-thirds of the chiropractic websites surveyed contained information on health products for sale. Orthotics were sold most often (N = 29 practices; 51.8%), followed by pillows and supports (N = 15; 26.8%), vitamins/nutritional supplements (N = 15; 26.8%) and exercise/rehabilitation products (N = 10; 17.9%). Nine practices (16.1%) offered some type of inducement to potential customers. These included discounts on treatment packages (N = 2; 3.6%), free gait/posture analyses (N = 2; 3.6%) and free general consultations with the chiropractors (N = 3; 5.4%). Conclusions: The marketing of health care products and services by chiropractors in Alberta is common. Such practices raise ethical considerations for the profession. Professional guidelines vary on the acceptability of these practices. Consumer and practitioner perspectives and practices regarding retailing need to be further examined. PMID:17657302

  7. A New Model for Real-Time Regional Vertical Total Electron Content and Differential Code Bias Estimation Using IGS Real-Time Service (IGS-RTS) Products

    NASA Astrophysics Data System (ADS)

    Abdelazeem, Mohamed; Çelik, Rahmi N.; El-Rabbany, Ahmed

    2016-04-01

    The international global navigation satellite system (GNSS) real-time service (IGS-RTS) products have been used extensively for real-time precise point positioning and ionosphere modeling applications. In this study, we develop a regional model for real-time vertical total electron content (RT-VTEC) and differential code bias (RT-DCB) estimation over Europe using the IGS-RTS satellite orbit and clock products. The developed model has a spatial resolution of 1°×1° and a temporal resolution of 15 minutes. GPS observations from a regional network of 60 IGS and EUREF reference stations are processed in zero-difference mode using the Bernese-5.2 software package in order to extract the geometry-free linear combination of the smoothed code observations. A spherical harmonic expansion is used to model the VTEC and the receiver and satellite DCBs. To validate the proposed model, the RT-VTEC values are computed and compared with their final IGS global ionospheric map (IGS-GIM) counterparts on three successive days of high solar activity, including one day of extreme geomagnetic activity. The real-time satellite DCBs are also estimated and compared with the IGS-GIM counterparts. Moreover, the real-time receiver DCBs for six IGS stations are obtained and compared with the IGS-GIM counterparts. The examined stations are located at different latitudes and use different receiver types. The findings reveal that the estimated RT-VTEC values agree with the IGS-GIM counterparts, with root-mean-square error (RMSE) values below 2 TEC units. In addition, the RMSEs of the satellite and receiver DCBs are below 0.85 ns and 0.65 ns, respectively, in comparison with the IGS-GIM.
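
    As a toy illustration of modeling VTEC with a spherical harmonic expansion, the sketch below evaluates a degree-1 expansion at a given latitude and longitude; the coefficients are hypothetical, and the actual model in the study is of higher degree and estimated from GNSS observations:

```python
import math

def vtec(lat_deg, lon_deg, coeffs):
    """Toy degree-1 spherical-harmonic VTEC evaluation (TECU).
    coeffs maps (n, m) -> (a_nm, b_nm) for VTEC = sum P_nm(sin(lat)) *
    (a_nm * cos(m*lon) + b_nm * sin(m*lon))."""
    phi = math.radians(lat_deg)
    lam = math.radians(lon_deg)
    # Unnormalized associated Legendre functions up to degree 1
    legendre = {(0, 0): 1.0, (1, 0): math.sin(phi), (1, 1): math.cos(phi)}
    total = 0.0
    for (n, m), (a, b) in coeffs.items():
        total += legendre[(n, m)] * (a * math.cos(m * lam) + b * math.sin(m * lam))
    return total

# Hypothetical coefficients: a constant background plus a latitude gradient
coeffs = {(0, 0): (15.0, 0.0), (1, 0): (-5.0, 0.0)}
```

    With only zonal (m = 0) terms the toy map is longitude-independent; the real estimator solves for the full coefficient set, plus receiver and satellite DCBs, in a least-squares adjustment.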

  8. Production of Monoclonal Antibody Against Recombinant Polypeptide From the Erns Coding Region of the Bovine Viral Diarrhea Virus

    PubMed Central

    Seyfi Abad Shapouri, Masood Reza; Ekhtelat, Maryam; Ghorbanpoor Najaf Abadi, Masood; Mahmoodi Koohi, Pezhman; Lotfi, Mohsen

    2015-01-01

    Background: Bovine viral diarrhea (BVD) is an economically important cattle disease with a worldwide distribution. Detection and elimination of animals persistently infected (PI) with bovine viral diarrhea virus (BVDV) is essential for the control of BVD and eradication of BVDV. There are usually no pathognomonic clinical signs of BVDV infection. Diagnostic investigations therefore rely on laboratory-based detection of the virus, or of virus-induced antigens or antibodies. Objectives: Erns, as an immunogenic protein of BVDV, is genetically and antigenically conserved among different isolates and is therefore a candidate antigen for development of enzyme-linked immunosorbent assays (ELISAs) for serological studies or identification of PI animals. The aim of this study was to produce a monoclonal antibody (MAb) against recombinant Erns. Materials and Methods: For this purpose, recombinant maltose-binding protein (MBP)-Erns protein was expressed in Escherichia coli, purified using an amylose resin chromatography column, and used as an antigen in MAb production. Spleen cells of mice immunized with the recombinant antigen were fused with SP2/0 myeloma cells. Culture supernatants of primary clones of fused cells were then screened by indirect ELISA. After three rounds of cloning, the reactivity of the MAbs with recombinant and natural antigen was established by Western blotting. Results: A MAb against recombinant Erns was produced and reacted successfully with recombinant and natural antigens. Conclusions: Given the role of Erns in the identification of PI animals, it appears that the Erns recombinant antigen and the specific monoclonal antibodies produced against it may be suitable for developing BVDV laboratory diagnostic assays. PMID:26870309

  9. Study of components and statistical reaction mechanism in simulation of nuclear process for optimized production of {sup 64}Cu and {sup 67}Ga medical radioisotopes using TALYS, EMPIRE and LISE++ nuclear reaction and evaporation codes

    SciTech Connect

    Nasrabadi, M. N. Sepiani, M.

    2015-03-30

    Production of medical radioisotopes is one of the most important tasks in the field of nuclear technology. These radioactive isotopes are mainly produced through a variety of nuclear processes. In this research, excitation functions and nuclear reaction mechanisms are studied for the simulation of the production of these radioisotopes in the TALYS, EMPIRE and LISE++ reaction codes; the parameters and different models of nuclear level density, one of the most important components in statistical reaction models, are then adjusted for optimum production of the desired radioactive yields.

  10. Study of components and statistical reaction mechanism in simulation of nuclear process for optimized production of 64Cu and 67Ga medical radioisotopes using TALYS, EMPIRE and LISE++ nuclear reaction and evaporation codes

    NASA Astrophysics Data System (ADS)

    Nasrabadi, M. N.; Sepiani, M.

    2015-03-01

    Production of medical radioisotopes is one of the most important tasks in the field of nuclear technology. These radioactive isotopes are mainly produced through a variety of nuclear processes. In this research, excitation functions and nuclear reaction mechanisms are studied for the simulation of the production of these radioisotopes in the TALYS, EMPIRE and LISE++ reaction codes; the parameters and different models of nuclear level density, one of the most important components in statistical reaction models, are then adjusted for optimum production of the desired radioactive yields.

  11. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link. From a transmission point of view, digital transmission has therefore been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service-provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the

  12. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study of practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need to upgrade ERP systems. We further show how code query by example can be used as a form of lightweight static analysis to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: programmers working in this field are often not computer science specialists but rather domain experts. Hence, they require a simple language to express custom rules.

  13. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  14. During visual word recognition, phonology is accessed within 100 ms and may be mediated by a speech production code: evidence from magnetoencephalography.

    PubMed

    Wheat, Katherine L; Cornelissen, Piers L; Frost, Stephen J; Hansen, Peter C

    2010-04-14

    Debate surrounds the precise cortical location and timing of access to phonological information during visual word recognition. Therefore, using whole-head magnetoencephalography (MEG), we investigated the spatiotemporal pattern of brain responses induced by a masked pseudohomophone priming task. Twenty healthy adults read target words that were preceded by one of three kinds of nonword prime: pseudohomophones (e.g., brein-BRAIN), where four of five letters are shared between prime and target, and the pronunciation is the same; matched orthographic controls (e.g., broin-BRAIN), where the same four of five letters are shared between prime and target but pronunciation differs; and unrelated controls (e.g., lopus-BRAIN), where neither letters nor pronunciation are shared between prime and target. All three priming conditions induced activation in the pars opercularis of the left inferior frontal gyrus (IFGpo) and the left precentral gyrus (PCG) within 100 ms of target word onset. However, for the critical comparison that reveals a processing difference specific to phonology, we found that the induced pseudohomophone priming response was significantly stronger than the orthographic priming response in left IFG/PCG at approximately 100 ms. This spatiotemporal concurrence demonstrates early phonological influences during visual word recognition and is consistent with phonological access being mediated by a speech production code. PMID:20392945

  15. Astrophysics Source Code Library Enhancements

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.; Allen, A.; Berriman, G. B.; DuPrie, K.; Mink, J.; Nemiroff, R. J.; Schmidt, J.; Shamir, L.; Shortridge, K.; Taylor, M.; Teuben, P. J.; Wallin, J.

    2015-09-01

    The Astrophysics Source Code Library (ASCL)1 is a free online registry of codes used in astronomy research; it currently contains over 900 codes and is indexed by ADS. The ASCL has recently moved a new infrastructure into production. The new site provides a true database for the code entries and integrates the WordPress news and information pages and the discussion forum into one site. Previous capabilities are retained and permalinks to ascl.net continue to work. This improvement offers more functionality and flexibility than the previous site, is easier to maintain, and offers new possibilities for collaboration. This paper covers these recent changes to the ASCL.

  16. Characterization of Multiple Regions Involved in Replication and Mobilization of Plasmid pNZ4000 Coding for Exopolysaccharide Production in Lactococcus lactis

    PubMed Central

    van Kranenburg, Richard; de Vos, Willem M.

    1998-01-01

    We characterized the regions involved in replication and mobilization of the 40-kb plasmid pNZ4000, encoding exopolysaccharide (EPS) production in Lactococcus lactis NIZO B40. The plasmid contains four highly conserved replication regions with homologous rep genes (repB1, repB2, repB3, and repB4) that belong to the lactococcal theta replicon family. Subcloning of each replicon individually showed that all are functional and compatible in L. lactis. Plasmid pNZ4000 and genetically labeled derivatives could be transferred to different L. lactis strains by conjugation, and pNZ4000 was shown to be a mobilization plasmid. Two regions involved in mobilization were identified near two of the replicons; both included an oriT sequence rich in inverted repeats. Conjugative mobilization of the nonmobilizable plasmid pNZ124 was promoted by either one of these oriT sequences, demonstrating their functionality. One oriT sequence was followed by a mobA gene, coding for a trans-acting protein, which increased the frequency of conjugative transfer 100-fold. The predicted MobA protein and the oriT sequences show protein and nucleotide similarity, respectively, with the relaxase and with the inverted repeat and nic site of the oriT from the Escherichia coli plasmid R64. The presence on pNZ4000 of four functional replicons, two oriT sequences, and several insertion sequence-like elements strongly suggests that this EPS plasmid is a naturally occurring cointegrate. PMID:9765557

  17. A de novo transcriptome of the Malpighian tubules in non-blood-fed and blood-fed Asian tiger mosquitoes Aedes albopictus: insights into diuresis, detoxification, and blood meal processing

    PubMed Central

    Esquivel, Carlos J.; Cassone, Bryan J.

    2016-01-01

    Background. In adult female mosquitoes, the renal (Malpighian) tubules play an important role in the post-prandial diuresis, which removes excess ions and water from the hemolymph of mosquitoes following a blood meal. After the post-prandial diuresis, the roles that Malpighian tubules play in the processing of blood meals are not well described. Methods. We used a combination of next-generation sequencing (paired-end RNA sequencing) and physiological/biochemical assays in adult female Asian tiger mosquitoes (Aedes albopictus) to generate molecular and functional insights into the Malpighian tubules and how they may contribute to blood meal processing (3–24 h after blood ingestion). Results/Discussion. Using RNA sequencing, we sequenced and assembled the first de novo transcriptome of Malpighian tubules from non-blood-fed (NBF) and blood-fed (BF) mosquitoes. We identified a total of 8,232 non-redundant transcripts. The Malpighian tubules of NBF mosquitoes were characterized by the expression of transcripts associated with active transepithelial fluid secretion/diuresis (e.g., ion transporters, water channels, V-type H+-ATPase subunits), xenobiotic detoxification (e.g., cytochrome P450 monoxygenases, glutathione S-transferases, ATP-binding cassette transporters), and purine metabolism (e.g., xanthine dehydrogenase). We also detected the expression of transcripts encoding sodium calcium exchangers, G protein coupled-receptors, and septate junctional proteins not previously described in mosquito Malpighian tubules. Within 24 h after a blood meal, transcripts associated with active transepithelial fluid secretion/diuresis exhibited a general downregulation, whereas those associated with xenobiotic detoxification and purine catabolism exhibited a general upregulation, suggesting a reinvestment of the Malpighian tubules’ molecular resources from diuresis to detoxification. Physiological and biochemical assays were conducted in mosquitoes and isolated Malpighian

  18. H.R.3688: A bill to amend the Internal Revenue Code of 1986 to provide a tax credit for marginal oil and natural gas well production, introduced in the House of Representatives, One Hundred Fifth Congress, Second Session, April 1, 1998

    SciTech Connect

    1998-12-31

    This bill proposes a new section to be added to the Internal Revenue Code of 1986. The credit proposed is $3 per barrel of qualified crude oil production and 50 cents per 1,000 cubic feet of qualified natural gas production. In this case, qualified production means domestic crude oil or natural gas which is produced from a marginal well. Marginal production is defined within Internal Revenue Code Section 613A(c)(6).
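
    The proposed credit is simple arithmetic; a minimal sketch of the rates stated above (function and parameter names hypothetical, and ignoring any phase-outs or limitations the bill may contain):

```python
def marginal_well_credit(oil_barrels, gas_mcf):
    """Credit proposed by H.R.3688: $3 per barrel of qualified crude oil
    production plus $0.50 per 1,000 cubic feet (mcf) of qualified natural gas."""
    return 3.00 * oil_barrels + 0.50 * gas_mcf

# e.g., 100 qualified barrels and 200 mcf of qualified gas
credit = marginal_well_credit(oil_barrels=100, gas_mcf=200)
```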

  19. Codes with special correlation.

    NASA Technical Reports Server (NTRS)

    Baumert, L. D.

    1964-01-01

    Uniform binary codes with special correlation are considered, including transorthogonal and simplex codes, Hadamard matrices, and difference sets.

  20. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  1. Portable code development in C

    SciTech Connect

    Brown, S.A.

    1990-11-06

    With a new generation of high performance computers appearing around us on a time scale of months, a new challenge for developers of simulation codes is to write and maintain production codes that are both highly portable and maximally efficient. My contention is that C is the language best suited to that goal that is also widely available today. GLF is a new code written mainly in C which is intended to have all of the XRASER physics and run on any platform of interest. It demonstrates the power of the C paradigm for code developers, and flexibility and ease of use for the users. Three fundamental problems are discussed: the C/UNIX development environment; the supporting tools and libraries which handle data and graphics portability issues; and the advantages of C in numerical simulation code development.

  2. MELCOR computer code manuals

    SciTech Connect

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  3. Dominance and parent-of-origin effects of coding and non-coding alleles at the acylCoA-diacylglycerol-acyltransferase (DGAT1) gene on milk production traits in German Holstein cows

    PubMed Central

    Kuehn, Christa; Edel, Christian; Weikard, Rosemarie; Thaller, Georg

    2007-01-01

    Background Substantial gene substitution effects on milk production traits have formerly been reported for alleles at the K232A and the promoter VNTR loci in the bovine acylCoA-diacylglycerol-acyltransferase 1 (DGAT1) gene by using data sets including sires with accumulated phenotypic observations of daughters (breeding values, daughter yield deviations). However, these data sets prevented analyses with respect to dominance or parent-of-origin effects, although an increasing number of reports in the literature outlined the relevance of non-additive gene effects on quantitative traits. Results Based on a data set comprising German Holstein cows with direct trait measurements, we first confirmed the previously reported association of DGAT1 promoter VNTR alleles with milk production traits. We detected a dominant mode of effects for the DGAT1 K232A and promoter VNTR alleles. Namely, the contrasts between the effects of heterozygous individuals at the DGAT1 loci differed significantly from the midpoint between the effects for the two homozygous genotypes for several milk production traits, thus indicating the presence of dominance. Furthermore, we identified differences in the magnitude of effects between paternally and maternally inherited DGAT1 promoter VNTR – K232A haplotypes indicating parent-of-origin effects on milk production traits. Conclusion Non-additive effects like those identified at the bovine DGAT1 locus have to be accounted for in more specific QTL detection models as well as in marker assisted selection schemes. The DGAT1 alleles in cattle will be a useful model for further investigations on the biological background of non-additive effects in mammals due to the magnitude and consistency of their effects on milk production traits. PMID:17892573

  4. Homological stabilizer codes

    SciTech Connect

    Anderson, Jonas T.

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. Highlights: • We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. • We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. • We find and classify all 2D homological stabilizer codes. • We find optimal codes among the homological stabilizer codes.

  5. CODE's contribution to the IGS MGEX

    NASA Astrophysics Data System (ADS)

    Prange, Lars; Dach, Rolf; Lutz, Simon; Schaer, Stefan; Jäggi, Adrian

    2014-05-01

    The Center for Orbit Determination in Europe (CODE) has been contributing as a global analysis center to the International GNSS Service (IGS) for many years. The processing of GPS and GLONASS data is well established in CODE's ultra-rapid, rapid, and final product lines. Since 2012, CODE has contributed to the "Multi GNSS EXperiment" (MGEX), launched by the IGS as a testbed for the incorporation of new GNSS and their signals into the existing IGS processing chains and software packages. So far, the focus of CODE's MGEX activities has been on Galileo. Comparisons with other groups' results confirmed the quality of CODE's Galileo orbit (based on a 3-day long-arc solution) and clock products. The MGEX processing at CODE is currently being extended to the BeiDou system, which will result in a fully consistent quadruple-system solution including GPS, GLONASS, Galileo, and BeiDou. We present the latest status of the CODE MGEX processing. The quality of the orbit and clock solutions will be evaluated. The characteristics and the impact of the contributing GNSS on the products will be assessed. The CODE MGEX orbit and clock products are publicly available in the IGS MGEX products directory at the CDDIS data center: ftp://cddis.gsfc.nasa.gov/gnss/products/mgex (the solution ID "com" stands for CODE-MGEX).

  6. Coding of Neuroinfectious Diseases.

    PubMed

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue. PMID:26633789

  7. Model Children's Code.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  8. Food Product Dating

    MedlinePlus

    ... Formula What do can codes mean? Dates on Egg Cartons UPC or Bar Codes Storage Times Refrigerator ... primarily on perishable foods such as meat, poultry, eggs and dairy products. "Closed" or "coded" dating might ...

  9. A novel coding-region RNA element modulates infectious dengue virus particle production in both mammalian and mosquito cells and regulates viral replication in Aedes aegypti mosquitoes

    PubMed Central

    Groat-Carmona, Anna Maria; Orozco, Susana; Friebe, Peter; Payne, Anne; Kramer, Laura; Harris, Eva

    2013-01-01

    Dengue virus (DENV) is an enveloped flavivirus with a positive-sense RNA genome transmitted by Aedes mosquitoes, causing the most important arthropod-borne viral disease affecting humans. Relatively few cis-acting RNA regulatory elements have been described in the DENV coding-region. Here, by introducing silent mutations into a DENV-2 infectious clone, we identify the conserved capsid-coding region 1 (CCR1), an RNA sequence element that regulates viral replication in mammalian cells and to a greater extent in Ae. albopictus mosquito cells. These defects were confirmed in vivo, resulting in decreased replication in Ae. aegypti mosquito bodies and dissemination to the salivary glands. Furthermore, CCR1 does not regulate translation, RNA synthesis or virion retention but likely modulates assembly, as mutations resulted in the release of non-infectious viral particles from both cell types. Understanding the role of CCR1 could help characterize the poorly-defined stage of assembly in the DENV life cycle and uncover novel anti-viral targets. PMID:22840606

  10. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate codes' (ARA). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, thus belief propagation can be used for iterative decoding of ARA codes on a graph. The encoder for this class can be viewed as a precoded Repeat Accumulate (RA) code or as a precoded Irregular Repeat Accumulate (IRA) code, where an accumulator is simply chosen as the precoder. Thus ARA codes have a simple and very fast encoder structure when represented as LDPC codes. Based on density evolution for LDPC codes, we show through examples of ARA codes that for a maximum variable node degree of 5, a minimum bit SNR as low as 0.08 dB from channel capacity for rate 1/2 can be achieved as the block size goes to infinity. Thus, for a fixed low maximum variable node degree, the ARA threshold outperforms not only the RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators, any desired high-rate codes close to code rate 1 can be obtained with thresholds that stay uniformly close to the channel capacity thresholds. Iterative decoding simulation results are provided. The ARA codes also have a projected graph or protograph representation that allows for high-speed decoder implementation.
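The encoder chain described above (an accumulator used as a precoder, followed by repetition, interleaving, and a final accumulator) can be sketched at the bit level; this is an illustrative toy with a random interleaver and hypothetical function names, not the specific ARA constructions studied in the paper:

```python
import random

def accumulate(bits):
    """Accumulator: running XOR, out[i] = bits[0] ^ ... ^ bits[i]."""
    out, acc = [], 0
    for b in bits:
        acc ^= b
        out.append(acc)
    return out

def ara_encode(info, q=3, seed=0):
    """Toy rate-1/q Accumulate-Repeat-Accumulate encoder:
    accumulator precoder -> q-fold repetition -> interleaver -> accumulator."""
    pre = accumulate(info)                        # precoder (accumulator)
    rep = [b for b in pre for _ in range(q)]      # repeat each bit q times
    perm = list(range(len(rep)))
    random.Random(seed).shuffle(perm)             # fixed random interleaver
    return accumulate([rep[i] for i in perm])     # final accumulator

codeword = ara_encode([1, 0, 1, 1], q=3)
```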

  11. Concatenated Coding Using Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Thompson, Michael W.

    1997-01-01

    In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted toward developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and RS coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, TCM-based concatenated coding results in roughly 10-50% bandwidth expansion, compared to 70-150% expansion for similar concatenated schemes that use convolutional codes. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.

  12. Coset Codes Viewed as Terminated Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1996-01-01

    In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C(sub 1), of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C(sub 1), while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

  13. The Proteomic Code: a molecular recognition code for proteins

    PubMed Central

    Biro, Jan C

    2007-01-01

    Background The Proteomic Code is a set of rules by which information in genetic material is transferred into the physico-chemical properties of amino acids. It determines how individual amino acids interact with each other during folding and in specific protein-protein interactions. The Proteomic Code is part of the redundant Genetic Code. Review The 25-year-old history of this concept is reviewed from the first independent suggestions by Biro and Mekler, through the works of Blalock, Root-Bernstein, Siemion, Miller and others, followed by the discovery of a Common Periodic Table of Codons and Nucleic Acids in 2003 and culminating in the recent conceptualization of partial complementary coding of interacting amino acids as well as the theory of the nucleic acid-assisted protein folding. Methods and conclusions A novel cloning method for the design and production of specific, high-affinity-reacting proteins (SHARP) is presented. This method is based on the concept of proteomic codes and is suitable for large-scale, industrial production of specifically interacting peptides. PMID:17999762

  14. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts showing the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  15. Manually operated coded switch

    DOEpatents

    Barnette, Jon H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  16. Binary primitive alternant codes

    NASA Technical Reports Server (NTRS)

    Helgert, H. J.

    1975-01-01

    In this note we investigate the properties of two classes of binary primitive alternant codes that are generalizations of the primitive BCH codes. For these codes we establish certain equivalence and invariance relations and obtain values of d and d*, the minimum distances of the prime and dual codes.

  17. Algebraic geometric codes

    NASA Technical Reports Server (NTRS)

    Shahshahani, M.

    1991-01-01

    The performance characteristics are discussed of certain algebraic geometric codes. Algebraic geometric codes have good minimum distance properties. On many channels they outperform other comparable block codes; therefore, one would expect them eventually to replace some of the block codes used in communications systems. It is suggested that it is unlikely that they will become useful substitutes for the Reed-Solomon codes used by the Deep Space Network in the near future. However, they may be applicable to systems where the signal to noise ratio is sufficiently high so that block codes would be more suitable than convolutional or concatenated codes.

  18. Serial-Turbo-Trellis-Coded Modulation with Rate-1 Inner Code

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Sam; Pollara, Fabrizio

    2004-01-01

    Serially concatenated turbo codes have been proposed to satisfy requirements for low bit- and word-error rates and for low (in comparison with related previous codes) complexity of coding and decoding algorithms and thus low complexity of coding and decoding circuitry. These codes are applicable to such high-level modulations as octonary phase-shift keying (8PSK) and 16-state quadrature amplitude modulation (16QAM); the signal product obtained by applying one of these codes to one of these modulations is denoted, generally, as serially concatenated trellis-coded modulation (SCTCM). These codes could be particularly beneficial for communication systems that must be designed and operated subject to limitations on bandwidth and power. Some background information is prerequisite to a meaningful summary of this development. Trellis-coded modulation (TCM) is now a well-established technique in digital communications. A turbo code combines binary component codes (which typically include trellis codes) with interleaving. A turbo code of the type that has been studied prior to this development is composed of parallel concatenated convolutional codes (PCCCs) implemented by two or more constituent systematic encoders joined through one or more interleavers. The input information bits feed the first encoder and, after having been scrambled by the interleaver, enter the second encoder. A code word of a parallel concatenated code consists of the input bits to the first encoder followed by the parity check bits of both encoders. The suboptimal iterative decoding structure for such a code is modular, and consists of a set of concatenated decoding modules, one for each constituent code, connected through an interleaver identical to the one on the encoder side. Each decoder performs weighted soft decoding of the input sequence. PCCCs yield very large coding gains at the cost of a reduction in the data rate and/or an increase in bandwidth.
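The parallel concatenation described above (systematic bits followed by the parity bits of two constituent encoders, the second fed through an interleaver) can be sketched using the simplest possible recursive constituent, an accumulator, as a stand-in for a real recursive systematic convolutional encoder; the names and the tiny block length are illustrative only:

```python
def rsc_parity(bits):
    """Parity stream of a toy recursive systematic encoder with
    feedback 1/(1+D): each parity bit is the running XOR of the inputs."""
    out, state = [], 0
    for b in bits:
        state ^= b
        out.append(state)
    return out

def pccc_encode(info, perm):
    """Toy rate-1/3 PCCC: systematic bits, parity of encoder 1,
    and parity of encoder 2 fed with the interleaved information bits."""
    p1 = rsc_parity(info)
    p2 = rsc_parity([info[i] for i in perm])  # perm is the interleaver
    return info + p1 + p2

codeword = pccc_encode([1, 0, 1], perm=[2, 0, 1])
```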

  19. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA codes). Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.

  20. The Integrated TIGER Series Codes

    Energy Science and Technology Software Center (ESTSC)

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  1. The Integrated TIGER Series Codes

    SciTech Connect

    Kensek, Ronald P.; Franke, Brian C.; Laub, Thomas W.

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  2. Drug data coding and analysis in epidemiologic studies.

    PubMed

    Pahor, M; Chrischilles, E A; Guralnik, J M; Brown, S L; Wallace, R B; Carbonin, P

    1994-08-01

    In epidemiologic studies that collect comprehensive information on medication use, the complexity of dealing with a large number of trade and generic names may limit the utilization of these databases. This paper shows the specific advantage of using two coding systems, one to maximize efficiency of data entry, and the other to facilitate analysis by organizing the drug ingredients into hierarchical categories. The approach used by two large surveys, one in the USA and one in Italy, is described: the Established Populations for Epidemiologic Studies of the Elderly (EPESE) and the 'Gruppo Italiano di Farmacovigilanza nell' Anziano' (GIFA). To enter the medications into a computerized database, codes matching the drug product names are needed. In the EPESE, the prescription and over-the-counter drug products are coded with the Drug Products Information Coding System (DPICS) and the Iowa Nonprescription Drug Products Information Coding System (INDPICS), respectively. The GIFA study uses the coding system of the Italian Ministry of Health (MINSAN), with a unique numeric code for each drug product available in Italy. To simplify the analytical process, the drug entry codes are converted into hierarchical coding systems with unique codes for specific drug ingredients, chemical and therapeutic categories. The EPESE and GIFA drug data are coded with the Iowa Drug Information System (IDIS) ingredient codes and the Anatomical Therapeutic Chemical (ATC) codes, respectively. Examples are provided that show coding of diuretics in these two studies and demonstrate the analytic advantages of these systems. PMID:7843344
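The two-stage approach described above (a product-name entry code mapped to an ingredient, whose hierarchical code then supports category-level analysis) can be sketched with ATC-style prefix matching; the product-name strings and lookup tables here are hypothetical stand-ins for the DPICS/MINSAN and ATC systems:

```python
# Hypothetical data-entry mappings (product name -> ingredient).
product_to_ingredient = {
    "LASIX 40MG TAB": "furosemide",
    "THIAZIDE 25MG TAB": "hydrochlorothiazide",
}

# ATC codes are hierarchical: C03 = diuretics, C03C = high-ceiling, etc.
ingredient_to_atc = {
    "furosemide": "C03CA01",
    "hydrochlorothiazide": "C03AA03",
}

def in_category(product: str, atc_prefix: str) -> bool:
    """A product belongs to a therapeutic category if its ingredient's
    ATC code starts with the category's code prefix."""
    code = ingredient_to_atc[product_to_ingredient[product]]
    return code.startswith(atc_prefix)

# Category-level analysis: select all diuretics (ATC prefix "C03").
diuretics = [p for p in product_to_ingredient if in_category(p, "C03")]
```

The analytic advantage is that one prefix test replaces enumerating every trade name in a drug class.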

  3. DELightcurveSimulation: Light curve simulation code

    NASA Astrophysics Data System (ADS)

    Connolly, Samuel D.

    2016-02-01

    DELightcurveSimulation simulates light curves with any given power spectral density and any probability density function, following the algorithm described in Emmanoulopoulos et al. (2013). The simulated products have exactly the same variability and statistical properties as the observed light curves. The code is a Python implementation of the Mathematica code provided by Emmanoulopoulos et al.
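The general idea of simulating a light curve with a prescribed power spectral density can be sketched with the classic Timmer & König step that the Emmanoulopoulos et al. (2013) algorithm builds on; the full algorithm additionally iterates to impose the target probability density function. The function name and example PSD below are illustrative, not the package's API:

```python
import numpy as np

def simulate_lightcurve_tk(psd, n, dt=1.0, rng=None):
    """Timmer & Koenig (1995)-style simulation: draw Fourier amplitudes
    from the target PSD with Gaussian scatter, then inverse-FFT."""
    rng = np.random.default_rng(rng)
    freqs = np.fft.rfftfreq(n, dt)[1:]            # skip the zero frequency
    amp = np.sqrt(psd(freqs) / 2.0)
    re = rng.standard_normal(freqs.size) * amp
    im = rng.standard_normal(freqs.size) * amp
    spec = np.concatenate(([0.0], re + 1j * im))
    if n % 2 == 0:
        spec[-1] = spec[-1].real                  # Nyquist bin must be real
    return np.fft.irfft(spec, n=n)

# Red-noise light curve with a power-law PSD, P(f) ~ f^-2:
lc = simulate_lightcurve_tk(lambda f: f ** -2.0, 1024)
```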

  4. Asymmetric quantum convolutional codes

    NASA Astrophysics Data System (ADS)

    La Guardia, Giuliano G.

    2016-01-01

    In this paper, we construct the first families of asymmetric quantum convolutional codes (AQCCs). These new AQCCs are constructed by means of the CSS-type construction applied to suitable families of classical convolutional codes, which are also constructed here. The new codes have non-catastrophic generator matrices, and they have great asymmetry. Since our constructions are performed algebraically, i.e. we develop general algebraic methods and properties to perform the constructions, it is possible to derive several families of such codes and not only codes with specific parameters. Additionally, several different types of such codes are obtained.

  5. The trellis complexity of convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Lin, W.

    1995-01-01

    It has long been known that convolutional codes have a natural, regular trellis structure that facilitates the implementation of Viterbi's algorithm. It has gradually become apparent that linear block codes also have a natural, though not in general a regular, 'minimal' trellis structure, which allows them to be decoded with a Viterbi-like algorithm. In both cases, the complexity of the Viterbi decoding algorithm can be accurately estimated by the number of trellis edges per encoded bit. It would, therefore, appear that we are in a good position to make a fair comparison of the Viterbi decoding complexity of block and convolutional codes. Unfortunately, however, this comparison is somewhat muddled by the fact that some convolutional codes, the punctured convolutional codes, are known to have trellis representations that are significantly less complex than the conventional trellis. In other words, the conventional trellis representation for a convolutional code may not be the minimal trellis representation. Thus, ironically, at present we seem to know more about the minimal trellis representation for block than for convolutional codes. In this article, we provide a remedy, by developing a theory of minimal trellises for convolutional codes. (A similar theory has recently been given by Sidorenko and Zyablov). This allows us to make a direct performance-complexity comparison for block and convolutional codes. A by-product of our work is an algorithm for choosing, from among all generator matrices for a given convolutional code, what we call a trellis-minimal generator matrix, from which the minimal trellis for the code can be directly constructed. Another by-product is that, in the new theory, punctured convolutional codes no longer appear as a special class, but simply as high-rate convolutional codes whose trellis complexity is unexpectedly small.
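The complexity measure used above, trellis edges per encoded bit, is straightforward to compute for the conventional trellis of a convolutional code; a minimal sketch (for punctured codes and minimal trellises the count can be smaller, as the article notes):

```python
def conv_trellis_edges_per_bit(memory: int, k: int = 1, n: int = 2) -> float:
    """Edges per encoded output bit in the conventional trellis of an
    (n, k) convolutional code with total encoder memory `memory`:
    2**memory states, 2**k outgoing edges per state, n output bits
    per trellis section."""
    states = 2 ** memory
    edges_per_section = states * 2 ** k
    return edges_per_section / n

# Rate-1/2 code with memory 6 (e.g. a 64-state encoder):
print(conv_trellis_edges_per_bit(6))  # prints 64.0
```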

  6. Development of recombinant Escherichia coli whole-cell biocatalyst expressing a novel alkaline lipase-coding gene from Proteus sp. for biodiesel production.

    PubMed

    Gao, Bei; Su, Erzheng; Lin, Jinping; Jiang, Zhengbing; Ma, Yushu; Wei, Dongzhi

    2009-01-15

    A lipase-producing bacterium, K107, was isolated from soil samples collected in China and identified as a strain of Proteus sp. Using a genome-walking method, the open reading frame of the lipase gene lipK107, encoding 287 amino acids, was cloned and expressed in a heterologous host, Escherichia coli BL21 (DE3). The recombinant lipase was purified and characterized; the purified LipK107 showed optimal activity at pH 9 and 35 degrees C. The recombinant E. coli expressing lipK107 was applied to biodiesel production as a whole-cell biocatalyst. The activity of the biocatalyst increased significantly when cells were permeabilized with 0.3% (w/v) cetyltrimethylammonium bromide (CTAB). The transesterification proceeded efficiently in a mixture containing 5 molar equivalents of methanol relative to the oil and 100% water by weight of the substrate. This is the first report of an E. coli whole-cell biocatalyst expressing a lipase being used for biodiesel production; the biodiesel yield reached nearly 100% after a 12-h reaction at the optimal temperature of 15 degrees C, the lowest temperature reported among known catalysts for biodiesel production. PMID:19007827

  7. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-01-01

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  8. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  9. Multiple Turbo Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Pollara, F.

    1995-01-01

    A description is given of multiple turbo codes and a suitable decoder structure derived from an approximation to the maximum a posteriori probability (MAP) decision rule, which is substantially different from the decoder for two-code-based encoders.

  10. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  11. STEEP32 computer code

    NASA Technical Reports Server (NTRS)

    Goerke, W. S.

    1972-01-01

    A manual is presented as an aid in using the STEEP32 code. The code is the EXEC 8 version of the STEEP code (STEEP is an acronym for shock two-dimensional Eulerian elastic plastic). The major steps in a STEEP32 run are illustrated in a sample problem. There is a detailed discussion of the internal organization of the code, including a description of each subroutine.

  12. Itaconic acid production from glycerol using Escherichia coli harboring a random synonymous codon-substituted 5'-coding region variant of the cadA gene.

    PubMed

    Jeon, Ho-Geun; Cheong, Dae-Eun; Han, Yunjon; Song, Jae Jun; Choi, Jong Hyun

    2016-07-01

    Aspergillus terreus cadA, encoding cis-aconitate decarboxylase, is an essential gene for itaconic acid (IA) biosynthesis, but it is primarily expressed as insoluble aggregates in most industrial hosts, which has been a hurdle for the development of recombinant strategies for IA production. Here, we created a library of synonymous codon variants (scv) of the cadA gene, randomizing the first 10 codons (except ATG) among synonymous codons, and screened it in Escherichia coli. Among positive clones, E. coli scvCadA_No8 showed more than 95% of expressed CadA in the soluble fraction and, in production runs, produced threefold more IA than wild-type E. coli in Luria-Bertani broth supplemented with 0.5% glucose. In M9 minimal media containing 0.85 g/L citrate and 1% glycerol, E. coli scvCadA_No8 produced 985.6 ± 33.4 mg/L IA during a 72-h culture after induction with isopropyl β-D-1-thiogalactopyranoside. In a 2-L fed-batch fermentation consisting of two stages (growth and nitrogen-limitation conditions), we obtained 7.2 g/L IA using an E. coli strain carrying only the scv_cadA gene, by optimizing culture conditions for IA production. These results could be combined with metabolic engineering to generate an E. coli strain suitable as an industrial IA producer. Biotechnol. Bioeng. 2016;113: 1504-1510. © 2015 Wiley Periodicals, Inc. PMID:26704570
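The library construction described above, randomizing only synonymous choices in the first 10 codons after the start codon, can be illustrated with a small sketch. The sequence, the excerpted codon table, and the function name below are illustrative inventions, not the real cadA 5'-coding region or the authors' procedure:

```python
import random

# Excerpt of the standard codon table; only codons used in this toy
# sequence are listed. The sequence itself is invented, not real cadA.
SYNONYMS = {
    "ATG": ["ATG"],                                     # Met: start codon
    "CTG": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],  # Leu
    "AAA": ["AAA", "AAG"],                              # Lys
    "GAA": ["GAA", "GAG"],                              # Glu
    "GGC": ["GGT", "GGC", "GGA", "GGG"],                # Gly
    "TTT": ["TTT", "TTC"],                              # Phe
}

def synonymous_variant(seq, n_codons=10, rng=random):
    """Randomize the first n_codons codons after ATG, preserving the protein."""
    codons = [seq[i:i + 3] for i in range(0, len(seq), 3)]
    out = [codons[0]]                      # keep the ATG start codon
    for codon in codons[1:1 + n_codons]:
        out.append(rng.choice(SYNONYMS.get(codon, [codon])))
    out.extend(codons[1 + n_codons:])      # downstream codons untouched
    return "".join(out)

seq = "ATG" + "CTG" * 10 + "AAA"           # invented 5'-coding region
variant = synonymous_variant(seq, rng=random.Random(0))
```

Each call yields one library member; the encoded protein is unchanged because every substitution stays within a synonym class.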

  13. Color code identification in coded structured light.

    PubMed

    Zhang, Xu; Li, Youfu; Zhu, Limin

    2012-08-01

    Color code is widely employed in coded structured light to reconstruct the three-dimensional shape of objects. Before determining the correspondence, a very important step is to identify the color code. Until now, the lack of an effective evaluation standard has hindered progress in this unsupervised classification task. In this paper, we propose a benchmark-based framework to address this gap. Two basic facets of color code identification are discussed: color feature selection and clustering algorithm design. First, we use analysis methods to evaluate the performance of different color features, and a ranking of these color features by discriminating power is established from a large number of experiments. Second, in order to overcome the drawbacks of K-means, a decision-directed method is introduced to find the initial centroids. Quantitative comparisons affirm that our method is robust with high accuracy, and it can find or closely approach the global peak. PMID:22859022
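The clustering step can be sketched in a few lines. The paper's decision-directed centroid initialization is not specified in the abstract, so this sketch substitutes a deterministic farthest-point seeding as a stand-in that likewise avoids K-means' sensitivity to random initial centroids; the sample colors are toy values:

```python
# K-means clustering of color samples into k color-code classes.
# The paper's decision-directed initialization is not reproduced here;
# a deterministic farthest-point seeding stands in for it.

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def seed_centroids(points, k):
    """Start from the lexicographically smallest point, then repeatedly
    add the point farthest from all seeds chosen so far."""
    centroids = [min(points)]
    while len(centroids) < k:
        far = max(points, key=lambda p: min(dist2(p, c) for c in centroids))
        centroids.append(far)
    return centroids

def kmeans(points, k, iters=50):
    centroids = seed_centroids(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                   # assignment step
            i = min(range(k), key=lambda j: dist2(p, centroids[j]))
            clusters[i].append(p)
        centroids = [                      # update step (keep empty as-is)
            tuple(sum(v) / len(cl) for v in zip(*cl)) if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# RGB samples near three projected code colors (toy data)
samples = [(250, 10, 10), (255, 0, 5), (245, 5, 0),      # reds
           (10, 250, 10), (0, 255, 5), (5, 245, 0),      # greens
           (10, 10, 250), (0, 5, 255), (5, 0, 245)]      # blues
centroids, clusters = kmeans(samples, 3)
```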

  14. Lycopene production in recombinant strains of Escherichia coli is improved by knockout of the central carbon metabolism gene coding for glucose-6-phosphate dehydrogenase.

    PubMed

    Zhou, Yan; Nambou, Komi; Wei, Liujing; Cao, Jingjing; Imanaka, Tadayuki; Hua, Qiang

    2013-12-01

    Genetic manipulation was undertaken in order to understand the mechanism involved in the heterologous synthesis of lycopene in Escherichia coli. Knockout of the central carbon metabolic gene zwf (glucose-6-phosphate dehydrogenase) enhanced lycopene production (to above 130% of the control). The amplification and overexpression of rate-limiting steps encoded by the idi (isopentenyl diphosphate isomerase), dxs (1-deoxyxylulose-5-phosphate synthase) and ispDF (4-diphosphocytidyl-2C-methyl-D-erythritol synthase and 2C-methyl-D-erythritol 2,4-cyclodiphosphate synthase) genes improved lycopene synthesis from 0.89 to 5.39 mg g(-1) DCW. Combining the central metabolic gene knockout with the amplification of MEP pathway genes yielded the highest lycopene content (6.85-7.55 mg g(-1) DCW). Transcript profiling revealed that idi and dxs were up-regulated in the zwf knock-out strain, providing a plausible explanation for the increase in lycopene yield observed in this strain. An increase in precursor availability might also have contributed to the improved lycopene production. PMID:24062132

  15. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  16. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  17. DLLExternalCode

    SciTech Connect

    Greg Flach, Frank Smith

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
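The write-inputs/run/read-outputs cycle the DLL performs can be sketched in Python rather than as a C DLL. The one-value-per-line file format and the command-line convention (input path, then output path) below are invented placeholders, not GoldSim's actual interface:

```python
import os
import subprocess
import tempfile

# Sketch of the DLLExternalCode pattern: write the caller's inputs to an
# input file, run the external application, and read its outputs back.
# File format and argument convention are illustrative assumptions.
def run_external(inputs, command):
    with tempfile.TemporaryDirectory() as workdir:
        in_path = os.path.join(workdir, "model.in")
        out_path = os.path.join(workdir, "model.out")
        with open(in_path, "w") as f:              # 1. create the input file
            f.writelines(f"{v}\n" for v in inputs)
        subprocess.run(command + [in_path, out_path], check=True)  # 2. run
        with open(out_path) as f:                  # 3. read the results back
            return [float(line) for line in f]
```

In this pattern, "plugging in" a new external code amounts to describing its input file format and output location, mirroring the instructions file the DLL interprets.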

  18. DLLExternalCode

    Energy Science and Technology Software Center (ESTSC)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.

  19. Temporal characterization of protein production levels from baculovirus vectors coding for GFP and RFP genes under non-conventional promoter control.

    PubMed

    George, Steve; Jauhar, Altamash M; Mackenzie, Jennifer; Kießlich, Sascha; Aucoin, Marc G

    2015-09-01

    The ease of use and versatility of the Baculovirus Expression Vector System (BEVS) has made it one of the most widely used systems for recombinant protein production. However, co-expression systems currently in use mainly rely on the very strong very late p10 and polyhedrin (polh) promoters to drive expression of foreign genes, which does not provide much scope for tailoring expression ratios within the cell. This work demonstrates the use of different Autographa californica multicapsid nucleopolyhedrovirus (AcMNPV) promoters to control the timing and expression of two easily traceable fluorescent proteins, the enhanced green fluorescent protein (eGFP) and a red fluorescent protein (DsRed2), in a BEVS co-expression system. Our results show that gene expression levels can easily be controlled using this strategy, and also that modulating the expression level of one protein can influence the expression level of the other protein within the system, confirming the concept of genes "competing" for limited cellular resources. Plots of "expression ratios" of the two model genes over time were obtained and may be used in future work to tightly control the timing and levels of foreign gene expression in an insect cell co-expression system. PMID:25850946

  20. Cloning of genes coding for L-sorbose and L-sorbosone dehydrogenases from Gluconobacter oxydans and microbial production of 2-keto-L-gulonate, a precursor of L-ascorbic acid, in a recombinant G. oxydans strain.

    PubMed Central

    Saito, Y; Ishii, Y; Hayashi, H; Imao, Y; Akashi, T; Yoshikawa, K; Noguchi, Y; Soeda, S; Yoshida, M; Niwa, M; Hosoda, J; Shimomura, K

    1997-01-01

    We have purified L-sorbose dehydrogenase (SDH) and L-sorbosone dehydrogenase (SNDH) from Gluconobacter oxydans T-100, a strain that showed an ability to convert D-sorbitol to 2-keto-L-gulonate (2-KLGA). A genomic library of G. oxydans T-100 was screened with a probe, a 180-bp PCR product obtained using degenerate oligodeoxyribonucleotides based on the elucidated sequence of the purified SDH as primers and the genomic DNA of G. oxydans T-100 as a template. Sequencing of the DNA from a probe-positive clone indicated that the SNDH and the SDH are encoded by sequential open reading frames of 1,497 and 1,599 nucleotides, respectively, which was confirmed by expression of the DNA in Escherichia coli showing both enzymatic activities. The DNA was introduced into a shuttle vector, prepared from a plasmid of G. oxydans T-100 and pHSG298, to obtain an expression vector designated pSDH155. The production of 2-KLGA by pSDH155 in G. oxydans G624, an L-sorbose-accumulating strain, was improved to 230% of that of G. oxydans T-100. Chemical mutagenesis of the host strain to suppress the L-idonate pathway and replacement of the original promoter with that of E. coli tufB further improved the production of 2-KLGA. Consequently, high-level production of 2-KLGA (130 mg/ml) from D-sorbitol was achieved by simple fermentation of the recombinant Gluconobacter. PMID:9023923

  1. MGEX data analysis at CODE - current status

    NASA Astrophysics Data System (ADS)

    Prange, Lars; Dach, Rolf; Lutz, Simon; Schaer, Stefan; Jäggi, Adrian

    2013-04-01

    The Center for Orbit Determination in Europe (CODE) has been contributing as an analysis center to the International GNSS Service (IGS) for many years. The processing of GPS and GLONASS data is well established in CODE's ultra-rapid, rapid, and final product lines. In 2012 the IGS started its "Multi GNSS EXperiment" (MGEX). As of the end of 2012, about 50 new or upgraded MGEX tracking stations offer their data to the user community while meeting the IGS standards (e.g., correct equipment information, calibrated antennas, RINEX data format). MGEX supports the RINEX3 data format, new signal types for the established GNSS (e.g., L5 for GPS), and new GNSS, such as Galileo, Compass, and QZSS. It is therefore well suited as a testbed for future developments in GNSS processing. CODE supports MGEX by providing a three-system orbit solution (GPS+GLONASS+Galileo) on a non-operational basis. The CODE MGEX products are freely available at ftp://cddis.gsfc.nasa.gov/gnss/products/mgex (solution ID "com" stands for CODE-MGEX). The current status of the MGEX processing at CODE will be presented, focusing on the consistency of GNSS-derived results based on different frequencies/signals. An outlook on CODE's future multi-GNSS activities will be given.

  2. Adaptive entropy coded subband coding of images.

    PubMed

    Kim, Y H; Modestino, J W

    1992-01-01

    The authors describe a design approach, called 2-D entropy-constrained subband coding (ECSBC), based upon recently developed 2-D entropy-constrained vector quantization (ECVQ) schemes. The output indexes of the embedded quantizers are further compressed by use of noiseless entropy coding schemes, such as Huffman or arithmetic codes, resulting in variable-rate outputs. Depending upon the specific configurations of the ECVQ and the ECPVQ over the subbands, many different types of SBC schemes can be derived within the generic 2-D ECSBC framework. Among these, the authors concentrate on three representative types of 2-D ECSBC schemes and provide relative performance evaluations. They also describe an adaptive buffer instrumented version of 2-D ECSBC, called 2-D ECSBC/AEC, for use with fixed-rate channels which completely eliminates buffer overflow/underflow problems. This adaptive scheme achieves performance quite close to the corresponding ideal 2-D ECSBC system. PMID:18296138
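The noiseless entropy-coding step applied to the quantizer output indexes can be illustrated with a generic Huffman coder; this is a plain textbook construction, not the paper's entropy-constrained design, and the index stream below is a toy example:

```python
import heapq
from collections import Counter

# Generic Huffman coder over quantizer output indexes: frequent indexes
# receive short codewords, yielding a variable-rate output stream.
def huffman_code(symbols):
    freq = Counter(symbols)
    if len(freq) == 1:                       # degenerate one-symbol alphabet
        return {next(iter(freq)): "0"}
    # heap entries: (count, tie-breaker, {symbol: codeword-so-far})
    heap = [(n, i, {s: ""}) for i, (s, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        n1, _, left = heapq.heappop(heap)    # merge the two rarest subtrees
        n2, _, right = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in left.items()}
        merged.update({s: "1" + w for s, w in right.items()})
        heapq.heappush(heap, (n1 + n2, tick, merged))
        tick += 1
    return heap[0][2]

indexes = [0, 0, 0, 0, 1, 1, 2, 3]           # toy quantizer output stream
code = huffman_code(indexes)
bitstream = "".join(code[i] for i in indexes)
```

For this toy stream the Huffman code spends 14 bits where a fixed 2-bit index code would spend 16, illustrating the variable-rate saving the abstract describes.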

  3. FLOWTRAN-TF v1.2 source code

    SciTech Connect

    Aleman, S.E.; Cooper, R.E.; Flach, G.P.; Hamm, L.L.; Lee, S.; Smith, F.G. III

    1993-02-01

    The FLOWTRAN-TF code development effort was initiated in early 1989 as a code to monitor production reactor cooling systems at the Savannah River Plant. This report is a documentation of the various codes that make up FLOWTRAN-TF.

  4. FLOWTRAN-TF v1.2 source code

    SciTech Connect

    Aleman, S.E.; Cooper, R.E.; Flach, G.P.; Hamm, L.L.; Lee, S.; Smith, F.G. III.

    1993-02-01

    The FLOWTRAN-TF code development effort was initiated in early 1989 as a code to monitor production reactor cooling systems at the Savannah River Plant. This report is a documentation of the various codes that make up FLOWTRAN-TF.

  5. 21 CFR 201.25 - Bar code label requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... drug product from the bar code label requirements set forth in this section. The exemption request must... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Bar code label requirements. 201.25 Section 201.25...: GENERAL LABELING General Labeling Provisions § 201.25 Bar code label requirements. (a) Who is subject...

  6. 21 CFR 201.25 - Bar code label requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... drug product from the bar code label requirements set forth in this section. The exemption request must... 21 Food and Drugs 4 2014-04-01 2014-04-01 false Bar code label requirements. 201.25 Section 201.25...: GENERAL LABELING General Labeling Provisions § 201.25 Bar code label requirements. (a) Who is subject...

  7. 21 CFR 201.25 - Bar code label requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... drug product from the bar code label requirements set forth in this section. The exemption request must... 21 Food and Drugs 4 2011-04-01 2011-04-01 false Bar code label requirements. 201.25 Section 201.25...: GENERAL LABELING General Labeling Provisions § 201.25 Bar code label requirements. (a) Who is subject...

  8. 21 CFR 201.25 - Bar code label requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... drug product from the bar code label requirements set forth in this section. The exemption request must... 21 Food and Drugs 4 2013-04-01 2013-04-01 false Bar code label requirements. 201.25 Section 201.25...: GENERAL LABELING General Labeling Provisions § 201.25 Bar code label requirements. (a) Who is subject...

  9. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  10. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described that has broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  11. Updating the Read Codes

    PubMed Central

    Robinson, David; Comp, Dip; Schulz, Erich; Brown, Philip; Price, Colin

    1997-01-01

    Abstract The Read Codes are a hierarchically-arranged controlled clinical vocabulary introduced in the early 1980s and now consisting of three maintained versions of differing complexity. The code sets are dynamic, and are updated quarterly in response to requests from users including clinicians in both primary and secondary care, software suppliers, and advice from a network of specialist healthcare professionals. The codes' continual evolution of content, both across and within versions, highlights tensions between different users and uses of coded clinical data. Internal processes, external interactions and new structural features implemented by the NHS Centre for Coding and Classification (NHSCCC) for user interactive maintenance of the Read Codes are described, and over 2000 items of user feedback episodes received over a 15-month period are analysed. PMID:9391934

  12. Coded continuous wave meteor radar

    NASA Astrophysics Data System (ADS)

    Vierinen, Juha; Chau, Jorge L.; Pfeffer, Nico; Clahsen, Matthias; Stober, Gunter

    2016-03-01

    The concept of a coded continuous wave specular meteor radar (SMR) is described. The radar uses a continuously transmitted pseudorandom phase-modulated waveform, which has several advantages compared to conventional pulsed SMRs. The coding avoids range and Doppler aliasing, which are in some cases problematic with pulsed radars. Continuous transmissions maximize pulse compression gain, allowing operation at lower peak power than a pulsed system. With continuous coding, the temporal and spectral resolution are not dependent on the transmit waveform and can be changed fairly flexibly after a measurement has been performed. The low signal-to-noise ratio before pulse compression, combined with independent pseudorandom transmit waveforms, allows multiple geographically separated transmitters to be used in the same frequency band simultaneously without significantly interfering with each other. Because the same frequency band can be used by multiple transmitters, the same interferometric receiver antennas can be used to receive multiple transmitters at the same time. The principles of the signal processing are discussed, along with several practical ways to increase computation speed and how to optimally detect meteor echoes. Measurements from a campaign performed with a coded continuous wave SMR are shown and compared with two standard pulsed SMR measurements. The type of meteor radar described in this paper would be suited for use in a large-scale multi-static network of meteor radar transmitters and receivers. Such a system would be useful for increasing the number of meteor detections to obtain improved meteor radar data products.
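The pulse-compression idea behind the coded continuous wave scheme can be sketched in a few lines: correlate the received samples against the known pseudorandom phase code and locate the correlation peak, which marks the echo delay. The code length, delay, amplitude, and noise level below are arbitrary toy values, not SMR operating parameters:

```python
import random

# Toy pulse compression for a coded CW waveform: cross-correlate received
# samples with the known pseudorandom +/-1 phase code; the weak echo's
# delay appears as a sharp correlation peak despite low per-sample SNR.
rng = random.Random(42)
code = [rng.choice((-1, 1)) for _ in range(127)]     # transmit phase code

true_delay = 300
rx = [0.0] * 1024
for i, chip in enumerate(code):                      # weak delayed echo
    rx[true_delay + i] += 0.1 * chip
rx = [s + rng.gauss(0.0, 0.05) for s in rx]          # receiver noise

# Each lag sums len(code) products: the source of the compression gain
corr = [sum(code[i] * rx[lag + i] for i in range(len(code)))
        for lag in range(len(rx) - len(code))]
est_delay = max(range(len(corr)), key=lambda lag: abs(corr[lag]))
```

Because different transmitters use independent pseudorandom codes, their cross-correlations stay near the noise floor, which is why several transmitters can share the band, as the abstract notes.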

  13. Doubled Color Codes

    NASA Astrophysics Data System (ADS)

    Bravyi, Sergey

    Combining protection from noise and computational universality is one of the biggest challenges in fault-tolerant quantum computing. Topological stabilizer codes such as the 2D surface code can tolerate a high level of noise, but implementing logical gates, especially non-Clifford ones, requires a prohibitively large overhead due to the need for state distillation. In this talk I will describe a new family of 2D quantum error correcting codes that enable a transversal implementation of all logical gates required for universal quantum computing. Transversal logical gates (TLG) are encoded operations that can be realized by applying some single-qubit rotation to each physical qubit. TLGs are highly desirable since they introduce no overhead and do not spread errors. It was previously known that a quantum code can have only a finite number of TLGs, which rules out computational universality. Our scheme circumvents this no-go result by combining TLGs of two different quantum codes using the gauge-fixing method pioneered by Paetznick and Reichardt. The first code, closely related to the 2D color code, enables a transversal implementation of all single-qubit Clifford gates such as the Hadamard gate and the π/2 phase shift. The second code, which we call a doubled color code, provides a transversal T-gate, where T is the π/4 phase shift. The Clifford+T gate set is known to be computationally universal. The two codes can be laid out on the honeycomb lattice with two qubits per site such that the code conversion requires parity measurements for six-qubit Pauli operators supported on faces of the lattice. I will also describe numerical simulations of logical Clifford+T circuits encoded by the distance-3 doubled color code. Based on joint work with Andrew Cross.

  14. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  15. Phonological coding during reading.

    PubMed

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. PMID:25150679

  16. Bar Code Labels

    NASA Technical Reports Server (NTRS)

    1988-01-01

    American Bar Codes, Inc. developed special bar code labels for inventory control of space shuttle parts and other space system components. ABC labels are made in a company-developed anodized aluminum process and consecutively marked with bar code symbology and human-readable numbers. They offer extreme abrasion resistance and indefinite resistance to ultraviolet radiation, withstanding 700 degree temperatures without deterioration and up to 1400 degrees with special designs. They offer high resistance to salt spray, cleaning fluids and mild acids. ABC is now producing these bar code labels commercially for industrial customers who also need labels to resist harsh environments.

  17. MORSE Monte Carlo code

    SciTech Connect

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  18. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  19. FAA Smoke Transport Code

    SciTech Connect

    Domino, Stefan; Luketa-Hanlin, Anay; Gallegos, Carlos

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, an analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  20. Expander chunked codes

    NASA Astrophysics Data System (ADS)

    Tang, Bin; Yang, Shenghao; Ye, Baoliu; Yin, Yitong; Lu, Sanglu

    2015-12-01

    Chunked codes are efficient random linear network coding (RLNC) schemes with low computational cost, where the input packets are encoded into small chunks (i.e., subsets of the coded packets). During the network transmission, RLNC is performed within each chunk. In this paper, we first introduce a simple transfer matrix model to characterize the transmission of chunks and derive some basic properties of the model to facilitate the performance analysis. We then focus on the design of overlapped chunked codes, a class of chunked codes whose chunks are non-disjoint subsets of input packets, which are of special interest since they can be encoded with negligible computational cost and in a causal fashion. We propose expander chunked (EC) codes, the first class of overlapped chunked codes that have an analyzable performance, where the construction of the chunks makes use of regular graphs. Numerical and simulation results show that in some practical settings, EC codes can achieve rates within 91 to 97 % of the optimum and outperform the state-of-the-art overlapped chunked codes significantly.
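
    The chunk-level RLNC operation described above can be illustrated with a small sketch. This is a toy, not the EC-code construction itself: packets are modeled as short bit vectors (Python integers), coding coefficients are drawn over GF(2), and "decodable" is a GF(2) rank check on the received coefficient vectors; the paper's expander-graph chunk design and field size are not reproduced here, and all names are ours.

```python
import random

def rlnc_encode_chunk(chunk, rng):
    """One coded packet for a chunk: a random GF(2) linear combination
    (i.e., XOR) of the chunk's packets, plus its coefficient vector."""
    coeffs = [rng.randint(0, 1) for _ in chunk]
    if not any(coeffs):                      # avoid the useless all-zero combination
        coeffs[rng.randrange(len(chunk))] = 1
    coded = 0
    for c, pkt in zip(coeffs, chunk):
        if c:
            coded ^= pkt                     # packets represented as integers (bit vectors)
    return coeffs, coded

def gf2_rank(vectors):
    """Rank over GF(2) via Gaussian elimination: a chunk of k packets is
    decodable once k received coefficient vectors reach rank k."""
    rows = [int("".join(map(str, v)), 2) for v in vectors]
    rank = 0
    for bit in reversed(range(len(vectors[0]))):
        pivot_idx = next((i for i, r in enumerate(rows) if (r >> bit) & 1), None)
        if pivot_idx is None:
            continue
        pivot = rows.pop(pivot_idx)
        rows = [r ^ pivot if (r >> bit) & 1 else r for r in rows]
        rank += 1
    return rank

chunk = [0b1011, 0b0110, 0b1100]             # three toy 4-bit packets
rng = random.Random(7)
received = [rlnc_encode_chunk(chunk, rng) for _ in range(6)]
```

    With high probability a few more than k random combinations suffice for rank k, which is why RLNC within small chunks keeps both overhead and computation low.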

  1. General Point-Depletion and Fission Product Code System and Four-Group Fission Product Neutron Absorption Chain Data Library Generated from ENDF/B-IV for Thermal Reactors

    Energy Science and Technology Software Center (ESTSC)

    1981-12-01

    EPRI-CINDER calculates, for any specified initial fuel (actinide) description and flux or power history, the fuel and fission-product nuclide concentrations and associated properties. Other nuclide chains can also be computed with user-supplied libraries. The EPRI-CINDER Data Library (incorporating ENDF/B-IV fission-product processed 4-group cross sections, decay constants, absorption and decay branching fractions, and effective fission yields) is used in each constant-flux time step calculation and in time step summaries of nuclide decay rates and macroscopic absorption and barns-per-fission (b/f) absorption cross sections (by neutron group). User-supplied nuclide decay energy and multigroup-spectra data libraries may be attached to permit decay heating and decay-spectra calculations. An additional 12-chain library, explicitly including 27 major fission-product neutron absorbers and 4 fictitious nuclides, may be used to accurately calculate the aggregate macroscopic absorption buildup in fission products.
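
    The constant-flux time-step calculation described above can be illustrated with a minimal two-nuclide sketch. Every number here (decay constants, cross sections, flux, the two-member chain) is invented for illustration; the real code draws on ENDF/B-IV library data and linearized nuclide chains rather than a dense matrix exponential.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical two-member chain: nuclide A (decay constant lam_a) -> nuclide B
# (decay constant lam_b), each also removed by neutron absorption sigma*phi
# during a constant-flux time step.
lam_a, lam_b = 1e-4, 2e-5          # decay constants (1/s), illustrative
sig_a, sig_b = 10e-24, 50e-24      # absorption cross sections (cm^2), illustrative
phi = 1e14                         # constant neutron flux (n/cm^2/s)

# Transition matrix: diagonal = total removal rate, off-diagonal = production.
A = np.array([[-(lam_a + sig_a * phi), 0.0],
              [lam_a, -(lam_b + sig_b * phi)]])

def step(n0, dt):
    """Advance nuclide concentrations dN/dt = A N over one constant-flux
    time step via the matrix exponential N(dt) = expm(A*dt) N(0)."""
    return expm(A * dt) @ n0

n0 = np.array([1.0e20, 0.0])       # atoms/cm^3 at start of step
n1 = step(n0, 3600.0)              # concentrations after a one-hour step
```

    For this two-member case the result can be checked against the analytic Bateman solution with effective removal rates lam + sigma*phi.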

  2. A Method to Site-Specifically Identify and Quantitate Carbonyl End Products of Protein Oxidation Using Oxidation-Dependent Element Coded Affinity Tags (O-ECAT) and NanoLiquid Chromatography Fourier Transform Mass Spectrometry

    SciTech Connect

    Lee, S; Young, N L; Whetstone, P A; Cheal, S M; Benner, W H; Lebrilla, C B; Meares, C F

    2005-08-25

    Protein oxidation is linked to cellular stress, aging, and disease. Protein oxidations that result in reactive species are of particular interest, since these reactive oxidation products may react with other proteins or biomolecules in an unmediated and irreversible fashion, providing a potential marker for a variety of disease mechanisms. We have developed a novel system to identify and quantitate, relative to other states, the sites of oxidation on a given protein. A specially designed Oxidation-dependent carbonyl-specific Element-Coded Affinity Mass Tag (O-ECAT), AOD, ((S)-2-(4-(2-aminooxy)-acetamido)-benzyl)-1, 4, 7, 10-tetraazacyclododecane-N, N', N'', N'''-tetraacetic acid, is used to covalently tag the residues of a protein oxidized to aldehyde or keto end products. After proteolysis, the resulting AOD-tagged peptides are affinity purified, and analyzed by nanoLC-FTICR-MS, which provides high specificity in extracting co-eluting AOD mass pairs with a unique mass difference and affords relative quantitation based on isotopic ratios. Using this methodology, we have mapped the surface oxidation sites on a model protein, recombinant human serum albumin (rHSA) in its native form (as purchased) and after FeEDTA oxidation. A variety of modified amino acid residues including lysine, arginine, proline, histidine, threonine, aspartic and glutamic acids, were found to be oxidized to aldehyde and keto end products. The sensitivity of this methodology is shown by the number of peptides identified, twenty peptides on the native protein and twenty-nine after surface oxidation using FeEDTA and ascorbate. All identified peptides map to the surface of the HSA crystal structure validating this method for identifying oxidized amino acids on protein surfaces. In relative quantitation experiments between FeEDTA oxidation and native protein oxidation, identified sites showed different relative propensities towards oxidation independent of amino acid residue. We expect to extend

  3. Trellis-coded multidimensional phase modulation

    NASA Technical Reports Server (NTRS)

    Pietrobon, Steven S.; Deng, Robert H.; Lafanechere, Alain; Ungerboeck, Gottfried; Costello, Daniel J., Jr.

    1990-01-01

    A 2L-dimensional multiple phase-shift keyed (MPSK) (L x MPSK) signal set is obtained by forming the Cartesian product of L two-dimensional MPSK signal sets. A systematic approach to partitioning L x MPSK signal sets that is based on block coding is used. An encoder system approach is developed. It incorporates the design of a differential precoder, a systematic convolutional encoder, and a signal set mapper. Trellis-coded L x 4PSK, L x 8PSK, and L x 16PSK modulation schemes are found for L = 1-4 and a variety of code rates and decoder complexities, many of which are fully transparent to discrete phase rotations of the signal set. The new codes achieve asymptotic coding gains up to 5.85 dB.
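
    The Cartesian-product construction of the 2L-dimensional signal set is easy to sketch; the helper names below are ours, and the trellis code, differential precoder, and set-partitioning scheme themselves are not reproduced.

```python
import cmath
import itertools

def mpsk(M):
    """Two-dimensional M-PSK constellation on the unit circle."""
    return [cmath.exp(2j * cmath.pi * k / M) for k in range(M)]

def l_x_mpsk(L, M):
    """2L-dimensional L x MPSK signal set: the Cartesian product of
    L two-dimensional M-PSK signal sets, as in the abstract."""
    return list(itertools.product(mpsk(M), repeat=L))

def min_sq_dist(points):
    """Minimum squared Euclidean distance of a signal set; for the full
    product set this equals the 2-D M-PSK minimum, which is what coded
    partitioning then improves on."""
    return min(sum(abs(a - b) ** 2 for a, b in zip(p, q))
               for p, q in itertools.combinations(points, 2))
```

    Set partitioning splits the product set into subsets of increasing minimum distance; the coding gain quoted in the abstract comes from pairing that partition with a convolutional code.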

  4. Research on Universal Combinatorial Coding

    PubMed Central

    Lu, Jun; Zhang, Zhuo; Mo, Juan

    2014-01-01

    The concept of universal combinatorial coding is proposed. Relations exist, more or less, among many coding methods, which suggests that a universal coding method objectively exists and can serve as a bridge connecting them. Universal combinatorial coding is lossless and is based on combinatorics theory. Its combinational and exhaustive properties make it closely related to existing coding methods. Universal combinatorial coding does not depend on the probability statistics of the information source, and it has characteristics spanning the three branches of coding. The paper analyzes the relationship between universal combinatorial coding and a variety of coding methods and investigates several application technologies of the method. In addition, the efficiency of universal combinatorial coding is analyzed theoretically. The multicharacteristic and multiapplication nature of universal combinatorial coding is unique among existing coding methods. Universal combinatorial coding has both theoretical research and practical application value. PMID:24772019

  5. Research on universal combinatorial coding.

    PubMed

    Lu, Jun; Zhang, Zhuo; Mo, Juan

    2014-01-01

    The concept of universal combinatorial coding is proposed. Relations exist, more or less, among many coding methods, which suggests that a universal coding method objectively exists and can serve as a bridge connecting them. Universal combinatorial coding is lossless and is based on combinatorics theory. Its combinational and exhaustive properties make it closely related to existing coding methods. Universal combinatorial coding does not depend on the probability statistics of the information source, and it has characteristics spanning the three branches of coding. The paper analyzes the relationship between universal combinatorial coding and a variety of coding methods and investigates several application technologies of the method. In addition, the efficiency of universal combinatorial coding is analyzed theoretically. The multicharacteristic and multiapplication nature of universal combinatorial coding is unique among existing coding methods. Universal combinatorial coding has both theoretical research and practical application value. PMID:24772019

  6. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    PubMed Central

    Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance; however, it also brings extremely high computational complexity. Innovative work on improving the coding tree to further reduce encoding time is presented in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Eventually, a CU coding tree probability update is proposed, aiming to address probabilistic model distortion problems caused by CC. Experimental results show that the proposed low complexity CU coding tree mechanism significantly reduces encoding time by 27% for lossy coding and 42% for visually lossless coding and lossless coding. The proposed mechanism is devoted to improving coding performance under various application conditions. PMID:26999741

  7. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.

  8. Lichenase and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-.beta.-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  9. Codes of Conduct

    ERIC Educational Resources Information Center

    Million, June

    2004-01-01

    Most schools have a code of conduct, pledge, or behavioral standards, set by the district or school board with the school community. In this article, the author features some schools that have created a new vision of instilling codes of conduct in students based on work quality, respect, safety and courtesy. She suggests that communicating the code…

  10. Code of Ethics

    ERIC Educational Resources Information Center

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  11. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization,3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts 3. Development of a code generator for performance prediction 4. Automated partitioning 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  12. Modified JPEG Huffman coding.

    PubMed

    Lakhani, Gopal

    2003-01-01

    It is a well observed characteristic that when a DCT block is traversed in the zigzag order, the AC coefficients generally decrease in size and the run-lengths of zero coefficients increase in number. This article presents a minor modification to the Huffman coding of the JPEG baseline compression algorithm to exploit this redundancy. For this purpose, DCT blocks are divided into bands so that each band can be coded using a separate code table. Three implementations are presented, which all move the end-of-block marker up in the middle of the DCT block and use it to indicate the band boundaries. Experimental results are presented to compare the reduction in code size obtained by our methods with the JPEG sequential-mode Huffman coding and arithmetic coding methods. The average code reduction to the total image code size of one of our methods is 4%. Our methods can also be used for progressive image transmission and hence, experimental results are also given to compare them with two-, three-, and four-band implementations of the JPEG spectral selection method. PMID:18237897
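
    The zigzag traversal and band splitting that the article builds on can be sketched as follows. The band boundaries shown are illustrative placeholders, not the ones used in the paper's three implementations, and the helper names are ours.

```python
def zigzag_order(n=8):
    """Zigzag scan order of an n x n DCT block, as in baseline JPEG:
    traverse anti-diagonals (constant i+j), alternating direction."""
    idx = [(i, j) for i in range(n) for j in range(n)]
    return sorted(idx, key=lambda p: (p[0] + p[1],
                                      p[0] if (p[0] + p[1]) % 2 else -p[0]))

def split_into_bands(coeffs, boundaries=(1, 6, 15, 28)):
    """Split zigzag-ordered coefficients into bands; each band could then
    be coded with its own Huffman table (boundaries are illustrative)."""
    bands, start = [], 0
    for b in list(boundaries) + [len(coeffs)]:
        bands.append(coeffs[start:b])
        start = b
    return bands
```

    Because later bands contain mostly small or zero coefficients, per-band tables can assign them shorter codes than a single block-wide table would.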

  13. Binary concatenated coding system

    NASA Technical Reports Server (NTRS)

    Monford, L. G., Jr.

    1973-01-01

    Coding, using 3-bit binary words, is applicable to any measurement having an integer scale up to 100. A system using 6-bit data words can be expanded to read from 1 to 10,000, and 9-bit data words can increase the range to 1,000,000. The code may be "read" directly by observation after memorizing a simple listing of 9's and 10's.

  14. Computerized mega code recording.

    PubMed

    Burt, T W; Bock, H C

    1988-04-01

    A system has been developed to facilitate recording of advanced cardiac life support mega code testing scenarios. By scanning a paper "keyboard" using a bar code wand attached to a portable microcomputer, the person assigned to record the scenario can easily generate an accurate, complete, timed, and typewritten record of the given situations and the obtained responses. PMID:3354937

  15. Coding for optical channels

    NASA Technical Reports Server (NTRS)

    Baumert, L. D.; Mceliece, R. J.; Rumsey, H., Jr.

    1979-01-01

    In a previous paper Pierce considered the problem of optical communication from a novel viewpoint, and concluded that performance will likely be limited by issues of coding complexity rather than by thermal noise. This paper reviews the model proposed by Pierce and presents some results on the analysis and design of codes for this application.

  16. Combustion chamber analysis code

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-01-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  17. Energy Conservation Code Decoded

    SciTech Connect

    Cole, Pam C.; Taylor, Zachary T.

    2006-09-01

    Designing an energy-efficient, affordable, and comfortable home is a lot easier thanks to a slim, easier-to-read booklet, the 2006 International Energy Conservation Code (IECC), published in March 2006. States, counties, and cities have begun reviewing the new code as a potential upgrade to their existing codes. Maintained under the public consensus process of the International Code Council, the IECC is designed to do just what its title says: promote the design and construction of energy-efficient homes and commercial buildings. "Homes" in this case means traditional single-family homes, duplexes, condominiums, and apartment buildings having three or fewer stories. The U.S. Department of Energy, which played a key role in proposing the changes that resulted in the new code, is offering a free training course that covers the residential provisions of the 2006 IECC.

  18. Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; DuPrie, K.; Berriman, B.; Hanisch, R. J.; Mink, J.; Teuben, P. J.

    2013-10-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, is a free on-line registry for source codes of interest to astronomers and astrophysicists. The library is housed on the discussion forum for Astronomy Picture of the Day (APOD) and can be accessed at http://ascl.net. The ASCL has a comprehensive listing that covers a significant number of the astrophysics source codes used to generate results published in or submitted to refereed journals and continues to grow. The ASCL currently has entries for over 500 codes; its records are citable and are indexed by ADS. The editors of the ASCL and members of its Advisory Committee were on hand at a demonstration table in the ADASS poster room to present the ASCL, accept code submissions, show how the ASCL is starting to be used by the astrophysics community, and take questions on and suggestions for improving the resource.

  19. Statistical mechanics of error-correcting codes

    NASA Astrophysics Data System (ADS)

    Kabashima, Y.; Saad, D.

    1999-01-01

    We investigate the performance of error-correcting codes, where the code word comprises products of K bits selected from the original message and decoding is carried out utilizing a connectivity tensor with C connections per index. Shannon's bound for the channel capacity is recovered for large K and zero temperature when the code rate K/C is finite. Close to optimal error-correcting capability is obtained for finite K and C. We examine the finite-temperature case to assess the use of simulated annealing for decoding and extend the analysis to accommodate other types of noisy channels.
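
    The encoding step the abstract describes, each code bit formed as a product of K message bits selected by a connectivity tensor, can be sketched in the +/-1 spin convention used in this statistical-mechanics literature. The random index sets below stand in for the connectivity tensor, the function names are ours, and the finite-temperature decoding (e.g., simulated annealing) is not shown.

```python
import random

def encode(msg_pm1, K, num_checks, rng):
    """Each transmitted bit is the product of K message bits (in the
    +/-1 convention) chosen by a random index set; the collection of
    index sets plays the role of the connectivity tensor."""
    n = len(msg_pm1)
    index_sets = [sorted(rng.sample(range(n), K)) for _ in range(num_checks)]
    word = []
    for idx in index_sets:
        prod = 1
        for i in idx:
            prod *= msg_pm1[i]   # product of K spins is again +/-1
        word.append(prod)
    return index_sets, word

rng = random.Random(1)
msg = [1, -1, 1, -1, 1]
index_sets, word = encode(msg, K=3, num_checks=4, rng=rng)
```

    The code rate in the abstract's notation is K/C, where C is the number of such products each message bit participates in.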

  20. Quantum convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Yan, Tingsu; Huang, Xinmei; Tang, Yuansheng

    2014-12-01

    In this paper, three families of quantum convolutional codes are constructed. The first one and the second one can be regarded as a generalization of Theorems 3, 4, 7 and 8 [J. Chen, J. Li, F. Yang and Y. Huang, Int. J. Theor. Phys., doi:10.1007/s10773-014-2214-6 (2014)], in the sense that we drop the constraint q ≡ 1 (mod 4). Furthermore, the second one and the third one attain the quantum generalized Singleton bound.

  1. Huffman coding in advanced audio coding standard

    NASA Astrophysics Data System (ADS)

    Brzuchalski, Grzegorz

    2012-05-01

    This article presents several hardware architectures of an Advanced Audio Coding (AAC) Huffman noiseless encoder, their optimisations, and a working implementation. Much attention has been paid to optimising the demand on hardware resources, especially memory size. The aim of the design was to produce as short a binary stream as possible within this standard. The Huffman encoder, together with the whole audio-video system, has been implemented in FPGA devices.
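
    The Huffman construction underlying AAC's noiseless coding stage is the classic two-least-frequent merge, sketched below for background. Note that an AAC encoder actually selects among the standard's predefined spectral codebooks rather than building tables on the fly, so this is the general technique, not the paper's hardware design.

```python
import heapq

def huffman_code(freqs):
    """Build a Huffman code table from a {symbol: frequency} dict by
    repeatedly merging the two least frequent subtrees; the integer
    tiebreaker keeps the heap from ever comparing dicts."""
    heap = [[f, i, {s: ""}] for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)      # least frequent subtree
        f2, i2, t2 = heapq.heappop(heap)     # second least frequent
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, [f1 + f2, i2, merged])
    return heap[0][2]

table = huffman_code({"a": 45, "b": 13, "c": 12, "d": 16, "e": 9, "f": 5})
```

    The resulting table is prefix-free and satisfies Kraft's inequality with equality, which is what makes the decoder's bit-serial table lookup feasible in hardware.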

  2. Coded aperture computed tomography

    NASA Astrophysics Data System (ADS)

    Choi, Kerkil; Brady, David J.

    2009-08-01

    Diverse physical measurements can be modeled by X-ray transforms. While X-ray tomography is the canonical example, reference structure tomography (RST) and coded aperture snapshot spectral imaging (CASSI) are examples of physically unrelated but mathematically equivalent sensor systems. Historically, most x-ray transform based systems sample continuous distributions and apply analytical inversion processes. On the other hand, RST and CASSI generate discrete multiplexed measurements implemented with coded apertures. This multiplexing of coded measurements allows for compression of measurements from a compressed sensing perspective. Compressed sensing (CS) is a revelation that if the object has a sparse representation in some basis, then a certain number, but typically much less than what is prescribed by Shannon's sampling rate, of random projections captures enough information for a highly accurate reconstruction of the object. This paper investigates the role of coded apertures in x-ray transform measurement systems (XTMs) in terms of data efficiency and reconstruction fidelity from a CS perspective. To conduct this, we construct a unified analysis using RST and CASSI measurement models. Also, we propose a novel compressive x-ray tomography measurement scheme which also exploits coding and multiplexing, and hence shares the analysis of the other two XTMs. Using this analysis, we perform a qualitative study on how coded apertures can be exploited to implement physical random projections by "regularizing" the measurement systems. Numerical studies and simulation results demonstrate several examples of the impact of coding.
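
    The compressed-sensing claim above (recovering a k-sparse object from far fewer random projections than Shannon sampling prescribes) can be demonstrated with a small numpy sketch using orthogonal matching pursuit, one standard CS recovery algorithm. The measurement sizes and matrix are invented for illustration; none of the paper's XTM-specific models are reproduced.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: greedily pick the column of A most
    correlated with the residual, then re-fit on the chosen support by
    least squares; repeat k times."""
    resid, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ resid)))
        support.append(j)
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        resid = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x

rng = np.random.default_rng(0)
n, m, k = 128, 40, 4                           # signal size, measurements, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random projection (sensing) matrix
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true                                 # m << n multiplexed measurements
x_hat = omp(A, y, k)
```

    With Gaussian projections and modest sparsity, recovery is typically exact; the coded apertures in RST and CASSI play the role of the random rows of A.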

  3. Report number codes

    SciTech Connect

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  4. S.32: A Bill to amend the Internal Revenue Code of 1986 to provide a tax credit for the production of oil and gas from existing marginal oil and gas wells and from new oil and gas wells. Introduced in the Senate of the United States, One Hundred Fourth Congress, First session

    SciTech Connect

    1995-12-31

    This bill would establish tax credits for the production of oil and natural gas from existing marginal oil or gas wells, and from new oil and gas wells. It does so by adding a section to the Internal Revenue Code of 1986 which spells out the rules, the credit amounts, the scope of the terms used to define such facilities, and other rules.

  5. TRANSF code user manual

    SciTech Connect

    Weaver, H.J.

    1981-11-01

    The TRANSF code is a semi-interactive FORTRAN IV program designed to calculate the modal parameters of a (structural) system by performing a least-squares parameter fit to measured transfer function data. The code is available at LLNL on both the 7600 and the Cray machines. The transfer function data to be fit are read into the code via a disk file. The primary mode of output is FR80 graphics, although it is also possible to have results written either to the TTY or to a disk file.
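
    The idea of a least-squares fit to measured transfer function data can be illustrated with Levy's classic linearisation for a hypothetical second-order system. TRANSF's actual parameterisation, fitting algorithm, and FR80 graphics output are not modeled; all coefficients and names here are invented.

```python
import numpy as np

def levy_fit(omega, H):
    """Fit H(s) = (b0 + b1*s) / (a0 + a1*s + s^2), s = j*omega, by Levy's
    linearisation: multiplying through by the denominator makes the
    residual linear in [b0, b1, a0, a1], so stacking real and imaginary
    parts gives an ordinary real least-squares problem."""
    s = 1j * omega
    # b0 + b1*s - H*a0 - H*a1*s = H*s^2 at each measured frequency
    M = np.column_stack([np.ones_like(s), s, -H, -H * s])
    rhs = H * s ** 2
    A = np.vstack([M.real, M.imag])
    b = np.concatenate([rhs.real, rhs.imag])
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params                              # [b0, b1, a0, a1]

# Hypothetical noise-free "measurement" from known coefficients.
omega = np.linspace(0.1, 10.0, 50)
true = np.array([2.0, 0.5, 4.0, 0.3])
s = 1j * omega
H_meas = (true[0] + true[1] * s) / (true[2] + true[3] * s + s ** 2)
fit = levy_fit(omega, H_meas)
```

    With noise-free data the linearised fit recovers the coefficients exactly; with real measurements, weighting or iteration (e.g., Sanathanan-Koerner) is usually added to counter Levy's high-frequency bias.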

  6. FORTRAN code-evaluation system

    NASA Technical Reports Server (NTRS)

    Capps, J. D.; Kleir, R.

    1977-01-01

    Automated code evaluation system can be used to detect coding errors and unsound coding practices in any ANSI FORTRAN IV source code before they can cause execution-time malfunctions. System concentrates on acceptable FORTRAN code features which are likely to produce undesirable results.

  7. FAST2 Code validation

    SciTech Connect

    Wilson, R.E.; Freeman, L.N.; Walker, S.N.

    1995-09-01

    The FAST2 Code which is capable of determining structural loads of a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data at two wind speeds for the ESI-80 are given. The FAST2 Code models a two-bladed HAWT with degrees of freedom for blade flap, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffness, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms and azimuth averaged bin plots. It is concluded that agreement between the FAST2 Code and test results is good.

  8. Compressible Astrophysics Simulation Code

    Energy Science and Technology Software Center (ESTSC)

    2007-07-18

    This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.

  9. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  10. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.

  11. Future trends in image coding

    NASA Astrophysics Data System (ADS)

    Habibi, Ali

    1993-01-01

    The objective of this article is to discuss the future of image data compression over the next two decades. It is virtually impossible to predict with any certainty the breakthroughs in theory and development, the milestones in the advancement of technology, and the success of upcoming commercial products in the marketplace, which will be the main factors shaping the future of image coding. What we propose to do, instead, is look back at the progress in image coding during the last two decades and assess the state of the art in image coding today. Then, by observing the trends in the development of theory, software, and hardware, coupled with the future needs for the use and dissemination of imagery data and the constraints on the bandwidth and capacity of various networks, we predict the future state of image coding. What seems certain today is the growing need for bandwidth compression. Television uses a technology that is half a century old and is ready to be replaced by high-definition television with an extremely high digital bandwidth. Smart telephones coupled with personal computers and TV monitors accommodating both printed and video data will be common in homes and businesses within the next decade. Efficient and compact digital processing modules using developing technologies will make bandwidth-compressed imagery the cheap and preferred alternative in satellite and on-board applications. In view of the above needs, we expect increased activity in the development of theory, software, special-purpose chips, and hardware for image bandwidth compression in the next two decades. The following sections summarize the future trends in these areas.

  12. SLINGSHOT - a Coilgun Design Code

    SciTech Connect

    MARDER, BARRY M.

    2001-09-01

    The Sandia coilgun [1,2,3,4,5] is an inductive electromagnetic launcher. It consists of a sequence of powered, multi-turn coils surrounding a flyway of circular cross-section through which a conducting armature passes. When the armature is properly positioned with respect to a coil, a charged capacitor is switched into the coil circuit. The rising coil currents induce a current in the armature, producing a repulsive accelerating force. The basic numerical tool for modeling the coilgun is the SLINGSHOT code, an expanded, user-friendly successor to WARP-10 [6]. SLINGSHOT computes the currents in the coils and armature, finds the forces produced by those currents, and moves the armature through the array of coils. In this approach, the cylindrically symmetric coils and armature are subdivided into concentric hoops with rectangular cross-section, in each of which the current is assumed to be uniform. The ensemble of hoops is treated as a set of coupled circuits. The specific heats and resistivities of the hoops are found as functions of temperature and used to determine the resistive heating. The code calculates the resistances and inductances for all hoops, and the mutual inductances for all hoop pairs. Using these, it computes the hoop currents from their circuit equations, finds the forces from the products of these currents and the mutual inductance gradient, and moves the armature. Treating the problem as a set of coupled circuits is a fast and accurate approach compared to solving the field equations. Its use, however, is restricted to problems in which the symmetry dictates the current paths. This paper is divided into three parts. The first presents a demonstration of the code. The second describes the input and output. The third part describes the physical models and numerical methods used in the code. It is assumed that the reader is familiar with coilguns.
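The force step described above has a simple shape: the axial force on the armature is the product of the coil current, the induced armature current, and the mutual-inductance gradient dM/dz. The sketch below exercises that calculation; the bell-shaped inductance profile, the function names, and all numbers are illustrative stand-ins, not SLINGSHOT's models.

```python
# Sketch of the coupled-circuit force: F = I_coil * I_armature * dM/dz.
# The inductance profile and currents are invented for illustration.

def mutual_inductance(z, m0=1e-6, scale=0.02):
    """Toy coil-armature mutual inductance (henries) vs. axial offset z (m)."""
    return m0 / (1.0 + (z / scale) ** 2)

def axial_force(i_coil, i_armature, z, dz=1e-6):
    """Force from the product of the currents and dM/dz (central difference)."""
    dm_dz = (mutual_inductance(z + dz) - mutual_inductance(z - dz)) / (2 * dz)
    return i_coil * i_armature * dm_dz

# The induced armature current opposes the coil current (opposite sign),
# so an armature ahead of the coil center (z > 0) is pushed forward.
force = axial_force(i_coil=50e3, i_armature=-40e3, z=0.01)
```

In the real code the mutual inductances come from the hoop geometry; the analytic profile here simply stands in so the force expression can be exercised.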

  13. Results and code predictions for ABCOVE (aerosol behavior code validation and evaluation) aerosol code validation: Test AB6 with two aerosol species. [LMFBR

    SciTech Connect

    Hilliard, R K; McCormack, J C; Muhlestein, L D

    1984-12-01

    A program for aerosol behavior code validation and evaluation (ABCOVE) has been developed in accordance with the LMFBR Safety Program Plan. The ABCOVE program is a cooperative effort between the USDOE, the USNRC, and their contractor organizations currently involved in aerosol code development, testing or application. The second large-scale test in the ABCOVE program, AB6, was performed in the 850-m³ CSTF vessel with a two-species test aerosol. The test conditions simulated the release of a fission product aerosol, NaI, in the presence of a sodium spray fire. Five organizations made pretest predictions of aerosol behavior using seven computer codes. Three of the codes (QUICKM, MAEROS and CONTAIN) were discrete, multiple species codes, while four (HAA-3, HAA-4, HAARM-3 and SOFIA) were log-normal codes which assume uniform coagglomeration of different aerosol species. Detailed test results are presented and compared with the code predictions for seven key aerosol behavior parameters.

  14. Prioritized LT Codes

    NASA Technical Reports Server (NTRS)

    Woo, Simon S.; Cheng, Michael K.

    2011-01-01

    The original Luby Transform (LT) coding scheme is extended to account for data transmissions where some information symbols in a message block are more important than others. Prioritized LT codes provide unequal error protection (UEP) of data on an erasure channel by modifying the original LT encoder. The prioritized algorithm improves high-priority data protection without penalizing low-priority data recovery. Moreover, low-latency decoding is obtained for high-priority data due to fast encoding. Prioritized LT codes only require a slight change in the original encoding algorithm, and no changes at all at the decoder. Hence, with a small complexity increase in the LT encoder, an improved UEP and low decoding latency for high-priority data can be achieved. LT encoding partitions a data stream into fixed-size message blocks, each with a constant number of information symbols. To generate a code symbol from the information symbols in a message, the Robust Soliton probability distribution is first applied in order to determine the number of information symbols to be used to compute the code symbol. Then, the specific information symbols are chosen uniformly at random from the message block. Finally, the selected information symbols are XORed to form the code symbol. The Prioritized LT code construction includes an additional restriction that code symbols formed by a relatively small number of XORed information symbols select some of these information symbols from the pool of high-priority data. Once high-priority data are fully covered, encoding continues with the conventional LT approach where code symbols are generated by selecting information symbols from the entire message block including all different priorities. Therefore, if code symbols derived from high-priority data experience an unusually high number of erasures, Prioritized LT codes can still reliably recover both high- and low-priority data.
This hybrid approach decides not only "how to encode
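The encoding steps described above (draw a degree, pick that many information symbols, XOR them) can be sketched in a few lines. This is a loose illustration, not the authors' algorithm: it substitutes the simpler ideal soliton distribution for the Robust Soliton, and the "prioritized" restriction on low-degree symbols is only a guess at the flavor of the construction.

```python
import random
from functools import reduce

def ideal_soliton(k):
    """Degree distribution (a stand-in for the Robust Soliton used by LT codes)."""
    return [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]

def lt_encode_symbol(message, rng):
    """Conventional LT step: draw a degree, pick symbols uniformly, XOR them."""
    k = len(message)
    degree = rng.choices(range(1, k + 1), weights=ideal_soliton(k))[0]
    chosen = rng.sample(range(k), degree)
    return chosen, reduce(lambda a, b: a ^ b, (message[i] for i in chosen))

def prioritized_lt_symbol(message, high_priority, rng, low_degree=2):
    """Guessed prioritized restriction: low-degree code symbols draw their
    information symbols from the high-priority pool."""
    k = len(message)
    degree = rng.choices(range(1, k + 1), weights=ideal_soliton(k))[0]
    if degree <= low_degree and len(high_priority) >= degree:
        chosen = rng.sample(sorted(high_priority), degree)
    else:
        chosen = rng.sample(range(k), degree)
    return chosen, reduce(lambda a, b: a ^ b, (message[i] for i in chosen))

rng = random.Random(7)
message = [5, 9, 12, 7, 3, 8]                 # information symbols
neighbors, code_symbol = lt_encode_symbol(message, rng)
```

The decoder would use the `neighbors` index list unchanged, which is why the prioritized variant needs no decoder modification.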

  15. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.
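The tilted-edge procedure mentioned above — differentiate an edge spread function to get the line spread function, then take the magnitude of its Fourier transform as the MTF — can be sketched with synthetic data. The edge profile, scales, and units below are invented for illustration and have nothing to do with the CSI simulations.

```python
import numpy as np

# Synthetic edge-to-MTF pipeline: ESF -> LSF (derivative) -> MTF (|FFT|).
x = np.linspace(-1.0, 1.0, 512)            # detector position (arbitrary units)
esf = 0.5 * (1.0 + np.tanh(x / 0.05))      # synthetic edge spread function
lsf = np.gradient(esf, x)                  # line spread function
mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                              # normalize to 1 at zero frequency
```

A sharper edge (smaller width in the `tanh`) gives a broader MTF, which is how the simulated aperture resolution shows up in such a plot.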

  16. Error coding simulations

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1993-01-01

    There are various elements, such as radio frequency interference (RFI), which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and to evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal-to-noise ratio needed for a given desired bit error rate. The use of concatenated coding, e.g., an inner convolutional code and an outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
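For concreteness, here is a bitwise implementation of a 16-bit CRC using the CCITT polynomial 0x1021 with an all-ones preset, the configuration commonly associated with the CCSDS recommendation; treat the preset and polynomial as assumptions to be checked against the applicable Blue Book rather than as the normative parameters.

```python
def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
    """Bitwise 16-bit CRC, polynomial x^16 + x^12 + x^5 + 1 (0x1021),
    register preset to all ones (CRC-16/CCITT-FALSE parameters)."""
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

# The standard check value for this parameter set:
checksum = crc16_ccitt(b"123456789")       # 0x29B1
```

Appending the two CRC bytes to a frame lets the receiver detect corruption by recomputing the CRC over the received frame.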

  17. Error coding simulations

    NASA Astrophysics Data System (ADS)

    Noble, Viveca K.

    1993-11-01

    There are various elements, such as radio frequency interference (RFI), which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and to evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal-to-noise ratio needed for a given desired bit error rate. The use of concatenated coding, e.g., an inner convolutional code and an outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.

  18. Parallel solid mechanics codes at Sandia National Laboratories

    SciTech Connect

    McGlaun, M.

    1994-08-01

    Computational physicists at Sandia National Laboratories have moved their production codes to distributed memory parallel computers. The codes include the multi-material CTH Eulerian code and the ALEGRA and PRONTO structural mechanics codes. This presentation discusses our experiences moving the codes to parallel computers and our experiences running them. Moving large production codes onto parallel computers requires developing parallel algorithms, parallel data bases and parallel support tools. We rewrote the Eulerian CTH code for parallel computers. We were able to move both ALEGRA and PRONTO to parallel computers with only a modest number of modifications. We restructured the restart and graphics data bases to make them parallel and to minimize the I/O to the parallel computer. We developed mesh decomposition tools to divide a rectangular or arbitrary connectivity mesh into sub-meshes. The sub-meshes map to processors and minimize the communication between processors. We developed new visualization tools to process the very large, parallel data bases. This presentation also discusses our experiences running these codes on Sandia's 1840-compute-node Intel Paragon, 1024-processor nCUBE and networked workstations. The parallel version of CTH uses the Paragon and nCUBE for production calculations. The ALEGRA and PRONTO codes are moving off networked workstations onto the Paragon and nCUBE massively parallel computers.
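The decomposition goal described above — sub-meshes that map to processors while minimizing interprocessor communication — can be illustrated for a rectangular mesh. This toy merely chooses among processor-grid factorizations by total interface length; it is a sketch of the idea, not Sandia's decomposition tool.

```python
def best_2d_decomposition(nx, ny, nprocs):
    """Choose a px-by-py processor grid (px * py == nprocs) for an
    nx-by-ny rectangular mesh, minimizing total interface length
    between sub-meshes as a proxy for communication volume."""
    best = None
    for px in range(1, nprocs + 1):
        if nprocs % px:
            continue
        py = nprocs // px
        # (px - 1) vertical cut lines of length ny, (py - 1) horizontal of nx
        cost = (px - 1) * ny + (py - 1) * nx
        if best is None or cost < best[0]:
            best = (cost, px, py)
    return best[1], best[2]

# A square mesh on 16 processors favors a square 4 x 4 processor grid.
grid = best_2d_decomposition(100, 100, 16)
```

An elongated mesh instead favors cutting only along the long direction, which is exactly the surface-to-volume intuition behind communication-minimizing decompositions.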

  19. Phase-coded pulse aperiodic transmitter coding

    NASA Astrophysics Data System (ADS)

    Virtanen, I. I.; Vierinen, J.; Lehtinen, M. S.

    2009-07-01

    Both ionospheric and weather radar communities have already adopted the method of transmitting radar pulses in an aperiodic manner when measuring moderately overspread targets. Among the users of the ionospheric radars, this method is called Aperiodic Transmitter Coding (ATC), whereas the weather radar users have adopted the term Simultaneous Multiple Pulse-Repetition Frequency (SMPRF). When probing the ionosphere at the carrier frequencies of the EISCAT Incoherent Scatter Radar facilities, the range extent of the detectable target is typically of the order of one thousand kilometers - about seven milliseconds - whereas the characteristic correlation time of the scattered signal varies from a few milliseconds in the D-region to only tens of microseconds in the F-region. If one is interested in estimating the scattering autocorrelation function (ACF) at time lags shorter than the F-region correlation time, the D-region must be considered as a moderately overspread target, whereas the F-region is a severely overspread one. Given the technical restrictions of the radar hardware, a combination of ATC and phase-coded long pulses is advantageous for this kind of target. We evaluate such an experiment under infinitely low signal-to-noise ratio (SNR) conditions using lag profile inversion. In addition, a qualitative evaluation under high-SNR conditions is performed by analysing simulated data. The results show that an acceptable estimation accuracy and a very good lag resolution in the D-region can be achieved with a pulse length long enough for simultaneous E- and F-region measurements with a reasonable lag extent. The new experiment design is tested with the EISCAT Tromsø VHF (224 MHz) radar. An example of a full D/E/F-region ACF from the test run is shown at the end of the paper.

  20. FAA Smoke Transport Code

    Energy Science and Technology Software Center (ESTSC)

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, an analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  1. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop covered the industrial code (INDSEAL) release, which includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and a multidomain capability with rotordynamics. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and their results compared. Current computational and experimental emphasis includes multiple connected cavity flows, with the goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing, with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean-sheet approach to engine design is advocated, with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  2. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine whether this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive) for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, the findings indicate changes are needed. The following are the conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on students' confidence in their ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full-motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option

  3. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720, AoI 13, as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise, and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical/analytical and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor

  4. Securing mobile code.

    SciTech Connect

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinion regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program, or a data segment on which a program depends, incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development, with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in-depth analysis of a technique called 'white-boxing'.
We put forth some new attacks and improvements

  5. Accumulate Repeat Accumulate Coded Modulation

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes that are combined with high level modulation. Thus at the decoder belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples to reliability of the bits.

  6. Multiple trellis coded modulation

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K. (Inventor); Divsalar, Dariush (Inventor)

    1990-01-01

    A technique for designing trellis codes to minimize bit error performance over a fading channel. The invention provides a design criterion for such codes that is significantly different from the one used for additive white Gaussian noise channels. The method of multiple trellis coded modulation of the present invention comprises the steps of: (a) coding b bits of input data into s intermediate outputs; (b) grouping said s intermediate outputs into k groups of s_i intermediate outputs each, where the sum of all s_i equals s and k is at least 2; (c) mapping each of said k groups of intermediate outputs into one of a plurality of symbols in accordance with a plurality of modulation schemes, one for each group, such that the first group is mapped in accordance with a first modulation scheme and the second group is mapped in accordance with a second modulation scheme; and (d) outputting each of said symbols to provide k output symbols for each b bits of input data.
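Steps (b) through (d) can be sketched as follows; the group sizes, constellations, and Gray labelings below are illustrative choices, not the patent's, and step (a)'s trellis encoder is omitted entirely.

```python
# Toy version of steps (b)-(d): split s = 3 intermediate outputs into
# k = 2 groups and map each group through its own constellation.

QPSK = {(0, 0): 1 + 0j, (0, 1): 0 + 1j, (1, 1): -1 + 0j, (1, 0): 0 - 1j}
BPSK = {(0,): 1 + 0j, (1,): -1 + 0j}

def map_groups(bits, group_sizes=(2, 1), constellations=(QPSK, BPSK)):
    """Step (b): group the intermediate outputs; steps (c)-(d): map each
    group to a symbol in its own modulation scheme and output all k."""
    assert len(bits) == sum(group_sizes)
    symbols, start = [], 0
    for size, constellation in zip(group_sizes, constellations):
        symbols.append(constellation[tuple(bits[start:start + size])])
        start += size
    return symbols

symbols = map_groups([0, 1, 1])   # one QPSK symbol and one BPSK symbol
```

Emitting several symbols per trellis branch is what makes the modulation "multiple", and it is this per-group mapping freedom that the fading-channel design criterion exploits.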

  7. Code of Ethics.

    ERIC Educational Resources Information Center

    American Sociological Association, Washington, DC.

    The American Sociological Association's code of ethics for sociologists is presented. For sociological research and practice, 10 requirements for ethical behavior are identified, including: maintaining objectivity and integrity; fully reporting findings and research methods, without omission of significant data; reporting fully all sources of…

  8. Sharing the Code.

    ERIC Educational Resources Information Center

    Olsen, Florence

    2003-01-01

    Colleges and universities are beginning to consider collaborating on open-source-code projects as a way to meet critical software and computing needs. Points out the attractive features of noncommercial open-source software and describes some examples in use now, especially for the creation of Web infrastructure. (SLD)

  9. Electrical Circuit Simulation Code

    Energy Science and Technology Software Center (ESTSC)

    2001-08-09

    Massively-Parallel Electrical Circuit Simulation Code. CHILESPICE is a massively-parallel, distributed-memory electrical circuit simulation tool that contains many enhanced radiation, time-based, and thermal features and models. Large-scale electronic circuit simulation. Shared memory, parallel processing, enhanced convergence. Sandia-specific device models.

  10. The Redox Code

    PubMed Central

    Jones, Dean P.

    2015-01-01

    Abstract Significance: The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O2 and H2O2 contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Recent Advances: Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Critical Issues: Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Future Directions: Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine. Antioxid. Redox Signal. 23, 734–746. PMID:25891126

  11. Environmental Fluid Dynamics Code

    EPA Science Inventory

    The Environmental Fluid Dynamics Code (EFDC)is a state-of-the-art hydrodynamic model that can be used to simulate aquatic systems in one, two, and three dimensions. It has evolved over the past two decades to become one of the most widely used and technically defensible hydrodyn...

  12. Heuristic dynamic complexity coding

    NASA Astrophysics Data System (ADS)

    Škorupa, Jozef; Slowack, Jürgen; Mys, Stefaan; Lambert, Peter; Van de Walle, Rik

    2008-04-01

    Distributed video coding is a new video coding paradigm that shifts the computationally intensive motion estimation from the encoder to the decoder. This results in a lightweight encoder and a complex decoder, as opposed to the predictive video coding scheme (e.g., MPEG-x and H.26x) with a complex encoder and a lightweight decoder. Neither scheme, however, has the ability to adapt to varying complexity constraints imposed by the encoder and decoder, an essential ability for applications targeting a wide range of devices with different complexity constraints or applications with temporarily variable complexity constraints. Moreover, the effect of complexity adaptation on the overall compression performance is of great importance and has not yet been investigated. To address this need, we have developed a video coding system that can adapt itself to complexity constraints by dynamically sharing the motion estimation computations between both components. On this system we have studied the effect of the complexity distribution on the compression performance. This paper describes how motion estimation can be shared using heuristic dynamic complexity and how the distribution of complexity affects the overall compression performance of the system. The results show that the complexity can indeed be shared between encoder and decoder in an efficient way at acceptable rate-distortion performance.

  13. Code of Ethics.

    ERIC Educational Resources Information Center

    Association of College Unions-International, Bloomington, IN.

    The code of ethics for the college union and student activities professional is presented by the Association of College Unions-International. The preamble identifies the objectives of the college union as providing campus community centers and social programs that enhance the quality of life for members of the academic community. Ethics for…

  14. Dual Coding in Children.

    ERIC Educational Resources Information Center

    Burton, John K.; Wildman, Terry M.

    The purpose of this study was to test the applicability of the dual coding hypothesis to children's recall performance. The hypothesis predicts that visual interference will have a small effect on the recall of visually presented words or pictures, but that acoustic interference will cause a decline in recall of visually presented words and…

  15. The revised genetic code

    NASA Astrophysics Data System (ADS)

    Ninio, Jacques

    1990-03-01

    Recent findings on the genetic code are reviewed, including selenocysteine usage, deviations in the assignments of sense and nonsense codons, RNA editing, natural ribosomal frameshifts and non-orthodox codon-anticodon pairings. A multi-stage codon reading process is presented.

  16. Dress Codes and Uniforms.

    ERIC Educational Resources Information Center

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices that adults agree with in their choice of school dress. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  17. Code Optimization Techniques

    SciTech Connect

    MAGEE,GLEN I.

    2000-08-03

    Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data will be lost. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
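One standard optimization of the kind alluded to above is replacing the bit-serial GF(2^8) multiply in the Reed-Solomon inner loop with exponent/log table lookups. The field polynomial 0x11D below is a common choice for illustration; the abstract does not state which field or polynomial the AURA implementation used.

```python
# Exp/log tables over GF(2^8) with generator 2 and field polynomial 0x11D.
GF_EXP = [0] * 512
GF_LOG = [0] * 256
_x = 1
for _i in range(255):
    GF_EXP[_i] = _x
    GF_LOG[_x] = _i
    _x <<= 1
    if _x & 0x100:
        _x ^= 0x11D
for _i in range(255, 512):                 # doubled table: no modulo needed
    GF_EXP[_i] = GF_EXP[_i - 255]

def gf_mul_slow(a, b):
    """Bit-serial carryless multiply with reduction (the unoptimized path)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x100:
            a ^= 0x11D
        b >>= 1
    return r

def gf_mul_fast(a, b):
    """Two table lookups and one integer add replace the eight-step bit loop."""
    if a == 0 or b == 0:
        return 0
    return GF_EXP[GF_LOG[a] + GF_LOG[b]]
```

Since an RS encoder spends nearly all its time in field multiplies, swapping the bit loop for table lookups is exactly the kind of change that brings encoding time down by an order of magnitude.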

  18. User's manual for the GABAS spectrum computer code. Final report

    SciTech Connect

    Thayer, D.D.; Lurie, N.A.

    1982-01-01

    The Gamma and Beta Spectrum computer code (GABAS) was developed at IRT Corporation for calculating time-dependent beta and/or gamma spectra from decaying fission products. GABAS calculates composite fission product spectra based on the technique used by England, et al., in conjunction with the CINDER family of fission product codes. Multigroup beta and gamma spectra for individual nuclides are folded with their corresponding time-dependent activities (usually generated by a fission product inventory code) to produce a composite time-dependent fission product spectrum. This manual contains the methodology employed by GABAS, input requirements for proper execution, a sample problem and a FORTRAN listing compatible with a UNIVAC machine. The code is available in a UNIVAC 1100/81 version and a VAX 11/780 version. The former may be obtained from the Radiation Shielding Information Center (RSIC); the latter may be obtained directly from IRT Corporation.

  19. Development of the Code RITRACKS

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Cucinotta, Francis A.

    2013-01-01

    A document discusses the code RITRACKS (Relativistic Ion Tracks), which was developed to simulate heavy ion track structure at the microscopic and nanoscopic scales. It is a Monte-Carlo code that simulates the production of radiolytic species in water, event-by-event, and which may be used to simulate tracks and also to calculate dose in targets and voxels of different sizes. The dose deposited by the radiation can be calculated in nanovolumes (voxels). RITRACKS allows simulation of radiation tracks without the need of extensive knowledge of computer programming or Monte-Carlo simulations. It is installed as a regular application on Windows systems. The main input parameters entered by the user are the type and energy of the ion, the length and size of the irradiated volume, the number of ions impacting the volume, and the number of histories. The simulation can be started after the input parameters are entered in the GUI. The number of each kind of interactions for each track is shown in the result details window. The tracks can be visualized in 3D after the simulation is complete. It is also possible to see the time evolution of the tracks and zoom on specific parts of the tracks. The software RITRACKS can be very useful for radiation scientists to investigate various problems in the fields of radiation physics, radiation chemistry, and radiation biology. For example, it can be used to simulate electron ejection experiments (radiation physics).

  20. Binary coding for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Chang, Chein-I.; Chang, Chein-Chi; Lin, Chinsu

    2004-10-01

    Binary coding is one of the simplest ways to characterize spectral features. One commonly used method is a binary coding-based image software system, called the Spectral Analysis Manager (SPAM), developed for remotely sensed imagery by Mazer et al. For a given spectral signature, SPAM calculates its spectral mean and inter-band spectral difference and uses them as thresholds to generate a binary code word for that particular signature. Such a coding scheme is generally effective and also very simple to implement. This paper revisits SPAM and further develops three new SPAM-based binary coding methods, called equal probability partition (EPP) binary coding, halfway partition (HP) binary coding, and median partition (MP) binary coding. These three binary coding methods, along with SPAM, are evaluated for spectral discrimination and identification. In doing so, a new criterion, called the a posteriori discrimination probability (APDP), is also introduced as a performance measure.
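The thresholding idea is easy to make concrete. A toy sketch of a SPAM-style mean-threshold code word and a median-threshold variant (the actual partitioning rules of the EPP/HP/MP methods are not given in the abstract, so the details below are assumptions):

```python
import statistics

def spam_mean_code(spectrum):
    """SPAM-style code word: one bit per band, 1 if the band exceeds the spectral mean."""
    mean = sum(spectrum) / len(spectrum)
    return [int(v > mean) for v in spectrum]

def median_threshold_code(spectrum):
    """Median-based variant: threshold at the band median instead of the mean."""
    med = statistics.median(spectrum)
    return [int(v > med) for v in spectrum]

def hamming(a, b):
    """Code words are compared by Hamming distance for discrimination."""
    return sum(x != y for x, y in zip(a, b))
```

Discrimination between two signatures then reduces to the Hamming distance between their code words, which is what makes binary coding so cheap to evaluate.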

  1. Interface requirements to couple thermal-hydraulic codes to severe accident codes: ATHLET-CD

    SciTech Connect

    Trambauer, K.

    1997-07-01

    The system code ATHLET-CD is being developed by GRS in cooperation with IKE and IPSN. Its field of application comprises the whole spectrum of leaks and large breaks, as well as operational and abnormal transients for LWRs and VVERs. At present the analyses cover the in-vessel thermal-hydraulics, the early phases of core degradation, and the release of fission products and aerosols from the core and their transport in the reactor coolant system. The aim of the code development is to extend the simulation of core degradation up to failure of the reactor pressure vessel and to cover all physically reasonable accident sequences for western and eastern LWRs, including RBMKs. The ATHLET-CD structure is highly modular in order to accommodate a broad spectrum of models and to offer an optimum basis for further development. The code consists of four general modules describing the reactor coolant system thermal-hydraulics, the core degradation, the fission product release from the core, and fission product and aerosol transport. Each general module consists of basic modules corresponding to the process to be simulated or to its specific purpose. Besides this physically based modular structure, the code follows four strictly separated steps during the course of a calculation: (1) input of structure, geometrical data, and initial and boundary conditions; (2) initialization of derived quantities; (3) steady-state calculation or input of restart data; and (4) transient calculation. In this paper, the transient solution method is briefly presented and the coupling methods are discussed. Three aspects have to be considered when coupling different modules in one code system. The first is the conservation of mass and energy in the different subsystems, namely the fluid, the structures, and the fission products and aerosols. The second is the convergence of the numerical solution and the stability of the calculation. The third is the code performance and running time.

  2. 21 CFR 610.67 - Bar code label requirements.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 7 2010-04-01 2010-04-01 false Bar code label requirements. 610.67 Section 610.67 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) BIOLOGICS GENERAL BIOLOGICAL PRODUCTS STANDARDS Labeling Standards § 610.67 Bar code label...

  3. 21 CFR 610.67 - Bar code label requirements.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 7 2012-04-01 2012-04-01 false Bar code label requirements. 610.67 Section 610.67 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) BIOLOGICS GENERAL BIOLOGICAL PRODUCTS STANDARDS Labeling Standards § 610.67 Bar code label...

  4. 21 CFR 610.67 - Bar code label requirements.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 7 2014-04-01 2014-04-01 false Bar code label requirements. 610.67 Section 610.67 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) BIOLOGICS GENERAL BIOLOGICAL PRODUCTS STANDARDS Labeling Standards § 610.67 Bar code label...

  5. 21 CFR 610.67 - Bar code label requirements.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 7 2011-04-01 2010-04-01 true Bar code label requirements. 610.67 Section 610.67 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) BIOLOGICS GENERAL BIOLOGICAL PRODUCTS STANDARDS Labeling Standards § 610.67 Bar code label...

  6. 21 CFR 610.67 - Bar code label requirements.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 7 2013-04-01 2013-04-01 false Bar code label requirements. 610.67 Section 610.67 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) BIOLOGICS GENERAL BIOLOGICAL PRODUCTS STANDARDS Labeling Standards § 610.67 Bar code label...

  7. 27 CFR 20.135 - State code numbers.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2012-04-01 2012-04-01 false State code numbers. 20.135... Articles § 20.135 State code numbers. In showing the permit number on labels as provided in § 20.134(b)(2)(ii), the permittee who distributes the article may substitute the appropriate number shown below...

  8. Terrain-Responsive Atmospheric Code

    Energy Science and Technology Software Center (ESTSC)

    1991-11-20

    The Terrain-Responsive Atmospheric Code (TRAC) is a real-time emergency response modeling capability designed to advise Emergency Managers of the path, timing, and projected impacts from an atmospheric release. TRAC evaluates the effects of both radiological and non-radiological hazardous substances, gases and particulates. Using available surface and upper air meteorological information, TRAC realistically treats complex sources and atmospheric conditions, such as those found in mountainous terrain. TRAC calculates atmospheric concentration, deposition, and dose for more than 25,000 receptor locations within 80 km of the release point. Human-engineered output products support critical decisions on the type, location, and timing of protective actions for workers and the public during an emergency.

  9. Sinusoidal transform coding

    NASA Technical Reports Server (NTRS)

    Mcaulay, Robert J.; Quatieri, Thomas F.

    1988-01-01

    It has been shown that an analysis/synthesis system based on a sinusoidal representation of speech leads to synthetic speech that is essentially perceptually indistinguishable from the original. Strategies for coding the amplitudes, frequencies and phases of the sine waves have been developed that have led to a multirate coder operating at rates from 2400 to 9600 bps. The encoded speech is highly intelligible at all rates with a uniformly improving quality as the data rate is increased. A real-time fixed-point implementation has been developed using two ADSP2100 DSP chips. The methods used for coding and quantizing the sine-wave parameters for operation at the various frame rates are described.
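The synthesis half of such an analysis/synthesis system is compact to state: the decoded amplitudes, frequencies, and phases are simply summed as sine waves. A minimal sketch (the parameter layout is illustrative, not the coder's actual bitstream format):

```python
import math

def synthesize(sines, n_samples, sample_rate):
    """Sum of sinusoids: sines is a list of (amplitude, freq_hz, phase) triples.
    Returns n_samples of the reconstructed waveform at the given sample rate."""
    return [sum(a * math.sin(2 * math.pi * f * n / sample_rate + ph)
                for a, f, ph in sines)
            for n in range(n_samples)]
```

A real coder additionally tracks sine-wave births and deaths across frames and interpolates the parameters, which this one-frame sketch omits.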

  10. Finite Element Analysis Code

    Energy Science and Technology Software Center (ESTSC)

    2006-03-08

    MAPVAR-KD is designed to transfer solution results from one finite element mesh to another. MAPVAR-KD draws heavily from the structure and coding of MERLIN II, but it employs a new finite element data base, EXODUS II, and offers enhanced speed and new capabilities not available in MERLIN II. In keeping with the MERLIN II documentation, the computational algorithms used in MAPVAR-KD are described. User instructions are presented. Example problems are included to demonstrate the operation of the code and the effects of various input options. MAPVAR-KD is a modification of MAPVAR in which the search algorithm was replaced by a kd-tree-based search for better performance on large problems.

  11. Confocal coded aperture imaging

    DOEpatents

    Tobin, Jr., Kenneth William; Thomas, Jr., Clarence E.

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.

  12. CTI Correction Code

    NASA Astrophysics Data System (ADS)

    Massey, Richard; Stoughton, Chris; Leauthaud, Alexie; Rhodes, Jason; Koekemoer, Anton; Ellis, Richard; Shaghoulian, Edgar

    2013-07-01

    Charge Transfer Inefficiency (CTI) due to radiation damage above the Earth's atmosphere creates spurious trailing in images from Charge-Coupled Device (CCD) imaging detectors. Radiation damage also creates unrelated warm pixels, which can be used to measure CTI. This code provides pixel-based correction for CTI and has proven effective in Hubble Space Telescope Advanced Camera for Surveys raw images, successfully reducing the CTI trails by a factor of ~30 everywhere in the CCD and at all flux levels. The core is written in java for speed, and a front-end user interface is provided in IDL. The code operates on raw data by returning individual electrons to pixels from which they were unintentionally dragged during readout. Correction takes about 25 minutes per ACS exposure, but is trivially parallelisable to multiple processors.

  13. Status of MARS Code

    SciTech Connect

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, histogramming, the MAD-MARS Beam Line Builder, and the graphical user interface.

  14. VAC: Versatile Advection Code

    NASA Astrophysics Data System (ADS)

    Tóth, Gábor; Keppens, Rony

    2012-07-01

    The Versatile Advection Code (VAC) is a freely available general hydrodynamic and magnetohydrodynamic simulation software that works in 1, 2 or 3 dimensions on Cartesian and logically Cartesian grids. VAC runs on any Unix/Linux system with a Fortran 90 (or 77) compiler and Perl interpreter. VAC can run on parallel machines using either the Message Passing Interface (MPI) library or a High Performance Fortran (HPF) compiler.
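The simplest member of the family of schemes such codes solve illustrates what one advection update does. The sketch below is first-order upwind on a periodic 1D grid, far cruder than VAC's actual higher-order methods, and purely illustrative:

```python
def upwind_step(u, c):
    """One explicit first-order upwind step for u_t + a*u_x = 0 with a > 0.
    c = a*dt/dx is the Courant number (stable for 0 <= c <= 1); the grid is
    periodic, which Python's negative indexing handles for free."""
    return [u[i] - c * (u[i] - u[i - 1]) for i in range(len(u))]
```

At c = 1 the stencil reduces to an exact one-cell shift, and for any c the periodic scheme conserves the total of u, two quick sanity checks on any advection implementation.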

  15. Reeds computer code

    NASA Technical Reports Server (NTRS)

    Bjork, C.

    1981-01-01

    The REEDS (rocket exhaust effluent diffusion single layer) computer code is used for the estimation of certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near real time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.
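REEDS's own single-layer formulation is not reproduced in the abstract; as a stand-in, the classic Gaussian plume expression shows the kind of concentration estimate such dispersion models produce (a textbook sketch under assumed symbols, not the REEDS model):

```python
import math

def gaussian_plume(Q, u, y, z, sigma_y, sigma_z, H=0.0):
    """Steady-state Gaussian plume concentration with ground reflection.
    Q: source rate (g/s), u: wind speed (m/s), y: crosswind offset (m),
    z: receptor height (m), sigma_y/sigma_z: dispersion widths (m) at the
    downwind distance of interest, H: effective release height (m)."""
    lateral = math.exp(-y * y / (2.0 * sigma_y ** 2))
    vertical = (math.exp(-((z - H) ** 2) / (2.0 * sigma_z ** 2)) +
                math.exp(-((z + H) ** 2) / (2.0 * sigma_z ** 2)))
    return Q * lateral * vertical / (2.0 * math.pi * u * sigma_y * sigma_z)
```

On the centerline of a ground-level release the two reflection terms coincide, so the expression collapses to Q/(pi * u * sigma_y * sigma_z), a handy spot check.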

  16. Evaluation of the efficiency and fault density of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1993-01-01

    Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available that generate code automatically from inputs supplied through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking. Some check only the finished product, while others allow checking of individual modules and combined sets of modules as well. Considering NASA's requirement for reliability, an in-house, manually generated code is needed. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed. In-house verification is warranted.

  17. Bar coded retroreflective target

    DOEpatents

    Vann, Charles S.

    2000-01-01

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and up to six degrees of position. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam is measured to calculate the three dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of target position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  18. Suboptimum decoding of block codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao

    1991-01-01

    This paper investigates a class of decomposable codes and their distance and structural properties. It is shown that this class includes several classes of well-known and efficient codes as subclasses. Several methods for constructing decomposable codes or decomposing codes are presented. A two-stage soft-decision decoding scheme for decomposable codes, their translates, or unions of translates is devised. This two-stage soft-decision decoding is suboptimum and provides an excellent trade-off between error performance and decoding complexity for codes of moderate and long block length.

  19. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

    This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. The codes considered are APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code is described separately in the following section, along with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes; however, the codes have since been further developed to extend their capabilities.

  20. Structural coding versus free-energy predictive coding.

    PubMed

    van der Helm, Peter A

    2016-06-01

    Focusing on visual perceptual organization, this article contrasts the free-energy (FE) version of predictive coding (a recent Bayesian approach) to structural coding (a long-standing representational approach). Both use free-energy minimization as metaphor for processing in the brain, but their formal elaborations of this metaphor are fundamentally different. FE predictive coding formalizes it by minimization of prediction errors, whereas structural coding formalizes it by minimization of the descriptive complexity of predictions. Here, both sides are evaluated. A conclusion regarding competence is that FE predictive coding uses a powerful modeling technique, but that structural coding has more explanatory power. A conclusion regarding performance is that FE predictive coding-though more detailed in its account of neurophysiological data-provides a less compelling cognitive architecture than that of structural coding, which, for instance, supplies formal support for the computationally powerful role it attributes to neuronal synchronization. PMID:26407895

  1. User Instructions for the CiderF Individual Dose Code and Associated Utility Codes

    SciTech Connect

    Eslinger, Paul W.; Napier, Bruce A.

    2013-08-30

    Historical activities at facilities producing nuclear materials for weapons released radioactivity into the air and water. Past studies in the United States have evaluated the release, atmospheric transport and environmental accumulation of 131I from the nuclear facilities at Hanford in Washington State and the resulting dose to members of the public (Farris et al. 1994). A multi-year dose reconstruction effort (Mokrov et al. 2004) is also being conducted to produce representative dose estimates for members of the public living near Mayak, Russia, from atmospheric releases of 131I at the facilities of the Mayak Production Association. The approach to calculating individual doses to members of the public from historical releases of airborne 131I has the following general steps: • Construct estimates of releases of 131I to the air from production facilities. • Model the transport of 131I in the air and subsequent deposition on the ground and vegetation. • Model the accumulation of 131I in soil, water and food products (environmental media). • Calculate the dose for an individual by matching the appropriate lifestyle and consumption data for the individual to the concentrations of 131I in environmental media at their residence location. A number of computer codes were developed to facilitate the study of airborne 131I emissions at Hanford. The RATCHET code modeled movement of 131I in the atmosphere (Ramsdell Jr. et al. 1994). The DECARTES code modeled accumulation of 131I in environmental media (Miley et al. 1994). The CIDER computer code estimated annual doses to individuals (Eslinger et al. 1994) using the equations and parameters specific to Hanford (Snyder et al. 1994). Several of the computer codes developed to model 131I releases from Hanford are general enough to be used for other facilities. This document provides user instructions for computer codes calculating doses to members of the public from atmospheric 131I that have two major differences from the
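The four general steps can be caricatured as a toy transfer chain from air concentration to ingestion dose. Every coefficient below is hypothetical and chosen only to show how the quantities propagate; none are Hanford or Mayak parameter values:

```python
def toy_ingestion_dose(air_conc, dep_velocity, transfer_factor,
                       intake_rate, dose_coeff, days):
    """Toy air -> ground -> food -> dose chain (all values illustrative).
    air_conc: Bq/m^3, dep_velocity: m/s, transfer_factor: (Bq/kg)/(Bq/m^2),
    intake_rate: kg/day of the food product, dose_coeff: Sv/Bq ingested."""
    ground = air_conc * dep_velocity * 86400.0 * days   # Bq/m^2 deposited
    food = ground * transfer_factor                     # Bq/kg in food
    intake = food * intake_rate * days                  # Bq ingested
    return intake * dose_coeff                          # Sv
```

The real codes replace each constant with time-dependent, location-dependent models (and add radioactive decay, weathering, and lifestyle matching), but the multiplicative structure is the same.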

  2. Convolutional coding techniques for data protection

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.

  3. Energy Code Enforcement Training Manual : Covering the Washington State Energy Code and the Ventilation and Indoor Air Quality Code.

    SciTech Connect

    Washington State Energy Code Program

    1992-05-01

    This manual is designed to provide building department personnel with specific inspection and plan review skills and information on provisions of the 1991 edition of the Washington State Energy Code (WSEC). It also provides information on provisions of the new stand-alone Ventilation and Indoor Air Quality (VIAQ) Code. The intent of the WSEC is to reduce the amount of energy used by requiring energy-efficient construction. Such conservation reduces energy requirements, and, as a result, reduces the use of finite resources, such as gas or oil. Lowering energy demand helps everyone by keeping electricity costs down. (It is less expensive to use existing electrical capacity efficiently than it is to develop new and additional capacity needed to heat or cool inefficient buildings.) The new VIAQ Code (effective July 1991) is a natural companion to the energy code. Whether energy-efficient or not, all homes have potential indoor air quality problems. Studies have shown that indoor air is often more polluted than outdoor air. The VIAQ Code provides a means of exchanging stale air for fresh, without compromising energy savings, by setting standards for a controlled ventilation system. It also offers requirements meant to prevent indoor air pollution from building products or radon.

  4. Cracking the code: the genesis, use and future of the Code of Conduct.

    PubMed

    Walker, Peter

    2005-12-01

    This paper reflects on the genesis of the Code of Conduct for the International Red Cross and Red Crescent Movement and Non-Governmental Organizations (NGOs) in Disaster Relief, on the tenth anniversary of its adoption. The origins, usage and future of the code are examined with respect to three debates, current at the time of its inception, namely: the debate about the core content of humanitarianism; the debate about coherence and the consensual nature of the humanitarian community; and the debate about the need for, and the ability to demonstrate, accountability. The paper concludes that although the Code of Conduct was very much a product of its time, its content remains relevant today. However, its future application hinges on the capacity of those who purport to follow it to realise true accountability, and on proving that the code, written essentially for natural disasters, is relevant to contemporary complex emergencies. PMID:16277643

  5. Development of Tritium Permeation Analysis Code (TPAC)

    SciTech Connect

    Eung S. Kim; Chang H. Oh; Mike Patterson

    2010-10-01

    Idaho National Laboratory developed the Tritium Permeation Analysis Code (TPAC) for tritium permeation in the Very High Temperature Gas Cooled Reactor (VHTR). All the component models in the VHTR were developed and embedded into the MATLAB Simulink package with a graphical user interface. The governing equations of the nuclear ternary reaction and of thermal neutron capture reactions from impurities in helium and in the graphite core, reflector, and control rods were implemented. The TPAC code was verified using analytical solutions for the tritium birth rate from the ternary fission, the birth rate from 3He, and the birth rate from 10B. This paper also provides comparisons of TPAC with other existing codes. A VHTR reference design was selected for a tritium permeation study from the reference design to the nuclear-assisted hydrogen production plant, and some sensitivity study results are presented based on the HTGR outlet temperature of 750 degrees C.
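Verification against analytical solutions of the kind described typically rests on the standard birth-and-decay balance dN/dt = R − λN, whose closed form N(t) = (R/λ)(1 − e^(−λt)) can be checked against a numerical integration. A sketch with a hypothetical birth rate (not a TPAC value):

```python
import math

TRITIUM_HALF_LIFE_S = 12.32 * 3.1557e7           # ~12.32 years, in seconds
LAM = math.log(2.0) / TRITIUM_HALF_LIFE_S        # tritium decay constant (1/s)

def inventory_analytic(R, t, lam=LAM):
    """Closed-form N(t) for dN/dt = R - lam*N with N(0) = 0."""
    return (R / lam) * (1.0 - math.exp(-lam * t))

def inventory_euler(R, t, lam=LAM, steps=200000):
    """Forward-Euler integration of the same balance, for cross-checking."""
    N, dt = 0.0, t / steps
    for _ in range(steps):
        N += (R - lam * N) * dt
    return N
```

The inventory saturates at R/λ, which gives a second easy consistency check on either implementation.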

  6. Validation of comprehensive space radiation transport code

    SciTech Connect

    Shinn, J.L.; Simonsen, L.C.; Cucinotta, F.A.

    1998-12-01

    The HZETRN code has been developed over the past decade to evaluate the local radiation fields within sensitive materials on spacecraft in the space environment. Most of the more important nuclear and atomic processes are now modeled, and evaluation within a complex spacecraft geometry with differing material components, including transition effects across boundaries of dissimilar materials, is included. The atomic/nuclear database and transport procedures have received limited validation in laboratory testing with high energy ion beams. The codes have been applied in design of the SAGE-III instrument, resulting in material changes to control injurious neutron production, in the study of Space Shuttle single event upsets, and in validation with space measurements (particle telescopes, tissue equivalent proportional counters, CR-39) on Shuttle and Mir. The present paper reviews the code development and presents recent results in laboratory and space flight validation.

  7. Combinatorial neural codes from a mathematical coding theory perspective.

    PubMed

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli. PMID:23724797
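The error-correction measure in question is nearest-codeword decoding under Hamming distance. A toy sketch with a made-up codebook (not the paper's receptive field codes), showing the two quantities such an analysis compares:

```python
import itertools

def hamming(a, b):
    """Hamming distance between two equal-length binary tuples."""
    return sum(x != y for x, y in zip(a, b))

def nearest_codeword(word, codebook):
    """Decode a noisy binary word to the closest codeword (min Hamming distance)."""
    return min(codebook, key=lambda c: hamming(word, c))

def min_distance(codebook):
    """Minimum pairwise distance; the code corrects floor((d-1)/2) errors."""
    return min(hamming(a, b) for a, b in itertools.combinations(codebook, 2))
```

Redundancy alone does not guarantee a large minimum distance, which is the gap between receptive field codes and random comparison codes that the paper examines.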

  8. 21 CFR 206.10 - Code imprint required.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... identification than a symbol or logo by itself. Homeopathic drug products are required only to bear an imprint... National Drug Code, or a mark, symbol, logo, or monogram, or a combination of letters, numbers, and...

  9. 21 CFR 206.10 - Code imprint required.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... identification than a symbol or logo by itself. Homeopathic drug products are required only to bear an imprint... National Drug Code, or a mark, symbol, logo, or monogram, or a combination of letters, numbers, and...

  10. On lossless coding for HEVC

    NASA Astrophysics Data System (ADS)

    Gao, Wen; Jiang, Minqiang; Yu, Haoping

    2013-02-01

    In this paper, we first review the lossless coding mode in version 1 of the HEVC standard, which has recently been finalized. We then provide a performance comparison between the lossless coding modes in the HEVC and MPEG-AVC/H.264 standards and show that HEVC lossless coding has limited coding efficiency. To improve the performance of the lossless coding mode, several new coding tools that were contributed to JCT-VC but not adopted in version 1 of the HEVC standard are introduced. In particular, we discuss sample-based intra prediction and coding of residual coefficients in more detail. At the end, we briefly address a new class of coding tools, i.e., a dictionary-based coder, that is efficient in encoding screen content including graphics and text.
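Sample-based intra prediction for lossless coding can be sketched as horizontal DPCM: each sample is predicted from its already-reconstructed left neighbor and only the exact residual is coded. The sketch below illustrates the idea only; it is not the JCT-VC proposal's actual predictor set:

```python
def dpcm_encode(row, first_pred=128):
    """Residuals of horizontal sample-based prediction (lossless).
    first_pred is an assumed default predictor for the first sample (8-bit mid-grey)."""
    residuals, pred = [], first_pred
    for s in row:
        residuals.append(s - pred)
        pred = s            # decoder reconstructs the same neighbor exactly
    return residuals

def dpcm_decode(residuals, first_pred=128):
    """Exact inverse of dpcm_encode: accumulate residuals onto the predictor."""
    samples, pred = [], first_pred
    for r in residuals:
        pred = pred + r
        samples.append(pred)
    return samples
```

Because prediction and reconstruction use identical neighbors, the round trip is bit-exact, which is the defining requirement of a lossless mode; the coding gain comes from the residuals being small for smooth content.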

  11. Automated searching for quantum subsystem codes

    SciTech Connect

    Crosswhite, Gregory M.; Bacon, Dave

    2011-02-15

    Quantum error correction allows for faulty quantum systems to behave in an effectively error-free manner. One important class of techniques for quantum error correction is the class of quantum subsystem codes, which are relevant both to active quantum error-correcting schemes as well as to the design of self-correcting quantum memories. Previous approaches for investigating these codes have focused on applying theoretical analysis to look for interesting codes and to investigate their properties. In this paper we present an alternative approach that uses computational analysis to accomplish the same goals. Specifically, we present an algorithm that computes the optimal quantum subsystem code that can be implemented given an arbitrary set of measurement operators that are tensor products of Pauli operators. We then demonstrate the utility of this algorithm by performing a systematic investigation of the quantum subsystem codes that exist in the setting where the interactions are limited to two-body interactions between neighbors on lattices derived from the convex uniform tilings of the plane.
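A search like this needs fast commutation checks between tensor products of Pauli operators. The standard device is the binary symplectic representation, sketched below; the paper's actual data structures and search algorithm are not given in the abstract:

```python
def pauli_to_bits(pauli):
    """Map a Pauli string like 'XZIY' to (x, z) bit tuples:
    I=(0,0), X=(1,0), Y=(1,1), Z=(0,1) per qubit."""
    table = {'I': (0, 0), 'X': (1, 0), 'Y': (1, 1), 'Z': (0, 1)}
    xs, zs = zip(*(table[p] for p in pauli))
    return xs, zs

def commute(p, q):
    """Two Pauli products commute iff their symplectic inner product
    sum_i (x1_i*z2_i + z1_i*x2_i) is 0 mod 2."""
    (x1, z1), (x2, z2) = pauli_to_bits(p), pauli_to_bits(q)
    return sum(a * d + b * c for a, b, c, d in zip(x1, z1, x2, z2)) % 2 == 0
```

With Paulis reduced to bit vectors, commutation becomes a parity computation, which is what makes an automated search over measurement-operator sets tractable.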

  12. Bar code usage in nuclear materials accountability

    SciTech Connect

    Mee, W.T.

    1983-07-01

    The age-old method of physically taking an inventory of materials by listing each item's identification number has outlived its usefulness. In this age of computerization, which offers the local grocery store a quick, sure, and easy means of inventory, it is time for nuclear materials facilities to automate accountability activities. The Oak Ridge Y-12 Plant began investigating the use of automated data collection devices in 1979. At that time, bar code and optical-character-recognition (OCR) systems were reviewed with the purpose of directly entering data into DYMCAS (Dynamic Special Nuclear Materials Control and Accountability System). Both of these systems appeared applicable; however, other automated devices already employed for production control made implementing bar code and OCR seem improbable. Nevertheless, when the DYMCAS was placed on line for nuclear material accountability, a decision was made to consider bar codes for physical inventory listings. For the past several months a development program has been underway to use a bar code device to collect and input data to the DYMCAS on the uranium recovery operations. Programs have been completed and tested, and are being employed to ensure that data will be compatible and useful. Bar code implementation and expansion of its use for all nuclear material inventory activity in Y-12 is presented.

  13. Bar code usage in nuclear materials accountability

    SciTech Connect

    Mee, W.T.

    1983-01-01

    The Oak Ridge Y-12 Plant began investigating the use of automated data collection devices in 1979. At that time, bar code and optical-character-recognition (OCR) systems were reviewed with the purpose of entering data directly into DYMCAS (Dynamic Special Nuclear Materials Control and Accountability System). Both of these systems appeared applicable; however, other automated devices already employed for production control made implementing bar code and OCR seem improbable. Once the DYMCAS was placed online for nuclear material accountability, though, a decision was made to consider bar codes for physical inventory listings. For the past several months a development program has been underway to use a bar code device to collect and input data to the DYMCAS on the uranium recovery operations. Programs have been completed and tested, and are being employed to ensure that data will be compatible and useful. Bar code implementation, and the expansion of its use for all nuclear material inventory activity in Y-12, is presented.

  14. Code manual for CONTAIN 2.0: A computer code for nuclear reactor containment analysis

    SciTech Connect

    Murata, K.K.; Williams, D.C.; Griffith, R.O.; Gido, R.G.; Tadios, E.L.; Davis, F.J.; Martinez, G.M.; Washington, K.E.; Tills, J.

    1997-12-01

    The CONTAIN 2.0 computer code is an integrated analysis tool used for predicting the physical conditions, chemical compositions, and distributions of radiological materials inside a containment building following the release of material from the primary system in a light-water reactor accident. It can also predict the source term to the environment. CONTAIN 2.0 is intended to replace the earlier CONTAIN 1.12, which was released in 1991. The purpose of this Code Manual is to provide full documentation of the features and models in CONTAIN 2.0. Besides complete descriptions of the models, this Code Manual provides a complete description of the input and output from the code. CONTAIN 2.0 is a highly flexible and modular code that can run problems that are either quite simple or highly complex. An important aspect of CONTAIN is that the interactions among thermal-hydraulic phenomena, aerosol behavior, and fission product behavior are taken into account. The code includes atmospheric models for steam/air thermodynamics, intercell flows, condensation/evaporation on structures and aerosols, aerosol behavior, and gas combustion. It also includes models for reactor cavity phenomena such as core-concrete interactions and coolant pool boiling. Heat conduction in structures, fission product decay and transport, radioactive decay heating, and the thermal-hydraulic and fission product decontamination effects of engineered safety features are also modeled. To the extent possible, the best available models for severe accident phenomena have been incorporated into CONTAIN, but it is intrinsic to the nature of accident analysis that significant uncertainty exists regarding numerous phenomena. In those cases, sensitivity studies can be performed with CONTAIN by means of user-specified input parameters. Thus, the code can be viewed as a tool designed to assist the knowledgeable reactor safety analyst in evaluating the consequences of specific modeling assumptions.

  15. CESAR: A Code for Nuclear Fuel and Waste Characterisation

    SciTech Connect

    Vidal, J.M.; Grouiller, J.P.; Launay, A.; Berthion, Y.; Marc, A.; Toubon, H.

    2006-07-01

    CESAR (Simplified Evolution Code Applied to Reprocessing) is a depletion code developed through a joint program between CEA and COGEMA. In the late 1980s, the first use of this code dealt with nuclear measurement at the laboratories of the La Hague reprocessing plant. The use of CESAR was then extended to the characterisation of all entrance materials and, via tracers, of all produced waste. The code can distinguish more than 100 heavy nuclides, 200 fission products and 100 activation products, and it can characterise both the fuel and the structural material of the fuel. CESAR can also make depletion calculations from 3 months to 1 million years of cooling time. Between 2003 and 2005, the fifth version of the code was developed. The modifications were related to the harmonisation of the code's nuclear data with the JEF2.2 nuclear data file. This paper describes the code and explains the extensive use of this code at the La Hague reprocessing plant and also for prospective studies. The second part focuses on the modifications of the latest version, and describes the application field and the qualification of the code. Many companies and the IAEA use CESAR today. CESAR offers a Graphical User Interface, which is very user-friendly. (authors)

  16. Results from the Veterans Health Administration ICD-10-CM/PCS Coding Pilot Study.

    PubMed

    Weems, Shelley; Heller, Pamela; Fenton, Susan H

    2015-01-01

    The Veterans Health Administration (VHA) of the US Department of Veterans Affairs has been preparing for the October 1, 2015, conversion to the International Classification of Diseases, Tenth Revision, Clinical Modification and Procedural Coding System (ICD-10-CM/PCS) for more than four years. The VHA's Office of Informatics and Analytics ICD-10 Program Management Office established an ICD-10 Learning Lab to explore expected operational challenges. This study was conducted to determine the effects of the classification system conversion on coding productivity. ICD codes are integral to VHA business processes and are used for purposes such as clinical studies, performance measurement, workload capture, cost determination, Veterans Equitable Resource Allocation (VERA) determination, morbidity and mortality classification, indexing of hospital records by disease and operations, data storage and retrieval, research purposes, and reimbursement. The data collection for this study occurred in multiple VHA sites across several months using standardized methods. It is commonly accepted that coding productivity will decrease with the implementation of ICD-10-CM/PCS. The findings of this study suggest that the decrease will be more significant for inpatient coding productivity (64.5 percent productivity decrease) than for ambulatory care coding productivity (6.7 percent productivity decrease). This study reveals the following important points regarding ICD-10-CM/PCS coding productivity: 1. Ambulatory care ICD-10-CM coding productivity is not expected to decrease as significantly as inpatient ICD-10-CM/PCS coding productivity. 2. Coder training and type of record (inpatient versus outpatient) affect coding productivity. 3. Inpatient coding productivity is decreased when a procedure requiring ICD-10-PCS coding is present. It is highly recommended that organizations perform their own analyses to determine the effects of ICD-10-CM/PCS implementation on coding productivity. 

  17. Results from the Veterans Health Administration ICD-10-CM/PCS Coding Pilot Study

    PubMed Central

    Weems, Shelley; Heller, Pamela; Fenton, Susan H.

    2015-01-01

    The Veterans Health Administration (VHA) of the US Department of Veterans Affairs has been preparing for the October 1, 2015, conversion to the International Classification of Diseases, Tenth Revision, Clinical Modification and Procedural Coding System (ICD-10-CM/PCS) for more than four years. The VHA's Office of Informatics and Analytics ICD-10 Program Management Office established an ICD-10 Learning Lab to explore expected operational challenges. This study was conducted to determine the effects of the classification system conversion on coding productivity. ICD codes are integral to VHA business processes and are used for purposes such as clinical studies, performance measurement, workload capture, cost determination, Veterans Equitable Resource Allocation (VERA) determination, morbidity and mortality classification, indexing of hospital records by disease and operations, data storage and retrieval, research purposes, and reimbursement. The data collection for this study occurred in multiple VHA sites across several months using standardized methods. It is commonly accepted that coding productivity will decrease with the implementation of ICD-10-CM/PCS. The findings of this study suggest that the decrease will be more significant for inpatient coding productivity (64.5 percent productivity decrease) than for ambulatory care coding productivity (6.7 percent productivity decrease). This study reveals the following important points regarding ICD-10-CM/PCS coding productivity: Ambulatory care ICD-10-CM coding productivity is not expected to decrease as significantly as inpatient ICD-10-CM/PCS coding productivity. Coder training and type of record (inpatient versus outpatient) affect coding productivity. Inpatient coding productivity is decreased when a procedure requiring ICD-10-PCS coding is present. It is highly recommended that organizations perform their own analyses to determine the effects of ICD-10-CM/PCS implementation on coding productivity. PMID:26396553

  18. Recent improvements to the VDART3 VAWT code

    SciTech Connect

    Berg, D.E.

    1983-01-01

    VDART3, the three-dimensional vortex-filament based computer model of the curved-blade type Vertical Axis Wind Turbine (VAWT), models the turbine aerodynamic phenomena in far more detail than virtually any other VAWT code in use today. Consequently, it yields far more accurate calculations of turbine blade loading, turbine torque production and turbine near wake structure than the other VAWT codes. Recent work on the code includes the addition of a dynamic stall model, consideration of dynamic pitching and blade mounting location effects, and the development of several techniques for reducing execution time without sacrificing code accuracy. This work has resulted in far better agreement of code results with experimental data at low tip-speed ratios and in execution times that have been reduced by an order of magnitude with virtually no loss in code accuracy.

  19. Finite-connectivity systems as error-correcting codes

    NASA Astrophysics Data System (ADS)

    Vicente, Renato; Saad, David; Kabashima, Yoshiyuki

    1999-11-01

    We investigate the performance of parity check codes using the mapping onto Ising spin systems proposed by Sourlas [Nature (London) 339, 693 (1989); Europhys. Lett. 25, 159 (1994)]. We study codes where each parity check comprises products of K bits selected from the original digital message with exactly C checks per message bit. We show, using the replica method, that these codes saturate Shannon's coding bound for K-->∞ when the code rate K/C is finite. We then examine the finite temperature case to assess the use of simulated annealing methods for decoding, study the performance of the finite K case, and extend the analysis to accommodate different types of noisy channels. The connection between statistical physics and belief propagation decoders is discussed and the dynamics of the decoding itself is analyzed. Further insight into new approaches for improving the code performance is given.
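    The code family described above can be illustrated with a Gallager-style construction (an assumption for concreteness; the paper's ensemble may differ in detail): every parity check is the mod-2 sum of K message bits, and every bit enters exactly C checks.

```python
import random

# Sketch (assumption: Gallager-style construction, not necessarily the
# paper's exact ensemble): build a binary parity-check matrix in which
# every check involves exactly K bits and every bit appears in exactly
# C checks, by stacking C column permutations of a base block.
def regular_check_matrix(n, K, C, seed=0):
    assert n % K == 0
    rng = random.Random(seed)
    base = []                       # n/K rows, each with K consecutive ones
    for i in range(n // K):
        row = [0] * n
        for j in range(i * K, (i + 1) * K):
            row[j] = 1
        base.append(row)
    H = []
    for _ in range(C):              # each permuted block has column weight 1
        perm = list(range(n))
        rng.shuffle(perm)
        H.extend([[row[perm[j]] for j in range(n)] for row in base])
    return H

H = regular_check_matrix(12, K=4, C=3)
```

    Each of the C stacked blocks touches every bit exactly once, so the column weight is C by construction.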

  20. Noiseless Coding Of Magnetometer Signals

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Lee, Jun-Ji

    1989-01-01

    Report discusses application of noiseless data-compression coding to digitized readings of spaceborne magnetometers for transmission back to Earth. Objective of such coding is to increase efficiency by decreasing rate of transmission without sacrificing integrity of data. Adaptive coding compresses data by factors ranging from 2 to 6.
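    The report's adaptive coder is not reproduced here, but the flavor of Rice-style noiseless coding can be sketched with a fixed-parameter Golomb-Rice code applied to sample differences (a simplified stand-in; the parameter k = 1 and the sample values below are illustrative):

```python
# Sketch: fixed-parameter Golomb-Rice coding of sample differences, a
# simplified stand-in for the adaptive algorithm the report describes.
def rice_encode(value, k):
    """Codeword for a non-negative integer: unary quotient + k remainder bits."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, "0{}b".format(k))

def zigzag(d):
    """Fold signed deltas into non-negative integers: 0,-1,1,-2,2 -> 0,1,2,3,4."""
    return 2 * d if d >= 0 else -2 * d - 1

samples = [100, 101, 99, 99, 102]          # slowly varying readings
deltas = [b - a for a, b in zip(samples, samples[1:])]
bits = "".join(rice_encode(zigzag(d), 1) for d in deltas)
```

    Because consecutive magnetometer readings vary slowly, the deltas are small and the unary quotients stay short, which is where the compression comes from.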

  1. Energy Codes and Standards: Facilities

    SciTech Connect

    Bartlett, Rosemarie; Halverson, Mark A.; Shankle, Diana L.

    2007-01-01

    Energy codes and standards play a vital role in the marketplace by setting minimum requirements for energy-efficient design and construction. They outline uniform requirements for new buildings as well as additions and renovations. This article covers basic knowledge of codes and standards; development processes of each; adoption, implementation, and enforcement of energy codes and standards; and voluntary energy efficiency programs.

  2. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  3. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…

  4. Ethical Codes in the Professions.

    ERIC Educational Resources Information Center

    Schmeiser, Cynthia B.

    1992-01-01

    Whether the measurement profession should consider developing and adopting a code of professional conduct is explored after a brief review of existing references to standards of conduct and a review of other professional codes. Issues include the need for a code of ethics, its usefulness, and its enforcement. (SLD)

  5. Finite Element Analysis Code

    Energy Science and Technology Software Center (ESTSC)

    2005-05-07

    CONEX is a code for joining sequentially in time multiple exodusII database files which all represent the same base mesh topology and geometry. It is used to create a single results or restart file from multiple results or restart files which typically arise as the result of multiple restarted analyses. CONEX is used to postprocess the results from a series of finite element analyses. It can join sequentially the data from multiple results databases into a single database, which makes it easier to postprocess the results data.

  6. Finite Element Analysis Code

    Energy Science and Technology Software Center (ESTSC)

    2005-06-26

    Exotxt is an analysis code that reads finite element results data stored in an exodusII file and generates a file in a structured text format. The text file can be edited or modified via a number of text formatting tools. Exotxt is used by analysts to translate data from the binary exodusII format into a structured text format which can then be edited or modified and then translated either back to exodusII format or to another format.

  7. Low Density Parity Check Codes: Bandwidth Efficient Channel Coding

    NASA Technical Reports Server (NTRS)

    Fong, Wai; Lin, Shu; Maki, Gary; Yeh, Pen-Shu

    2003-01-01

    Low Density Parity Check (LDPC) codes provide near-Shannon-capacity performance for NASA missions. These codes have high coding rates, R = 0.82 and 0.875, with moderate code lengths, n = 4096 and 8176. Their decoders have inherently parallel structures, which allow for high-speed implementation. Two codes based on Euclidean Geometry (EG) were selected for flight ASIC implementation. These codes are cyclic and quasi-cyclic in nature and therefore have a simple encoder structure, which results in power and size benefits. These codes also have a large minimum distance, as much as d_min = 65, giving them powerful error-correcting capabilities and very low error floors. This paper will present the development of the LDPC flight encoder and decoder, its applications, and status.
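    Decoder details are not given in the abstract; as a generic illustration of LDPC-style decoding, a hard-decision bit-flipping decoder can be sketched as follows (the H shown is a small Hamming parity-check matrix standing in for a real LDPC code):

```python
# Sketch (illustrative, not the flight decoder): hard-decision bit-flipping
# decoding for a parity-check matrix H. Each iteration flips the bits that
# participate in the largest number of unsatisfied checks.
def bit_flip_decode(H, word, max_iters=10):
    word = list(word)
    for _ in range(max_iters):
        syndrome = [sum(h[j] * word[j] for j in range(len(word))) % 2 for h in H]
        if not any(syndrome):
            return word                      # all checks satisfied
        # count unsatisfied checks touching each bit
        counts = [sum(s for h, s in zip(H, syndrome) if h[j])
                  for j in range(len(word))]
        worst = max(counts)
        word = [b ^ (c == worst) for b, c in zip(word, counts)]
    return word

# (7,4) Hamming parity-check matrix as a small stand-in for an LDPC code.
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
received = [0, 0, 0, 0, 1, 0, 0]             # all-zero codeword with one bit error
```

    The per-bit counting steps are independent, which hints at the inherently parallel decoder structure the abstract mentions.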

  8. New quantum codes constructed from quaternary BCH codes

    NASA Astrophysics Data System (ADS)

    Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena

    2016-07-01

    In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distances of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes are determined to be much larger than the results given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes and their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.

  9. Structured error recovery for code-word-stabilized quantum codes

    SciTech Connect

    Li Yunfan; Dumer, Ilya; Grassl, Markus; Pryadko, Leonid P.

    2010-05-15

    Code-word-stabilized (CWS) codes are, in general, nonadditive quantum codes that can correct errors by an exhaustive search of different error patterns, similar to the way that we decode classical nonlinear codes. For an n-qubit quantum code correcting errors on up to t qubits, this brute-force approach consecutively tests different errors of weight t or less and employs a separate n-qubit measurement in each test. In this article, we suggest an error grouping technique that allows one to simultaneously test large groups of errors in a single measurement. This structured error recovery technique exponentially reduces the number of measurements by about 3^t times. While it still leaves exponentially many measurements for a generic CWS code, the technique is equivalent to syndrome-based recovery for the special case of additive CWS codes.

  10. Scalable video transmission over Rayleigh fading channels using LDPC codes

    NASA Astrophysics Data System (ADS)

    Bansal, Manu; Kondi, Lisimachos P.

    2005-03-01

    In this paper, we investigate an important problem of efficiently utilizing the available resources for video transmission over wireless channels while maintaining a good decoded video quality and resilience to channel impairments. Our system consists of the video codec based on 3-D set partitioning in hierarchical trees (3-D SPIHT) algorithm and employs two different schemes using low-density parity check (LDPC) codes for channel error protection. The first method uses the serial concatenation of the constant-rate LDPC code and rate-compatible punctured convolutional (RCPC) codes. Cyclic redundancy check (CRC) is used to detect transmission errors. In the other scheme, we use the product code structure consisting of a constant rate LDPC/CRC code across the rows of the `blocks' of source data and an erasure-correction systematic Reed-Solomon (RS) code as the column code. In both the schemes introduced here, we use fixed-length source packets protected with unequal forward error correction coding ensuring a strictly decreasing protection across the bitstream. A Rayleigh flat-fading channel with additive white Gaussian noise (AWGN) is modeled for the transmission. The rate-distortion optimization algorithm is developed and carried out for the selection of source coding and channel coding rates using Lagrangian optimization. The experimental results demonstrate the effectiveness of this system under different wireless channel conditions and both the proposed methods (LDPC+RCPC/CRC and RS+LDPC/CRC) outperform the more conventional schemes such as those employing RCPC/CRC.
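    The Lagrangian selection step mentioned above can be sketched in miniature (the rate/distortion numbers below are illustrative, not from the paper): among the candidate source/channel rate allocations, pick the one minimizing D + λR.

```python
# Sketch (illustrative numbers): Lagrangian rate-distortion selection,
# choosing the operating point that minimizes D + lambda * R.
def best_operating_point(points, lam):
    """points: list of (rate, distortion) pairs; returns the minimizer of D + lam*R."""
    return min(points, key=lambda rd: rd[1] + lam * rd[0])

# Hypothetical candidate allocations: (total rate, expected distortion).
points = [(1.0, 40.0), (2.0, 20.0), (4.0, 12.0)]
```

    Sweeping λ trades off rate against distortion: a small λ favors the low-distortion, high-rate allocation, a large λ the cheaper one.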

  11. A code generation framework for the ALMA common software

    NASA Astrophysics Data System (ADS)

    Troncoso, Nicolás; von Brand, Horst H.; Ibsen, Jorge; Mora, Matias; Gonzalez, Victor; Chiozzi, Gianluca; Jeram, Bogdan; Sommer, Heiko; Zamora, Gabriel; Tejeda, Alexis

    2010-07-01

    Code generation helps in smoothing the learning curve of a complex application framework and in reducing the number of Lines Of Code (LOC) that a developer needs to craft. The ALMA Common Software (ACS) has adopted code generation in specific areas, but we are now exploiting the more comprehensive approach of Model Driven code generation to transform a UML model directly into a full implementation in the ACS framework. This approach makes it easier for newcomers to grasp the principles of the framework. Moreover, fewer handcrafted LOC reduce the error rate. Additional benefits achieved by model-driven code generation are: software reuse, implicit application of design patterns, and automatic test generation. A model-driven approach to design also makes it possible to use the same model with different frameworks, by generating for different targets. The generation framework presented in this paper uses openArchitectureWare as the model-to-text translator. OpenArchitectureWare provides a powerful functional language that makes it easier to implement the correct mapping of data types, the main difficulty encountered in the translation process. The output is an ACS application readily usable by the developer, including the necessary deployment configuration, thus minimizing any configuration burden during testing. The specific application code is implemented by extending generated classes. Therefore, generated and manually crafted code are kept apart, simplifying the code generation process and aiding the developers by keeping a clean logical separation between the two. Our first results show that code generation improves code productivity dramatically.
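    The base-class/extension split described above can be sketched with a toy model-to-text generator (the model, names, and template are hypothetical, not ACS's openArchitectureWare templates):

```python
# Sketch: a toy model-to-text generator (hypothetical model and naming
# scheme). Generated code goes into a *Base class; developers extend it
# by hand, keeping generated and handcrafted code apart.
MODEL = {
    "name": "TemperatureMonitor",
    "properties": [("current", "float"), ("units", "str")],
}

def generate(model):
    lines = ["class {}Base:".format(model["name"])]
    for prop, typ in model["properties"]:
        lines.append("    def get_{}(self):  # returns {}".format(prop, typ))
        lines.append("        return self._{}".format(prop))
    return "\n".join(lines)

source = generate(MODEL)
```

    Because handwritten code only subclasses the generated `*Base` classes, regenerating from an updated model never clobbers manual work.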

  12. Measuring Diagnoses: ICD Code Accuracy

    PubMed Central

    O'Malley, Kimberly J; Cook, Karon F; Price, Matt D; Wildes, Kimberly Raiford; Hurdle, John F; Ashton, Carol M

    2005-01-01

    Objective To examine potential sources of errors at each step of the described inpatient International Classification of Diseases (ICD) coding process. Data Sources/Study Setting The use of disease codes from the ICD has expanded from classifying morbidity and mortality information for statistical purposes to diverse sets of applications in research, health care policy, and health care finance. By describing a brief history of ICD coding, detailing the process for assigning codes, identifying where errors can be introduced into the process, and reviewing methods for examining code accuracy, we help code users more systematically evaluate code accuracy for their particular applications. Study Design/Methods We summarize the inpatient ICD diagnostic coding process from patient admission to diagnostic code assignment. We examine potential sources of errors at each step and offer code users a tool for systematically evaluating code accuracy. Principal Findings Main error sources along the “patient trajectory” include amount and quality of information at admission, communication among patients and providers, the clinician's knowledge and experience with the illness, and the clinician's attention to detail. Main error sources along the “paper trail” include variance in the electronic and written records, coder training and experience, facility quality-control efforts, and unintentional and intentional coder errors, such as misspecification, unbundling, and upcoding. Conclusions By clearly specifying the code assignment process and heightening their awareness of potential error sources, code users can better evaluate the applicability and limitations of codes for their particular situations. ICD codes can then be used in the most appropriate ways. PMID:16178999

  13. Genetic code for sine

    NASA Astrophysics Data System (ADS)

    Abdullah, Alyasa Gan; Wah, Yap Bee

    2015-02-01

    The computation of the approximate values of the trigonometric sines was discovered by Bhaskara I (c. 600-c. 680), a seventh-century Indian mathematician, and is known as Bhaskara I's sine approximation formula. The formula is given in his treatise titled Mahabhaskariya. In the 14th century, Madhava of Sangamagrama, a Kerala mathematician-astronomer, constructed a table of trigonometric sines of various angles. Madhava's table gives the measure of angles in arcminutes, arcseconds and sixtieths of an arcsecond. The search for more accurate formulas led to the discovery of the power series expansion by Madhava of Sangamagrama (c. 1350-c. 1425), the founder of the Kerala school of astronomy and mathematics. In 1715, the Taylor series was introduced by Brook Taylor, an English mathematician. If the Taylor series is centered at zero, it is called a Maclaurin series, named after the Scottish mathematician Colin Maclaurin. Some of the important Maclaurin series expansions include trigonometric functions. This paper introduces the genetic code of the sine of an angle without using power series expansion. The genetic code using the square root approach reveals the pattern in the signs (plus, minus) and the sequence of numbers in the sine of an angle. The square root approach complements the Pythagoras method, provides a better understanding of calculating an angle, and will be useful for teaching the concepts of angles in trigonometry.
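    Bhaskara I's approximation itself is simple to state and test (radian form; the constants below follow the standard statement of the formula, not anything specific to this paper's square-root approach):

```python
import math

# Bhaskara I's seventh-century sine approximation (radian form), as given
# in the Mahabhaskariya: sin x ~= 16x(pi - x) / (5*pi^2 - 4x(pi - x)) for
# 0 <= x <= pi. It needs no power series.
def bhaskara_sin(x):
    return 16 * x * (math.pi - x) / (5 * math.pi ** 2 - 4 * x * (math.pi - x))
```

    The approximation is exact at 0, pi/6, pi/2, 5*pi/6 and pi, and its absolute error stays below about 0.0017 everywhere on [0, pi].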

  14. Determinate-state convolutional codes

    NASA Technical Reports Server (NTRS)

    Collins, O.; Hizlan, M.

    1991-01-01

    A determinate-state convolutional code is formed from a conventional convolutional code by pruning away some of the possible state transitions in the decoding trellis. The type of staged power transfer used in determinate-state convolutional codes proves to be an extremely efficient way of enhancing the performance of a concatenated coding system. The decoder complexity and the free distances of these new codes are analyzed, and extensive simulation results are provided on their performance at the low signal-to-noise ratios where a real communication system would operate. Concise, practical examples are provided.

  15. Coding for reliable satellite communications

    NASA Technical Reports Server (NTRS)

    Gaarder, N. T.; Lin, S.

    1986-01-01

    This research project was set up to study various kinds of coding techniques for error control in satellite and space communications for NASA Goddard Space Flight Center. During the project period, researchers investigated the following areas: (1) decoding of Reed-Solomon codes in terms of dual basis; (2) concatenated and cascaded error control coding schemes for satellite and space communications; (3) use of hybrid coding schemes (error correction and detection incorporated with retransmission) to improve system reliability and throughput in satellite communications; (4) good codes for simultaneous error correction and error detection, and (5) error control techniques for ring and star networks.

  16. Circular codes, symmetries and transformations.

    PubMed

    Fimmel, Elena; Giannerini, Simone; Gonzalez, Diego Luis; Strüngmann, Lutz

    2015-06-01

    Circular codes, putative remnants of primeval comma-free codes, have gained considerable attention in recent years. In fact, they represent a second kind of genetic code potentially involved in detecting and maintaining the normal reading frame in protein-coding sequences. The discovery of a universal code across species has suggested many theoretical and experimental questions. However, there is a key aspect that relates circular codes to symmetries and transformations that remains to a large extent unexplored. In this article we aim to address this issue by studying the symmetries and transformations that connect different circular codes. The main result is that the class of 216 C3 maximal self-complementary codes can be partitioned into 27 equivalence classes defined by a particular set of transformations. We show that such transformations can be put in a group-theoretic framework with an intuitive geometric interpretation. More general mathematical results about symmetry transformations which are valid for any kind of circular code are also presented. Our results pave the way to the study of the biological consequences of the mathematical structure behind circular codes and contribute to shedding light on the evolutionary steps that led to the observed symmetries of present codes. PMID:25008961
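    One of the properties mentioned above, self-complementarity, is easy to state computationally (a minimal illustration, not the paper's group-theoretic machinery): a set of codons is self-complementary when it is closed under reversed complementation.

```python
# Sketch: checking that a set of codons is self-complementary, i.e. the
# reversed complement of every codon in the set is also in the set.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def reversed_complement(codon):
    return "".join(COMPLEMENT[b] for b in reversed(codon))

def is_self_complementary(code):
    return all(reversed_complement(c) in code for c in code)

# A tiny closed example: AAT pairs with its reversed complement ATT.
example = {"AAT", "ATT"}
```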

  17. dc-free coset codes

    NASA Technical Reports Server (NTRS)

    Deng, Robert H.; Herro, Mark A.

    1988-01-01

    A class of block coset codes with disparity and run-length constraints are studied. They are particularly well suited for high-speed optical fiber links and similar channels, where dc-free pulse formats, channel error control, and low-complexity encoder-decoder implementations are required. The codes are derived by partitioning linear block codes. The encoder and decoder structures are the same as those of linear block codes with only slight modifications. A special class of dc-free coset block codes are derived from BCH codes with specified bounds on minimum distance, disparity, and run length. The codes have low disparity levels (a small running digital sum) and good error-correcting capabilities.

  18. Permutation-invariant quantum codes

    NASA Astrophysics Data System (ADS)

    Ouyang, Yingkai

    2014-12-01

    A quantum code is a subspace of the Hilbert space of a physical system, chosen to be correctable against a given class of errors, in which information can be encoded. Ideally, the quantum code lies within the ground space of the physical system. When the physical model is the Heisenberg ferromagnet in the absence of an external magnetic field, the corresponding ground space contains all permutation-invariant states. We use techniques from combinatorics and operator theory to construct families of permutation-invariant quantum codes. These codes have length proportional to t^2; one family of codes perfectly corrects arbitrary weight-t errors, while the other family approximately corrects t spontaneous decay errors. The analysis of our codes' performance with respect to spontaneous decay errors utilizes elementary matrix analysis, where we revisit and extend the quantum error correction criterion of Knill and Laflamme, and of Leung, Chuang, Nielsen and Yamamoto.

  19. Using National Drug Codes and Drug Knowledge Bases to Organize Prescription Records from Multiple Sources

    PubMed Central

    Simonaitis, Linas; McDonald, Clement J

    2009-01-01

    Purpose Pharmacy systems contain electronic prescription information needed for clinical care, decision support, performance measurements and research. The master files of most pharmacy systems include National Drug Codes (NDCs) as well as the local codes they use within their systems to identify the products they dispense. We sought to assess how well one could map the products dispensed by many pharmacies to clinically oriented codes via the mapping tables provided by Drug Knowledge Base (DKB) producers. Methods We obtained a large sample of prescription records from seven different sources. These records either carried a national product code or a local code that could be translated into a national product code via their formulary master. We obtained mapping tables from five DKBs. We measured the degree to which the DKB mapping tables covered the national product codes carried in, or associated with, our sample of prescription records. Results Considering the total prescription volume, DKBs covered 93.0% to 99.8% of the product codes (15 comparisons) from three outpatient, and 77.4% to 97.0% (20 comparisons) from four inpatient, sources. Among the inpatient sources, invented codes explained much – from 36% to 94% (3 of 4 sources) – of the non-coverage. Outpatient pharmacy sources invented codes rarely – in 0.11% to 0.21% of their total prescription volume – and inpatient sources more commonly – in 1.7% to 7.4% of their prescription volume. The distribution of prescribed products is highly skewed: from 1.4% to 4.4% of codes account for 50% of the message volume; from 10.7% to 34.5% of codes account for 90% of the volume. Conclusion DKBs cover the product codes used by outpatient sources sufficiently well to permit automatic mapping. Changes in policies and standards could increase coverage of product codes used by inpatient sources. PMID:19767382
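    The volume-weighted coverage measurement described above can be sketched as follows (the NDCs and counts are hypothetical, and the real study additionally distinguished valid codes from locally invented ones):

```python
# Sketch (hypothetical data; the study's sources and DKB tables are not
# reproduced here): the fraction of total prescription volume whose
# product code appears in a drug knowledge base mapping table.
def coverage_by_volume(prescriptions, dkb_codes):
    """prescriptions: list of (ndc, count); dkb_codes: set of mapped NDCs."""
    total = sum(count for _, count in prescriptions)
    covered = sum(count for ndc, count in prescriptions if ndc in dkb_codes)
    return covered / total

# Hypothetical prescription volumes; "LOCAL-123" stands for an invented code.
rx = [("0002-3227-30", 500), ("0009-0039-01", 300), ("LOCAL-123", 200)]
dkb = {"0002-3227-30", "0009-0039-01"}
```

    Weighting by prescription count rather than by distinct code matters because, as the abstract notes, a few percent of the codes account for half the volume.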

  20. Making your code citable with the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; DuPrie, Kimberly; Schmidt, Judy; Berriman, G. Bruce; Hanisch, Robert J.; Mink, Jessica D.; Nemiroff, Robert J.; Shamir, Lior; Shortridge, Keith; Taylor, Mark B.; Teuben, Peter J.; Wallin, John F.

    2016-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net) is a free online registry of codes used in astronomy research. With nearly 1,200 codes, it is the largest indexed resource for astronomy codes in existence. Established in 1999, it offers software authors a path to citation of their research codes even without publication of a paper describing the software, and offers scientists a way to find codes used in refereed publications, thus improving the transparency of the research. It also provides a method to quantify the impact of source codes in a fashion similar to the science metrics of journal articles. Citations using ASCL IDs are accepted by major astronomy journals and if formatted properly are tracked by ADS and other indexing services. The number of citations to ASCL entries increased sharply from 110 citations in January 2014 to 456 citations in September 2015. The percentage of code entries in ASCL that were cited at least once rose from 7.5% in January 2014 to 17.4% in September 2015. The ASCL's mid-2014 infrastructure upgrade added an easy entry submission form, more flexible browsing, search capabilities, and an RSS feed for updates. A Changes/Additions form added this past fall lets authors submit links for papers that use their codes for addition to the ASCL entry even if those papers don't formally cite the codes, thus increasing the transparency of that research and capturing the value of their software to the community.

  1. Practices in Code Discoverability: Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; Teuben, P.; Nemiroff, R. J.; Shamir, L.

    2012-09-01

    Here we describe the Astrophysics Source Code Library (ASCL), which takes an active approach to sharing astrophysics source code. ASCL's editor seeks out both new and old peer-reviewed papers that describe methods or experiments that involve the development or use of source code, and adds entries for the found codes to the library. This approach ensures that source codes are added without requiring authors to actively submit them, resulting in a comprehensive listing that covers a significant number of the astrophysics source codes used in peer-reviewed studies. The ASCL now holds over 340 codes and continues to grow; in 2011 it added an average of 19 codes per month. An advisory committee has been established to provide input and guide the development and expansion of the new site, and a marketing plan has been developed and is being executed. All ASCL source codes have been used to generate results published in or submitted to a refereed journal and are freely available either via a download site or from an identified source. This paper provides the history and description of the ASCL. It lists the requirements for including codes, examines the advantages of the ASCL, and outlines some of its future plans.

  2. Surface acoustic wave coding for orthogonal frequency coded devices

    NASA Technical Reports Server (NTRS)

    Malocha, Donald (Inventor); Kozlovski, Nikolai (Inventor)

    2011-01-01

    Methods and systems for coding SAW OFC devices to mitigate code collisions in a wireless multi-tag system. Each device produces plural stepped frequencies as an OFC signal with a chip offset delay to increase code diversity. A method for assigning a different OFC to each device includes using a matrix based on the number of OFCs needed and the number of chips per code, populating each matrix cell with an OFC chip, and assigning the codes from the matrix to the devices. The asynchronous passive multi-tag system includes plural surface acoustic wave devices each producing a different OFC signal having the same number of chips and including a chip offset time delay, an algorithm for assigning OFCs to each device, and a transceiver to transmit an interrogation signal and receive OFC signals in response with minimal code collisions during transmission.
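    As a rough sketch of the matrix-based assignment, one can populate an N x chips matrix with distinct chip-frequency orderings and hand one row (one OFC) to each tag. Using distinct permutations of the chip frequencies, as below, is an illustrative simplification to maximize code diversity, not the patented algorithm:

```python
from itertools import islice, permutations

def assign_ofcs(n_devices, n_chips):
    """Build an n_devices x n_chips matrix whose cells are chip
    frequency indices, one row (one OFC) per device. Distinct
    permutations stand in for the patent's population rule."""
    matrix = list(islice(permutations(range(n_chips)), n_devices))
    if len(matrix) < n_devices:
        raise ValueError("not enough distinct codes for the tag population")
    return matrix

codes = assign_ofcs(4, 3)
# every device gets a different ordering of the same chip frequencies
assert len(set(codes)) == 4
```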

  3. Peripheral coding of taste

    PubMed Central

    Liman, Emily R.; Zhang, Yali V.; Montell, Craig

    2014-01-01

    Five canonical tastes, bitter, sweet, umami (amino acid), salty and sour (acid) are detected by animals as diverse as fruit flies and humans, consistent with a near universal drive to consume fundamental nutrients and to avoid toxins or other harmful compounds. Surprisingly, despite this strong conservation of basic taste qualities between vertebrates and invertebrates, the receptors and signaling mechanisms that mediate taste in each are highly divergent. The identification over the last two decades of receptors and other molecules that mediate taste has led to stunning advances in our understanding of the basic mechanisms of transduction and coding of information by the gustatory systems of vertebrates and invertebrates. In this review, we discuss recent advances in taste research, mainly from the fly and mammalian systems, and we highlight principles that are common across species, despite stark differences in receptor types. PMID:24607224

  4. IMP: A performance code

    NASA Astrophysics Data System (ADS)

    Dauro, Vincent A., Sr.

    IMP (Integrated Mission Program) is a simulation language and code used to model present and future Earth, Moon, or Mars missions. The profile is user controlled through selection from a large menu of events and maneuvers. A Fehlberg 7/13 Runge-Kutta integrator with error and step size control is used to numerically integrate the differential equations of motion (DEQ) of three spacecraft, a main, a target, and an observer. Through selection, the DEQ's include guided thrust, oblate gravity, atmosphere drag, solar pressure, and Moon gravity effects. Guide parameters for thrust events and performance parameters of velocity changes (Delta-V) and propellant usage (maximum of five systems) are developed as needed. Print, plot, summary, and debug files are output.
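    The integrator's error and step-size control can be illustrated with a simpler stand-in: step doubling with classical RK4 in place of IMP's Fehlberg 7/13 embedded pair. The function names, tolerance, and test equation below are illustrative assumptions:

```python
import math

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for dy/dt = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def integrate(f, t0, y0, t_end, h=0.1, tol=1e-8):
    """Adaptive integration by step doubling: compare one full step with
    two half steps, accept when the estimated local error is below tol,
    and shrink or grow h accordingly. A simplified stand-in for the
    Fehlberg embedded pair with error and step-size control."""
    t, y = t0, y0
    while t_end - t > 1e-12:
        h = min(h, t_end - t)
        full = rk4_step(f, t, y, h)
        half = rk4_step(f, t + h / 2, rk4_step(f, t, y, h / 2), h / 2)
        err = abs(half - full)
        if err <= tol:
            t, y = t + h, half          # accept the more accurate result
        h *= 0.9 * (tol / err) ** 0.2 if err > 0 else 2.0
    return y

y_end = integrate(lambda t, y: y, 0.0, 1.0, 1.0)  # dy/dt = y, y(1) = e
assert abs(y_end - math.e) < 1e-5
```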

  5. Electromagnetic particle simulation codes

    NASA Technical Reports Server (NTRS)

    Pritchett, P. L.

    1985-01-01

    Electromagnetic particle simulations solve the full set of Maxwell's equations. They thus include the effects of self-consistent electric and magnetic fields, magnetic induction, and electromagnetic radiation. The algorithms for an electromagnetic code which works directly with the electric and magnetic fields are described. The fields and current are separated into transverse and longitudinal components. The transverse E and B fields are integrated in time using a leapfrog scheme applied to the Fourier components. The particle pushing is performed via the relativistic Lorentz force equation for the particle momentum. As an example, simulation results are presented for the electron cyclotron maser instability which illustrate the importance of relativistic effects on the wave-particle resonance condition and on wave dispersion.
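    The particle push described above advances the relativistic momentum with the Lorentz force, recovering the velocity as p/(gamma m). A bare-bones explicit update is sketched below; it is illustrative only (production codes use the time-centered Boris scheme within the leapfrog cycle), and normalized units q = m = c = 1 are an assumption:

```python
import math

def lorentz_push(p, E, B, q=1.0, m=1.0, c=1.0, dt=1e-3):
    """One explicit update of the relativistic momentum
    dp/dt = q (E + v x B), with v = p / (gamma m)."""
    gamma = math.sqrt(1.0 + sum(pi * pi for pi in p) / (m * c) ** 2)
    v = [pi / (gamma * m) for pi in p]
    vxB = [v[1] * B[2] - v[2] * B[1],
           v[2] * B[0] - v[0] * B[2],
           v[0] * B[1] - v[1] * B[0]]
    return [p[i] + dt * q * (E[i] + vxB[i]) for i in range(3)]

# A particle gyrating in a pure magnetic field: |p| should be
# (approximately) conserved by the push.
p = [1.0, 0.0, 0.0]
for _ in range(1000):
    p = lorentz_push(p, E=[0.0, 0.0, 0.0], B=[0.0, 0.0, 1.0])
assert abs(math.sqrt(sum(x * x for x in p)) - 1.0) < 1e-2
```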

  6. Telescope Adaptive Optics Code

    SciTech Connect

    Phillion, D.

    2005-07-28

    The Telescope AO Code has general adaptive optics capabilities plus specialized models for three telescopes with either adaptive optics or active optics systems. It has the capability to generate either single-layer or distributed Kolmogorov turbulence phase screens using the FFT. Missing low order spatial frequencies are added using the Karhunen-Loeve expansion. The phase structure curve is extremely close to the theoretical one. Secondly, it has the capability to simulate an adaptive optics control system. The default parameters are those of the Keck II adaptive optics system. Thirdly, it has a general wave optics capability to model the science camera halo due to scintillation from atmospheric turbulence and the telescope optics. Although this capability was implemented for the Gemini telescopes, the only default parameter specific to the Gemini telescopes is the primary mirror diameter. Finally, it has a model for the LSST active optics alignment strategy. This last model is highly specific to the LSST.

  7. Transionospheric Propagation Code (TIPC)

    SciTech Connect

    Roussel-Dupre, R.; Kelley, T.A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Lab to perform certain tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in Fortran 77, runs interactively and was designed to be as machine independent as possible. A menu format in which the user is prompted to supply appropriate parameters for a given task has been adopted for the input while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, DTOA study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise or spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of delta-times-of-arrival (DTOAs) vs TECs for a specified pair of receivers.
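    The first task reduces to a discrete convolution of the input pulse with the ionospheric impulse response. A minimal sketch (not TIPC's Fortran, and with a toy three-tap response) is:

```python
def convolve(signal, impulse_response):
    """Discrete convolution: the core of task one, where a specified
    input pulse is convolved against the ionosphere's impulse response
    to obtain the transionospheric signal."""
    n, m = len(signal), len(impulse_response)
    out = [0.0] * (n + m - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out

# A unit impulse simply reproduces the (toy) ionospheric response.
assert convolve([1.0], [0.5, 0.3, 0.2]) == [0.5, 0.3, 0.2]
```

    The same routine also covers task two, where the filter response takes the place of the impulse response.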

  8. Transionospheric Propagation Code (TIPC)

    NASA Astrophysics Data System (ADS)

    Roussel-Dupre, Robert; Kelley, Thomas A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Lab to perform certain tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in FORTRAN 77, runs interactively and was designed to be as machine independent as possible. A menu format in which the user is prompted to supply appropriate parameters for a given task has been adopted for the input while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, delta times of arrival (DTOA) study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise or spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of DTOAs vs TECs for a specified pair of receivers.

  9. Diagonal Eigenvalue Unity (DEU) code for spectral amplitude coding-optical code division multiple access

    NASA Astrophysics Data System (ADS)

    Ahmed, Hassan Yousif; Nisar, K. S.

    2013-08-01

    Codes with ideal in-phase cross correlation (CC) and practical code length to support a high number of users are required in spectral amplitude coding-optical code division multiple access (SAC-OCDMA) systems. SAC systems are becoming more attractive in the field of OCDMA because of their ability to eliminate the influence of multiple access interference (MAI) and also suppress the effect of phase induced intensity noise (PIIN). In this paper, we have proposed new Diagonal Eigenvalue Unity (DEU) code families with ideal in-phase CC based on the Jordan block matrix, constructed in simple algebraic ways. Four sets of DEU code families based on the code weight W and number of users N for the combinations (even, even), (even, odd), (odd, odd) and (odd, even) are constructed. This gives the DEU code more flexibility in the selection of code weight and number of users. These features make this code a compelling candidate for future optical communication systems. Numerical results show that the proposed DEU system outperforms reported codes. In addition, simulation results taken from a commercial optical systems simulator, Virtual Photonic Instrument (VPI™), show that, using point-to-multipoint transmission in a passive optical network (PON), DEU has better performance and could support long spans with high data rates.

  10. Some easily analyzable convolutional codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R.; Dolinar, S.; Pollara, F.; Vantilborg, H.

    1989-01-01

    Convolutional codes have played and will play a key role in the downlink telemetry systems on many NASA deep-space probes, including Voyager, Magellan, and Galileo. One of the chief difficulties associated with the use of convolutional codes, however, is the notorious difficulty of analyzing them. Given a convolutional code as specified, say, by its generator polynomials, it is no easy matter to say how well that code will perform on a given noisy channel. The usual first step in such an analysis is to compute the code's free distance; this can be done with an algorithm whose complexity is exponential in the code's constraint length. The second step is often to calculate the transfer function in one, two, or three variables, or at least a few terms in its power series expansion. This step is quite hard, and for many codes of relatively short constraint lengths, it can be intractable. However, a large class of convolutional codes was discovered for which the free distance can be computed by inspection, and for which there is a closed-form expression for the three-variable transfer function. Although for large constraint lengths, these codes have relatively low rates, they are nevertheless interesting and potentially useful. Furthermore, the ideas developed here to analyze these specialized codes may well extend to a much larger class.
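    The free-distance computation mentioned above can be written as a shortest-path search over the code trellis: the free distance is the minimum output weight of any path that leaves the all-zero state and returns to it. The sketch below applies this to the textbook (7,5) octal rate-1/2 code; the code choice and function name are illustrative, not taken from the article:

```python
import heapq

def free_distance(g1=0b111, g2=0b101, memory=2):
    """Free distance of a rate-1/2 binary convolutional code by Dijkstra
    over the trellis. The generator taps are written MSB = current input
    bit; complexity grows exponentially with the constraint length, as
    noted in the abstract."""
    def step(state, bit):
        reg = (bit << memory) | state            # current input + memory bits
        weight = (bin(reg & g1).count("1") & 1) + (bin(reg & g2).count("1") & 1)
        return reg >> 1, weight
    start, w0 = step(0, 1)                       # must diverge with an input 1
    best, heap = {start: w0}, [(w0, start)]
    while heap:
        w, s = heapq.heappop(heap)
        if s == 0:                               # remerged with the zero path
            return w
        if w > best.get(s, w):                   # stale queue entry
            continue
        for bit in (0, 1):
            t, dw = step(s, bit)
            if w + dw < best.get(t, float("inf")):
                best[t] = w + dw
                heapq.heappush(heap, (w + dw, t))

assert free_distance() == 5   # the (7,5) code's well-known free distance
```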

  11. Source Term Code Package: a user's guide (Mod 1)

    SciTech Connect

    Gieseke, J.A.; Cybulskis, P.; Jordan, H.; Lee, K.W.; Schumacher, P.M.; Curtis, L.A.; Wooton, R.O.; Quayle, S.F.; Kogan, V.

    1986-07-01

    As part of a major reassessment of the release of radioactive materials to the environment (source terms) in severe reactor accidents, a group of state-of-the-art computer codes was utilized to perform extensive analyses. A major product of this source term reassessment effort was a demonstrated methodology for analyzing specific accident situations to provide source term predictions. The computer codes forming this methodology have been upgraded and modified for release and further use. This system of codes has been named the Source Term Code Package (STCP) and is the subject of this user's guide. The guide is intended to provide an understanding of the STCP structure and to facilitate STCP use. The STCP was prepared for operation on a CDC system but is written in FORTRAN-77 to permit transportability. In the current version (Mod 1) of the STCP, the various calculational elements fall into four major categories represented by the codes MARCH3, TRAP-MELT3, VANESA, and NAUA/SPARC/ICEDF. The MARCH3 code is a combination of the MARCH2, CORSOR-M, and CORCON-Mod 2 codes. The TRAP-MELT3 code is a combination of the TRAP-MELT2.0 and MERGE codes.

  12. Investigating the Simulink Auto-Coding Process

    NASA Technical Reports Server (NTRS)

    Gualdoni, Matthew J.

    2016-01-01

    Model-based program design is the most clear and direct way to develop algorithms and programs for interfacing with hardware. While coding "by hand" results in a more tailored product, the ever-growing size and complexity of modern-day applications can cause the project work load to quickly become unreasonable for one programmer. This has generally been addressed by splitting the product into separate modules to allow multiple developers to work in parallel on the same project, however this introduces new potentials for errors in the process. The fluidity, reliability and robustness of the code relies on the abilities of the programmers to communicate their methods to one another; furthermore, multiple programmers invites multiple potentially differing coding styles into the same product, which can cause a loss of readability or even module incompatibility. Fortunately, MathWorks has implemented an auto-coding feature that allows programmers to design their algorithms through the use of models and diagrams in the graphical programming environment Simulink, allowing the designer to visually determine what the hardware is to do. From here, the auto-coding feature handles converting the project into another programming language. This type of approach allows the designer to clearly see how the software will be directing the hardware without the need to try and interpret large amounts of code. In addition, it speeds up the programming process, minimizing the number of man-hours spent on a single project, thus reducing the chance of human error as well as project turnover time. One such project that has benefited from the auto-coding procedure is Ramses, a portion of the GNC flight software on-board Orion that has been implemented primarily in Simulink. Currently, however, auto-coding Ramses into C++ requires 5 hours of code generation time. This causes issues if the tool ever needs to be debugged, as this code generation will need to occur with each edit to any part of

  13. The World in a Tomato: Revisiting the Use of "Codes" in Freire's Problem-Posing Education.

    ERIC Educational Resources Information Center

    Barndt, Deborah

    1998-01-01

    Gives examples of the use of Freire's notion of codes or generative themes in problem-posing literacy education. Describes how these applications expand Freire's conceptions by involving students in code production, including multicultural perspectives, and rethinking codes as representations. (SK)

  14. Nonlinear, nonbinary cyclic group codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    New cyclic group codes of length 2(exp m) - 1 over (m - j)-bit symbols are introduced. These codes can be systematically encoded and decoded algebraically. The code rates are very close to Reed-Solomon (RS) codes and are much better than Bose-Chaudhuri-Hocquenghem (BCH) codes (a former alternative). The binary (m - j)-tuples are identified with a subgroup of the binary m-tuples which represents the field GF(2 exp m). Encoding is systematic and involves a two-stage procedure consisting of the usual linear feedback register (using the division or check polynomial) and a small table lookup. For low rates, a second shift-register encoding operation may be invoked. Decoding uses the RS error-correcting procedures for the m-tuple codes for m = 4, 5, and 6.

  15. QR code for medical information uses.

    PubMed

    Fontelo, Paul; Liu, Fang; Ducut, Erick G

    2008-01-01

    We developed QR code online tools, simulated and tested QR code applications for medical information uses including scanning QR code labels, URLs and authentication. Our results show possible applications for QR code in medicine. PMID:18998785
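    One of the listed uses, authentication, could for example be implemented by signing the URL carried in the QR payload with a shared-secret HMAC, so a scanner holding the same secret can detect tampering. The scheme, names, and secret below are purely illustrative assumptions; the paper does not specify its implementation:

```python
import hashlib
import hmac

SECRET = b"clinic-demo-key"   # hypothetical shared secret

def sign_payload(url: str) -> str:
    """Append a short HMAC tag to a URL before encoding it in a QR label."""
    tag = hmac.new(SECRET, url.encode(), hashlib.sha256).hexdigest()[:16]
    return f"{url}#auth={tag}"

def verify_payload(payload: str) -> bool:
    """Recompute the tag from the URL part and compare in constant time."""
    url, _, tag = payload.rpartition("#auth=")
    expected = hmac.new(SECRET, url.encode(), hashlib.sha256).hexdigest()[:16]
    return hmac.compare_digest(tag, expected)

signed = sign_payload("https://example.org/record/123")
assert verify_payload(signed)
assert not verify_payload(signed.replace("record/123", "record/124"))
```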

  16. Transient Ejector Analysis (TEA) code user's guide

    NASA Technical Reports Server (NTRS)

    Drummond, Colin K.

    1993-01-01

    A FORTRAN computer program for the semi analytic prediction of unsteady thrust augmenting ejector performance has been developed, based on a theoretical analysis for ejectors. That analysis blends classic self-similar turbulent jet descriptions with control-volume mixing region elements. Division of the ejector into an inlet, diffuser, and mixing region allowed flexibility in the modeling of the physics for each region. In particular, the inlet and diffuser analyses are simplified by a quasi-steady-analysis, justified by the assumption that pressure is the forcing function in those regions. Only the mixing region is assumed to be dominated by viscous effects. The present work provides an overview of the code structure, a description of the required input and output data file formats, and the results for a test case. Since there are limitations to the code for applications outside the bounds of the test case, the user should consider TEA as a research code (not as a production code), designed specifically as an implementation of the proposed ejector theory. Program error flags are discussed, and some diagnostic routines are presented.

  17. Theory of quantum error-correcting codes

    SciTech Connect

    Knill, E.; Laflamme, R.

    1997-02-01

    Quantum error correction will be necessary for preserving coherent states against noise and other unwanted interactions in quantum computation and communication. We develop a general theory of quantum error correction based on encoding states into larger Hilbert spaces subject to known interactions. We obtain necessary and sufficient conditions for the perfect recovery of an encoded state after its degradation by an interaction. The conditions depend only on the behavior of the logical states. We use them to give a recovery-operator-independent definition of error-correcting codes. We relate this definition to four others: the existence of a left inverse of the interaction, an explicit representation of the error syndrome using tensor products, perfect recovery of the completely entangled state, and an information theoretic identity. Two notions of fidelity and error for imperfect recovery are introduced, one for pure and the other for entangled states. The latter is more appropriate when using codes in a quantum memory or in applications of quantum teleportation to communication. We show that the error for entangled states is bounded linearly by the error for pure states. A formal definition of independent interactions for qubits is given. This leads to lower bounds on the number of qubits required to correct e errors and a formal proof that the classical bounds on the probability of error of e-error-correcting codes applies to e-error-correcting quantum codes, provided that the interaction is dominated by an identity component. {copyright} {ital 1997} {ital The American Physical Society}
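    The recovery conditions can be checked concretely on the smallest standard example, the 3-qubit bit-flip code, where the conditions P E_i† E_j P = c_ij P reduce to index arithmetic because the X errors merely permute computational basis states. This toy verification is an illustration of the conditions, not code from the paper:

```python
from itertools import product

def kl_condition_bitflip():
    """Check P E_i† E_j P = c_ij P for the 3-qubit bit-flip code
    (codewords |000>, |111>) against the errors {I, X1, X2, X3}.
    Since each X error flips one bit of the basis-state index,
    <a|E_i E_j|b> is 1 exactly when a == b ^ mask_i ^ mask_j."""
    codewords = [0b000, 0b111]
    masks = [0b000, 0b001, 0b010, 0b100]          # I, X on qubits 0, 1, 2
    for i, j in product(range(4), repeat=2):
        # M[a][b] = <a| E_i E_j |b> restricted to the code space;
        # the condition requires M == c_ij * Identity.
        M = [[1 if a == (b ^ masks[i] ^ masks[j]) else 0
              for b in codewords] for a in codewords]
        c = M[0][0]
        if M != [[c, 0], [0, c]]:
            return False
        if i != j and c != 0:                     # distinct errors distinguishable
            return False
    return True

assert kl_condition_bitflip()
```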

  18. Explosive Formulation Code Naming SOP

    SciTech Connect

    Martz, H. E.

    2014-09-19

    The purpose of this SOP is to provide a procedure for giving individual HME formulations code names. A code name for an individual HME formulation consists of an explosive family code, given by the classified guide, followed by a dash, -, and a number. If the formulation requires preparation such as packing or aging, these add additional groups of symbols to the X-ray specimen name.

  19. Bar-Code-Scribing Tool

    NASA Technical Reports Server (NTRS)

    Badinger, Michael A.; Drouant, George J.

    1991-01-01

    Proposed hand-held tool applies indelible bar code to small parts. Possible to identify parts for management of inventory without tags or labels. Microprocessor supplies bar-code data to impact-printer-like device. Device drives replaceable scribe, which cuts bar code on surface of part. Used to mark serially controlled parts for military and aerospace equipment. Also adapts for discrete marking of bulk items used in food and pharmaceutical processing.

  20. Turbo Codes for PCS Applications

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Pollara, F.

    1995-01-01

    A number of the claims for turbo codes as a great advance in coding theory are confirmed, and a complete description is presented of an encoder/decoder pair that could be suitable for PCS applications. A new simple method for trellis termination is described, the effect of interleaver choice on the weight distribution of the code is analyzed, and unequal rate components (which yield better performance) are introduced.

  1. Aspen Code Development Collaboration

    SciTech Connect

    none,; Cherry, Robert S.; Richard, Boardman D.

    2013-10-03

    Wyoming has a wealth of primary energy resources in the forms of coal, natural gas, wind, uranium, and oil shale. Most of Wyoming's coal and gas resources are exported from the state in unprocessed form rather than as refined higher value products. Wyoming's leadership recognizes the opportunity to broaden the state's economic base by using these energy resources to make value-added products such as synthetic vehicle fuels and commodity chemicals. Producing these higher value products in an environmentally responsible manner can benefit from the use of clean energy technologies, including Wyoming's abundant wind energy and nuclear energy such as new-generation small modular reactors, including high temperature gas-cooled reactors.

  2. The FLUKA Code: an Overview

    SciTech Connect

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M.V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P.R.; /Milan U. /INFN, Milan /Pavia U. /INFN, Pavia /CERN /Siegen U. /Houston U. /SLAC /Frascati /NASA, Houston /ENEA, Frascati

    2005-11-09

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis on the hadronic and nuclear sector.

  3. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge preserving image coding scheme which can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.

  4. The FLUKA Code: An Overview

    NASA Technical Reports Server (NTRS)

    Ballarini, F.; Battistoni, G.; Campanella, M.; Carboni, M.; Cerutti, F.; Empl, A.; Fasso, A.; Ferrari, A.; Gadioli, E.; Garzelli, M. V.; Lantz, M.; Liotta, M.; Mairani, A.; Mostacci, A.; Muraro, S.; Ottolenghi, A.; Pelliccioni, M.; Pinsky, L.; Ranft, J.; Roesler, S.; Sala, P. R.; Scannicchio, D.; Trovati, S.; Villari, R.; Wilson, T.

    2006-01-01

    FLUKA is a multipurpose Monte Carlo code which can transport a variety of particles over a wide energy range in complex geometries. The code is a joint project of INFN and CERN: part of its development is also supported by the University of Houston and NASA. FLUKA is successfully applied in several fields, including, but not limited to, particle physics, cosmic ray physics, dosimetry, radioprotection, hadron therapy, space radiation, accelerator design and neutronics. The code is the standard tool used at CERN for dosimetry, radioprotection and beam-machine interaction studies. Here we give a glimpse into the code physics models with a particular emphasis on the hadronic and nuclear sector.

  5. Multilevel codes and multistage decoding

    NASA Astrophysics Data System (ADS)

    Calderbank, A. R.

    1989-03-01

    Imai and Hirakawa (1977) proposed a multilevel coding method based on binary block codes that admits a staged decoding procedure. Here the coding method is extended to coset codes and it is shown how to calculate minimum squared distance and path multiplicity in terms of the norms and multiplicities of the different cosets. The multilevel structure allows the redundancy in the coset selection procedure to be allocated efficiently among the different levels. It also allows the use of suboptimal multistage decoding procedures that have performance/complexity advantages over maximum-likelihood decoding.

  6. Golay and other box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    The (24,12;8) extended Golay Code can be generated as a 6x4 binary matrix from the (15,11;3) BCH-Hamming Code, represented as a 5 x 3 matrix, by adding a row and a column, both of odd or even parity. The odd-parity case provides the additional 12th dimension. Furthermore, any three columns and five rows of the 6 x 4 Golay form a BCH-Hamming (15,11;3) Code. Similarly a (80,58;8) code can be generated as a 10 x 8 binary matrix from the (63,57;3) BCH-Hamming Code represented as a 9 x 7 matrix by adding a row and a column, both of odd or even parity. Furthermore, any seven columns along with the top nine rows is a BCH-Hamming (63,57;3) Code. A (80,40;16) 10 x 8 matrix binary code with weight structure identical to the extended (80,40;16) Quadratic Residue Code is generated from a (63,39;7) binary cyclic code represented as a 9 x 7 matrix, by adding a row and a column, both of odd or even parity.

  7. Golay and other box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    The (24,12;8) extended Golay Code can be generated as a 6 x 4 binary matrix from the (15,11;3) BCH-Hamming Code, represented as a 5 x 3 matrix, by adding a row and a column, both of odd or even parity. The odd-parity case provides the additional 12th dimension. Furthermore, any three columns and five rows of the 6 x 4 Golay form a BCH-Hamming (15,11;3) Code. Similarly a (80,58;8) code can be generated as a 10 x 8 binary matrix from the (63,57;3) BCH-Hamming Code represented as a 9 x 7 matrix by adding a row and a column, both of odd or even parity. Furthermore, any seven columns along with the top nine rows is a BCH-Hamming (63,57;3) Code. A (80,40;16) 10 x 8 matrix binary code with weight structure identical to the extended (80,40;16) Quadratic Residue Code is generated from a (63,39;7) binary cyclic code represented as a 9 x 7 matrix, by adding a row and a column, both of odd or even parity.

  8. High Order Modulation Protograph Codes

    NASA Technical Reports Server (NTRS)

    Nguyen, Thuy V. (Inventor); Nosratinia, Aria (Inventor); Divsalar, Dariush (Inventor)

    2014-01-01

Digital communication coding methods for designing protograph-based bit-interleaved coded modulation that is general and applies to any modulation. The general coding framework can support not only multiple rates but also adaptive modulation. The method is a two-stage lifting approach. In the first stage, an original protograph is lifted to a slightly larger intermediate protograph. The intermediate protograph is then lifted via a circulant matrix to the expected codeword length to form a protograph-based low-density parity-check code.
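The circulant lifting step of the second stage can be illustrated as follows; the base matrix, its shift values, and the lifting factor Z are made-up toy values for the sketch, not parameters from the patent:

```python
def lift(base, Z):
    """Lift a protograph base matrix: entry s >= 0 becomes the ZxZ identity
    circularly shifted by s columns; entry -1 becomes the ZxZ zero block."""
    m, n = len(base), len(base[0])
    H = [[0] * (n * Z) for _ in range(m * Z)]
    for i in range(m):
        for j in range(n):
            s = base[i][j]
            if s < 0:
                continue               # zero block: no edge in the protograph
            for r in range(Z):
                H[i * Z + r][j * Z + (r + s) % Z] = 1
    return H

proto = [[0, 1, -1],                   # toy 2x3 protograph with shift values
         [2, 0, 1]]
H = lift(proto, Z=4)                   # lifted 8x12 parity-check matrix
assert len(H) == 8 and len(H[0]) == 12
assert sum(map(sum, H)) == 5 * 4       # five nonzero blocks, each with Z ones
```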

  9. Parallelization of the SIR code

    NASA Astrophysics Data System (ADS)

    Thonhofer, S.; Bellot Rubio, L. R.; Utz, D.; Jurčak, J.; Hanslmeier, A.; Piantschitsch, I.; Pauritsch, J.; Lemmerer, B.; Guttenbrunner, S.

A high-resolution 3-dimensional model of the photospheric magnetic field is essential for the investigation of small-scale solar magnetic phenomena. The SIR code is an advanced Stokes-inversion code that deduces physical quantities, e.g. the magnetic field vector, temperature, and LOS velocity, from spectropolarimetric data. We extended this code with the capability to use large data sets directly and to invert their pixels in parallel. Thanks to this parallelization it is now feasible to apply the code directly to extensive data sets. In addition, we included the possibility of using different initial model atmospheres for the inversion, which enhances the quality of the results.
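Because each pixel's inversion is independent, the parallelization amounts to mapping an inversion routine over the pixels. The sketch below uses a thread pool and a placeholder `invert_pixel`; SIR's actual per-pixel work is a full model-atmosphere fit, and the real code distributes pixels across processes or nodes:

```python
from concurrent.futures import ThreadPoolExecutor

def invert_pixel(stokes_profile):
    """Hypothetical stand-in for SIR's per-pixel Stokes inversion."""
    return {"B": sum(stokes_profile) / len(stokes_profile)}  # placeholder

def invert_map(pixels, workers=4):
    """Invert independent pixels in parallel and keep their order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(invert_pixel, pixels))

results = invert_map([[1.0, 2.0], [3.0, 5.0]])
assert len(results) == 2 and results[1]["B"] == 4.0
```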

  10. Coding and traceability in Iran.

    PubMed

    Aghayan, Hamid Reza; Mahdavi-Mazdeh, Mitra; Goodarzi, Parisa; Arjmand, Babak; Emami-Razavi, Seyed Hassan

    2010-11-01

Transplantation has a long history in Iran. Cornea was the first tissue transplanted, in 1935. The Central Eye Bank of Iran was established in 1991 and the Iranian Tissue Bank (ITB) in 1994. There are now also some private cell and tissue banks in the country that produce different tissue grafts such as homograft heart valves, musculoskeletal tissues, soft tissues, cartilage, pericardium, amniotic membrane and some cell-based products. There is no separate legislation for tissue transplantation; the legal framework for tissue donation is based on the "Deceased or Brain dead patient organ transplantation" act (passed on April 6, 2000). For tissue banking there is no regulatory oversight by the national health authority. To increase the level of safety, and considering the importance of effective traceability, each tissue bank has its own policy and terminology for coding and documentation, without any correlation to the others. In some cases tissue banks have implemented ISO-based standards (i.e., ISO 9001) as a basic quality management system. PMID:20953716

  11. The Fireball integrated code package

    SciTech Connect

    Dobranich, D.; Powers, D.A.; Harper, F.T.

    1997-07-01

    Many deep-space satellites contain a plutonium heat source. An explosion, during launch, of a rocket carrying such a satellite offers the potential for the release of some of the plutonium. The fireball following such an explosion exposes any released plutonium to a high-temperature chemically-reactive environment. Vaporization, condensation, and agglomeration processes can alter the distribution of plutonium-bearing particles. The Fireball code package simulates the integrated response of the physical and chemical processes occurring in a fireball and the effect these processes have on the plutonium-bearing particle distribution. This integrated treatment of multiple phenomena represents a significant improvement in the state of the art for fireball simulations. Preliminary simulations of launch-second scenarios indicate: (1) most plutonium vaporization occurs within the first second of the fireball; (2) large non-aerosol-sized particles contribute very little to plutonium vapor production; (3) vaporization and both homogeneous and heterogeneous condensation occur simultaneously; (4) homogeneous condensation transports plutonium down to the smallest-particle sizes; (5) heterogeneous condensation precludes homogeneous condensation if sufficient condensation sites are available; and (6) agglomeration produces larger-sized particles but slows rapidly as the fireball grows.

  12. CODE's multi-GNSS orbit and clock solution

    NASA Astrophysics Data System (ADS)

    Prange, Lars; Orliac, Etienne; Dach, Rolf; Arnold, Daniel; Beutler, Gerhard; Schaer, Stefan; Jäggi, Adrian

    2015-04-01

The Center for Orbit Determination in Europe (CODE) is contributing as a global analysis center to the International GNSS Service (IGS). Since 2012 CODE participates in the "Multi-GNSS EXperiment" (MGEX), launched by the IGS as a testbed for the incorporation of new GNSS and their signals into the existing IGS processing chains and software packages. We present CODE's latest MGEX solution - a fully integrated 5-system (GPS, GLONASS, Galileo, BeiDou, QZSS) GNSS orbit and clock solution, based on data starting from January 2014. The impact of radiation pressure modeling and orbital arc length on the solution quality will be discussed. The results will be validated with satellite laser ranging (SLR), assessment of satellite clock performance, and precise point positioning (PPP). The CODE MGEX orbit and clock products are publicly available in the IGS MGEX products directory at the CDDIS data center: ftp://cddis.gsfc.nasa.gov/gnss/products/mgex (the solution ID "com" stands for CODE-MGEX). The CODE MGEX products have so far been generated only occasionally; beginning in early 2015 they are provided in a more operational way, with a delay of about two weeks.

  13. SOPHAEROS code development and its application to falcon tests

    SciTech Connect

    Lajtha, G.; Missirlian, M.; Kissane, M.

    1996-12-31

One of the key issues in source-term evaluation in nuclear reactor severe accidents is determination of the transport behavior of fission products released from the degrading core. The SOPHAEROS computer code is being developed to predict fission product transport in a mechanistic way in light water reactor circuits. These applications of the SOPHAEROS code to the Falcon experiments, among others not presented here, indicate that the numerical scheme of the code is robust, and no convergence problems are encountered. The calculation is also very fast, taking only three times longer than real time on a Sun SPARC 5 workstation and running typically approximately 10 times faster than an identical calculation with the VICTORIA code. The study demonstrates that the SOPHAEROS 1.3 code is a suitable tool for prediction of the vapor chemistry and fission product transport with a reasonable level of accuracy. Furthermore, the flexibility of the code material data bank allows improvement of the understanding of fission product transport and deposition in the circuit. Performing sensitivity studies with different chemical species or with different properties (saturation pressure, chemical equilibrium constants) is very straightforward.

  14. Patched Conic Trajectory Code

    NASA Technical Reports Server (NTRS)

    Park, Brooke Anderson; Wright, Henry

    2012-01-01

    PatCon code was developed to help mission designers run trade studies on launch and arrival times for any given planet. Initially developed in Fortran, the required inputs included launch date, arrival date, and other orbital parameters of the launch planet and arrival planets at the given dates. These parameters include the position of the planets, the eccentricity, semi-major axes, argument of periapsis, ascending node, and inclination of the planets. With these inputs, a patched conic approximation is used to determine the trajectory. The patched conic approximation divides the planetary mission into three parts: (1) the departure phase, in which the two relevant bodies are Earth and the spacecraft, and where the trajectory is a departure hyperbola with Earth at the focus; (2) the cruise phase, in which the two bodies are the Sun and the spacecraft, and where the trajectory is a transfer ellipse with the Sun at the focus; and (3) the arrival phase, in which the two bodies are the target planet and the spacecraft, where the trajectory is an arrival hyperbola with the planet as the focus.
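The departure phase of such a patched conic calculation can be illustrated with a rough Earth-to-Mars example. Circular, coplanar planetary orbits are assumed, and the constants and transfer geometry are illustrative, not PatCon's actual inputs:

```python
import math

# Departure-phase numbers under the patched conic approximation:
# a heliocentric transfer ellipse from the vis-viva equation, then the
# hyperbolic excess speed the departure hyperbola must supply at Earth.
MU_SUN = 1.32712e20          # Sun's gravitational parameter, m^3/s^2
R_EARTH = 1.496e11           # Earth orbit radius, m
R_MARS = 2.279e11            # Mars orbit radius, m

a_transfer = (R_EARTH + R_MARS) / 2.0   # transfer ellipse semi-major axis
v_perihelion = math.sqrt(MU_SUN * (2.0 / R_EARTH - 1.0 / a_transfer))
v_earth = math.sqrt(MU_SUN / R_EARTH)   # Earth's circular heliocentric speed
v_infinity = v_perihelion - v_earth     # hyperbolic excess at departure

assert 2.5e3 < v_infinity < 3.5e3       # roughly 2.9 km/s for this transfer
```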

  15. Telescope Adaptive Optics Code

    Energy Science and Technology Software Center (ESTSC)

    2005-07-28

The Telescope AO Code has general adaptive optics capabilities plus specialized models for three telescopes with either adaptive optics or active optics systems. First, it has the capability to generate either single-layer or distributed Kolmogorov turbulence phase screens using the FFT. Missing low-order spatial frequencies are added using the Karhunen-Loeve expansion. The phase structure curve is extremely close to the theoretical. Secondly, it has the capability to simulate an adaptive optics control system. The default parameters are those of the Keck II adaptive optics system. Thirdly, it has a general wave optics capability to model the science camera halo due to scintillation from atmospheric turbulence and the telescope optics. Although this capability was implemented for the Gemini telescopes, the only default parameter specific to the Gemini telescopes is the primary mirror diameter. Finally, it has a model for the LSST active optics alignment strategy. This last model is highly specific to the LSST.
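The FFT phase-screen method mentioned above can be sketched as follows. This minimal version filters white Gaussian noise with the Kolmogorov f^(-11/6) amplitude spectrum (power spectrum f^(-11/3)); r0 scaling and the Karhunen-Loeve correction of the missing low-order modes are omitted:

```python
import numpy as np

def kolmogorov_screen(n, rng):
    """Single-layer Kolmogorov phase screen via the FFT method (unscaled)."""
    fx = np.fft.fftfreq(n)
    f = np.hypot(*np.meshgrid(fx, fx))   # spatial frequency magnitude grid
    f[0, 0] = np.inf                     # suppress the undefined piston term
    amplitude = f ** (-11.0 / 6.0)       # Kolmogorov amplitude spectrum
    noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return np.fft.ifft2(noise * amplitude).real

screen = kolmogorov_screen(64, np.random.default_rng(0))
assert screen.shape == (64, 64)
assert abs(screen.mean()) < 1e-6         # piston removed: zero-mean screen
```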

  16. SCDAP/RELAP5 code development and assessment

    SciTech Connect

    Allison, C.M.; Hohorst, J.K.

    1996-03-01

The SCDAP/RELAP5 computer code is designed to describe the overall reactor coolant system thermal-hydraulic response, core damage progression, and fission product release during severe accidents. The code is being developed at the Idaho National Engineering Laboratory under the primary sponsorship of the Office of Nuclear Regulatory Research of the U.S. Nuclear Regulatory Commission. The current version of the code is SCDAP/RELAP5/MOD3.1e. Although MOD3.1e contains a number of significant improvements since the initial version of MOD3.1 was released, new models to treat the behavior of the fuel and cladding during reflood have had the most dramatic impact on the code's calculations. This paper provides a brief description of the new reflood models, presents highlights of the assessment of the current version of MOD3.1, and discusses future SCDAP/RELAP5/MOD3.2 model development activities.

  17. A novel unified coding analytical method for Internet of Things

    NASA Astrophysics Data System (ADS)

    Sun, Hong; Zhang, JianHong

    2013-08-01

This paper presents a novel unified coding analytical method for the Internet of Things, which abstracts out `displacement goods' and `physical objects' and expounds the relationship between them. It details the item coding principles, establishes a one-to-one relationship between three-dimensional spatial coordinates of points and global manufacturers, and can be expanded infinitely. The method solves the problem of unified coding across the production and circulation phases, and further explains how to update the item information corresponding to the coding in the sale and use stages, so as to meet the requirement that the Internet of Things carry out real-time monitoring and intelligent management of each item.

  18. Error coding simulations in C

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1994-01-01

When data is transmitted through a noisy channel, errors are produced within the data, rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2(exp 8)) with interleave depth of 5 as the outermost code, (7, 1/2) convolutional code as an inner code and CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The received signal-to-noise ratio for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst correction capability length is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.
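The role of interleaving can be illustrated with a generic block interleaver: symbols are written row-wise into a depth x width array and read column-wise, so a channel burst is spread across rows (and hence across RS codewords) after deinterleaving. This is a sketch of the general technique, not the simulator's actual module:

```python
def interleave(symbols, depth):
    """Write symbols row-wise into depth rows, read them out column-wise."""
    width = len(symbols) // depth
    rows = [symbols[i * width:(i + 1) * width] for i in range(depth)]
    return [rows[r][c] for c in range(width) for r in range(depth)]

def deinterleave(symbols, depth):
    """Inverse operation: write column-wise, read row-wise."""
    width = len(symbols) // depth
    cols = [symbols[i * depth:(i + 1) * depth] for i in range(width)]
    return [cols[c][r] for r in range(depth) for c in range(width)]

data = list(range(20))
assert deinterleave(interleave(data, 5), 5) == data

# A burst of 5 adjacent channel symbols lands in 5 different rows, so
# each depth-5 group contributes at most one error per deinterleaved row.
burst_positions = set(interleave(data, 5)[0:5])
assert burst_positions == {0, 4, 8, 12, 16}
```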

  19. Error coding simulations in C

    NASA Astrophysics Data System (ADS)

    Noble, Viveca K.

    1994-10-01

When data is transmitted through a noisy channel, errors are produced within the data, rendering it indecipherable. Through the use of error control coding techniques, the bit error rate can be reduced to any desired level without sacrificing the transmission data rate. The Astrionics Laboratory at Marshall Space Flight Center has decided to use a modular, end-to-end telemetry data simulator to simulate the transmission of data from flight to ground and various methods of error control. The simulator includes modules for random data generation, data compression, Consultative Committee for Space Data Systems (CCSDS) transfer frame formation, error correction/detection, error generation and error statistics. The simulator utilizes a concatenated coding scheme which includes CCSDS standard (255,223) Reed-Solomon (RS) code over GF(2(exp 8)) with interleave depth of 5 as the outermost code, (7, 1/2) convolutional code as an inner code and CCSDS recommended (n, n-16) cyclic redundancy check (CRC) code as the innermost code, where n is the number of information bits plus 16 parity bits. The received signal-to-noise ratio for a desired bit error rate is greatly reduced through the use of forward error correction techniques. Even greater coding gain is provided through the use of a concatenated coding scheme. Interleaving/deinterleaving is necessary to randomize burst errors which may appear at the input of the RS decoder. The burst correction capability length is increased in proportion to the interleave depth. The modular nature of the simulator allows for inclusion or exclusion of modules as needed. This paper describes the development and operation of the simulator, the verification of a C-language Reed-Solomon code, and the possibility of using Comdisco SPW(tm) as a tool for determining optimal error control schemes.

  20. Vision-based reading system for color-coded bar codes

    NASA Astrophysics Data System (ADS)

    Schubert, Erhard; Schroeder, Axel

    1996-02-01

Barcode systems are used to mark commodities, articles and products with price and article numbers. The advantage of barcode systems is the safe and rapid availability of the information about the product. The size of the barcode depends on the barcode system used and the resolution of the barcode scanner. Nevertheless, there is a strong correlation between the information content and the length of the barcode. To increase the information content, new 2D-barcode systems like CodaBlock or PDF-417 have been introduced. In this paper we present a different way to increase the information content of a barcode: the color coded barcode. The new color coded barcode is created by offset printing of three colored barcodes, each barcode with different information. Therefore, three times more information content can be accommodated in the area of a black printed barcode. This kind of color coding is usable with the standard 1D- and 2D-barcodes. We developed two reading devices for the color coded barcodes. First, there is a vision based system, consisting of a standard color camera and a PC-based color frame grabber. Omnidirectional barcode decoding is possible with this reading device. Second, a bi-directional handscanner was developed. Both systems use a color separation process to separate the color image of the barcodes into three independent grayscale images. In the case of the handscanner the image consists of one line only. After the color separation the three grayscale barcodes can be decoded with standard image processing methods. In principle, the color coded barcode can be used everywhere instead of the standard barcode. Typical applications of the color coded barcode are found in medical technology, stock keeping and identification of electronic modules.
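The color separation step can be sketched as a plain per-channel split of a scanned RGB line into three independent grayscale sequences, one per printed barcode; the real devices apply this to camera images, so treat this as a simplified stand-in:

```python
def separate(line_rgb):
    """Split an RGB scanline into three grayscale channel sequences."""
    r = [p[0] for p in line_rgb]
    g = [p[1] for p in line_rgb]
    b = [p[2] for p in line_rgb]
    return r, g, b

# Two illustrative pixels; each channel is then decoded as an ordinary
# one-color barcode with standard image processing methods.
scanline = [(0, 255, 128), (255, 0, 128)]
r, g, b = separate(scanline)
assert r == [0, 255] and g == [255, 0] and b == [128, 128]
```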

  1. A High-Rate Space-Time Block Code with Full Diversity

    NASA Astrophysics Data System (ADS)

    Gao, Zhenzhen; Zhu, Shihua; Zhong, Zhimeng

    A new high-rate space-time block code (STBC) with full transmit diversity gain for four transmit antennas based on a generalized Alamouti code structure is proposed. The proposed code has lower Maximum Likelihood (ML) decoding complexity than the Double ABBA scheme does. Constellation rotation is used to maximize the diversity product. With the optimal rotated constellations, the proposed code significantly outperforms some known high-rate STBCs in the literature with similar complexity and the same spectral efficiency.
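The 2 x 2 Alamouti building block that the proposed code generalizes can be checked for the orthogonality property that keeps ML decoding simple (a textbook sketch of the base structure, not the proposed four-antenna code itself):

```python
def alamouti(s1, s2):
    """2x2 Alamouti block: columns are antennas, rows are time slots."""
    return [[s1, s2],
            [-s2.conjugate(), s1.conjugate()]]

def gram(G):
    """G^H G for a complex matrix given as nested lists."""
    n = len(G[0])
    return [[sum(G[k][i].conjugate() * G[k][j] for k in range(len(G)))
             for j in range(n)] for i in range(n)]

G = alamouti(1 + 2j, 3 - 1j)
GhG = gram(G)
power = abs(1 + 2j) ** 2 + abs(3 - 1j) ** 2   # |s1|^2 + |s2|^2 = 15

# Orthogonal columns: G^H G = (|s1|^2 + |s2|^2) * I, so the two symbols
# decouple at the receiver and can be detected independently.
assert abs(GhG[0][0] - power) < 1e-9 and abs(GhG[1][1] - power) < 1e-9
assert abs(GhG[0][1]) < 1e-9 and abs(GhG[1][0]) < 1e-9
```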

  2. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    PubMed

    Pinto, Carlos; Machado, Armando

    2015-05-01

To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. PMID:25894105

  3. Intranuclear cascade with emission of light fragment code implemented in the transport code system PHITS

    NASA Astrophysics Data System (ADS)

    Sawada, Y.; Uozumi, Y.; Nogamine, S.; Yamada, T.; Iwamoto, Y.; Sato, T.; Niita, K.

    2012-11-01

    The Intranuclear Cascade with Emission of Light Fragment (INC-ELF) code has been developed and implemented in the Particle and Heavy Ion Transport code System (PHITS). The INC-ELF code explicitly includes nucleon correlations within the framework of the INC model to describe light fragment emissions from nuclear spallation reactions by using the model in Phys. Rev. C 84, (2011) 064617. In addition to the degrees of freedom of nucleons, the developed code also accounts for pions, Δs, and N∗s, and can cover energy ranges up to 3 GeV. The predictive capabilities of the ELF/PHITS system have been verified through comparison with a diverse set of experimental observations. In particular, the verification was conducted with abundant double-differential cross-section data covering a wide range of reactions (e.g., (p, p'x), (p, nx), (p, dx), (p, 3Hex), (p, αx) and (p, πx) reactions) over a wide energy range (between 400 MeV and 1.5 GeV). As a result, our ELF/PHITS code has demonstrated strong predictive capability for all of these data, although areas requiring future study remain due to the lack of experimental data on high-energy cluster production.

  4. The CRISP Code for Nuclear Reactions

    SciTech Connect

    Anefalos, S.; Deppman, A.; Silva, Gilson da; Arruda-Neto, J.D. T.; Garcia, F.

    2005-05-24

The CRISP package performs the intranuclear cascade process and the evaporation/fission competition, resulting in a code that is a good tool to describe complex characteristics of nuclear reactions. It opens the opportunity for applications in different fields, such as medical physics, photonuclear reactions, spallation or fission processes initiated by different probes, and Accelerator Driven Systems, where precise descriptions of energetic and angular neutron distributions, neutron multiplicity and spallation product information are needed. The CRISP model incorporates the time-sequence characteristics of the MCMC code and the evaporation/fission competition model of the MCEF. It also includes improvements to the code, such as the excitation of nucleonic resonances heavier than the Delta; the construction of the initial nuclear ground state according to the Fermi model and the Pauli principle; and a more realistic Pauli blocking mechanism. Some consequences of these improvements will be discussed, e.g., the absence of Pauli principle violations in the occupation numbers for single-particle bound states, and the absence of unphysical nuclear boiling. At present two other reaction channels are being included, namely the quasi-deuteron mechanism at energies between 40 MeV and 140 MeV, and the photon hadronization process, which gives rise to the shadowing effect. With these modifications it will be possible to use the CRISP code at energies above 40 MeV up to a few GeV, not only for reactions initiated by protons and neutrons but also by photons. We will describe some of the consequences of these modifications and present some results to illustrate the possible applications of this package, mainly those related to spallation processes involving high-energy protons.

  5. Indices for Testing Neural Codes

    PubMed Central

    Victor, Jonathan D.; Nirenberg, Sheila

    2009-01-01

    One of the most critical challenges in systems neuroscience is determining the neural code. A principled framework for addressing this can be found in information theory. With this approach, one can determine whether a proposed code can account for the stimulus-response relationship. Specifically, one can compare the transmitted information between the stimulus and the hypothesized neural code with the transmitted information between the stimulus and the behavioral response. If the former is smaller than the latter (i.e., if the code cannot account for the behavior), the code can be ruled out. The information-theoretic index most widely used in this context is Shannon’s mutual information. The Shannon test, however, is not ideal for this purpose: while the codes it will rule out are truly nonviable, there will be some nonviable codes that it will fail to rule out. Here we describe a wide range of alternative indices that can be used for ruling codes out. The range includes a continuum from Shannon information to measures of the performance of a Bayesian decoder. We analyze the relationship of these indices to each other and their complementary strengths and weaknesses for addressing this problem. PMID:18533812
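Shannon's mutual information, the index discussed above, can be computed directly from a joint stimulus-response distribution; a candidate code with I(stimulus; code) below I(stimulus; behavior) is ruled out. The distributions below are toy examples:

```python
import math

def mutual_information(joint):
    """I(S;R) in bits from a joint probability table p(s, r):
    I = sum p(s,r) * log2( p(s,r) / (p(s) p(r)) )."""
    ps = [sum(row) for row in joint]            # marginal over stimuli
    pr = [sum(col) for col in zip(*joint)]      # marginal over responses
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p in enumerate(row):
            if p > 0:
                mi += p * math.log2(p / (ps[i] * pr[j]))
    return mi

# Independent variables carry zero information; a perfect two-symbol
# correspondence carries exactly 1 bit.
assert abs(mutual_information([[0.25, 0.25], [0.25, 0.25]])) < 1e-12
assert abs(mutual_information([[0.5, 0.0], [0.0, 0.5]]) - 1.0) < 1e-12
```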

  6. Accelerator Physics Code Web Repository

    SciTech Connect

    Zimmermann, F.; Basset, R.; Bellodi, G.; Benedetto, E.; Dorda, U.; Giovannozzi, M.; Papaphilippou, Y.; Pieloni, T.; Ruggiero, F.; Rumolo, G.; Schmidt, F.; Todesco, E.; Zotter, B.W.; Payet, J.; Bartolini, R.; Farvacque, L.; Sen, T.; Chin, Y.H.; Ohmi, K.; Oide, K.; Furman, M.; /LBL, Berkeley /Oak Ridge /Pohang Accelerator Lab. /SLAC /TRIUMF /Tech-X, Boulder /UC, San Diego /Darmstadt, GSI /Rutherford /Brookhaven

    2006-10-24

    In the framework of the CARE HHH European Network, we have developed a web-based dynamic accelerator-physics code repository. We describe the design, structure and contents of this repository, illustrate its usage, and discuss our future plans, with emphasis on code benchmarking.

  7. ACCELERATION PHYSICS CODE WEB REPOSITORY.

    SciTech Connect

    WEI, J.

    2006-06-26

    In the framework of the CARE HHH European Network, we have developed a web-based dynamic accelerator-physics code repository. We describe the design, structure and contents of this repository, illustrate its usage, and discuss our future plans, with emphasis on code benchmarking.

  8. Video coding with dynamic background

    NASA Astrophysics Data System (ADS)

    Paul, Manoranjan; Lin, Weisi; Lau, Chiew Tong; Lee, Bu-Sung

    2013-12-01

Motion estimation (ME) and motion compensation (MC) using variable block size, sub-pixel search, and multiple reference frames (MRFs) are the major reasons for improved coding performance of the H.264 video coding standard over other contemporary coding standards. The concept of MRFs is suitable for repetitive motion, uncovered background, non-integer pixel displacement, lighting change, etc. The requirement of index codes of the reference frames, computational time in ME & MC, and memory buffer for coded frames limits the number of reference frames used in practical applications. In typical video sequences, the previous frame is used as a reference frame in 68-92% of cases. In this article, we propose a new video coding method using a reference frame [i.e., the most common frame in scene (McFIS)] generated by dynamic background modeling. McFIS is more effective in terms of rate-distortion and computational time performance compared to the MRF techniques. It also has an inherent capability for scene change detection (SCD), used for adaptive group-of-pictures (GOP) size determination. As a result, we integrate SCD (for GOP determination) with reference frame generation. The experimental results show that the proposed coding scheme outperforms H.264 video coding with five reference frames and the two relevant state-of-the-art algorithms by 0.5-2.0 dB with less computational time.
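A minimal stand-in for the dynamic background modeling that produces a McFIS-like reference frame can be sketched with a per-pixel running average; the article's actual model is more elaborate, so this only illustrates the idea of accumulating a stable background across frames:

```python
def update_background(background, frame, alpha=0.05):
    """Per-pixel exponential moving average toward the newest frame."""
    return [(1 - alpha) * b + alpha * f for b, f in zip(background, frame)]

# Two pixels across four frames: one static, one with dynamic content.
frames = [[10.0, 200.0], [10.0, 50.0], [10.0, 210.0], [10.0, 60.0]]
bg = frames[0]
for frame in frames[1:]:
    bg = update_background(bg, frame)

# The static pixel stays put; the dynamic pixel settles on a blend,
# giving a stable reference for pixels the latest frame cannot predict.
assert abs(bg[0] - 10.0) < 1e-9
assert 50.0 < bg[1] < 200.0
```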

  9. LFSC - Linac Feedback Simulation Code

    SciTech Connect

    Ivanov, Valentin; /Fermilab

    2008-05-01

The computer program LFSC is a numerical tool for simulation of beam-based feedback in high performance linacs. The code LFSC is based on an earlier version developed by a collective of authors at SLAC (L. Hendrickson, R. McEwen, T. Himel, H. Shoaee, S. Shah, P. Emma, P. Schultz) during 1990-2005. That code was successively used in simulations of the SLC, TESLA, CLIC and NLC projects. It can simulate both pulse-to-pulse feedback on timescales corresponding to 5-100 Hz and slower feedbacks operating in the 0.1-1 Hz range in the Main Linac and Beam Delivery System. The code LFSC runs under Matlab for the MS Windows operating system. It contains about 30,000 lines of source code in more than 260 subroutines. The code uses LIAR ('Linear Accelerator Research code') for particle tracking under ground motion and technical noise perturbations. It uses the Guinea Pig code to simulate the luminosity performance. A set of input files includes the lattice description (XSIF format) and plain text files with numerical parameters, wake fields, ground motion data, etc. The Matlab environment provides a flexible system for graphical output.

  10. Computer algorithm for coding gain

    NASA Technical Reports Server (NTRS)

    Dodd, E. E.

    1974-01-01

    Development of a computer algorithm for coding gain for use in an automated communications link design system. Using an empirical formula which defines coding gain as used in space communications engineering, an algorithm is constructed on the basis of available performance data for nonsystematic convolutional encoding with soft-decision (eight-level) Viterbi decoding.
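The notion of coding gain used here, the reduction in required Eb/N0 (in dB) at a target bit error rate, can be sketched as follows. The uncoded BPSK requirement is found from the exact Q-function expression by bisection; the coded requirement is an assumed illustrative value, not actual Viterbi-decoder performance data:

```python
import math

def q_function(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def required_ebno_db_uncoded(target_ber):
    """Eb/N0 (dB) where uncoded BPSK, BER = Q(sqrt(2 Eb/N0)), hits target."""
    lo, hi = 0.0, 20.0
    for _ in range(100):                 # bisection: BER decreases with Eb/N0
        mid = (lo + hi) / 2.0
        ebno = 10.0 ** (mid / 10.0)
        if q_function(math.sqrt(2.0 * ebno)) > target_ber:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

coded_requirement_db = 4.4               # assumed tabulated value at BER 1e-5
gain_db = required_ebno_db_uncoded(1e-5) - coded_requirement_db
assert 5.0 < gain_db < 6.0               # uncoded BPSK needs ~9.6 dB here
```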

  11. Strongly Secure Linear Network Coding

    NASA Astrophysics Data System (ADS)

    Harada, Kunihiko; Yamamoto, Hirosuke

In a network with capacity h for multicast, information Xh=(X1, X2, …, Xh) can be transmitted from a source node to sink nodes without error by a linear network code. Furthermore, secret information Sr=(S1, S2, …, Sr) can be transmitted securely against wiretappers by k-secure network coding for k≤h-r. In this case, no information of the secret leaks out even if an adversary wiretaps k edges, i.e., channels. However, if an adversary wiretaps k+1 edges, some Si may leak out explicitly. In this paper, we propose strongly k-secure network coding based on strongly secure ramp secret sharing schemes. In this coding, no information leaks out for every (Si1, Si2, …,Sir-j) even if an adversary wiretaps k+j channels. We also give an algorithm to construct a strongly k-secure network code directly and a transform to convert a nonsecure network code to a strongly k-secure network code. Furthermore, some sufficient conditions of alphabet size to realize the strongly k-secure network coding are derived for the case of k
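The core masking idea behind k-secure coding can be sketched for the smallest case (h=2, r=1, k=1): the secret is XORed with an independent uniform key and the two shares travel on different edges, so either single edge alone is uniformly random and a wiretapper on one edge learns nothing, while the sink recovers the secret by XORing both:

```python
import secrets

def encode(secret_byte):
    """Produce the two edge payloads: masked secret and the random key."""
    key = secrets.randbelow(256)         # uniform, independent of the secret
    return secret_byte ^ key, key

def decode(edge1, edge2):
    """The sink combines both edges to recover the secret."""
    return edge1 ^ edge2

s = 0xA7
e1, e2 = encode(s)
assert decode(e1, e2) == s               # sink sees both edges: recovery works
```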

  12. QPhiX Code Generator

    Energy Science and Technology Software Center (ESTSC)

    2014-09-16

A simple code generator to generate the low-level code kernels used by the QPhiX Library for Lattice QCD. It generates kernels for the Wilson-Dslash and Wilson-Clover operators, and can be reused to write other optimized kernels for Intel Xeon Phi(tm), Intel Xeon(tm), and potentially other architectures.
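The flavor of such a kernel generator can be sketched as a template that emits specialized, unrolled C source; the real generator targets vector intrinsics for specific architectures, so this toy axpy generator is only an illustration of the approach:

```python
def generate_axpy_kernel(name, unroll):
    """Emit C source for an axpy loop unrolled by the given factor.
    (Toy example; n is assumed to be a multiple of the unroll factor.)"""
    body = "\n".join(
        f"    y[i + {k}] += a * x[i + {k}];" for k in range(unroll)
    )
    return (
        f"void {name}(int n, float a, const float *x, float *y) {{\n"
        f"  for (int i = 0; i < n; i += {unroll}) {{\n"
        f"{body}\n"
        f"  }}\n"
        f"}}\n"
    )

src = generate_axpy_kernel("axpy4", 4)
assert "i += 4" in src and src.count("y[i +") == 4
```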

  13. Using NAEYC's Code of Ethics.

    ERIC Educational Resources Information Center

    Young Children, 1995

    1995-01-01

    Considers how to deal with an ethical dilemma concerning a caregiver's dislike for a child. Recognizes that no statement in NAEYC's Code of Ethical Conduct requires that a professional must like each child, and presents some ideals and principles from the code that may guide professionals through similar situations. (BAC)

  14. Cracking the bioelectric code

    PubMed Central

    Tseng, AiSun; Levin, Michael

    2013-01-01

    Patterns of resting potential in non-excitable cells of living tissue are now known to be instructive signals for pattern formation during embryogenesis, regeneration and cancer suppression. The development of molecular-level techniques for tracking ion flows and functionally manipulating the activity of ion channels and pumps has begun to reveal the mechanisms by which voltage gradients regulate cell behaviors and the assembly of complex large-scale structures. A recent paper demonstrated that a specific voltage range is necessary for demarcation of eye fields in the frog embryo. Remarkably, artificially setting other somatic cells to the eye-specific voltage range resulted in formation of eyes in aberrant locations, including tissues that are not in the normal anterior ectoderm lineage: eyes could be formed in the gut, on the tail, or in the lateral plate mesoderm. These data challenge the existing models of eye fate restriction and tissue competence maps, and suggest the presence of a bioelectric code—a mapping of physiological properties to anatomical outcomes. This Addendum summarizes the current state of knowledge in developmental bioelectricity, proposes three possible interpretations of the bioelectric code that functionally maps physiological states to anatomical outcomes, and highlights the biggest open questions in this field. We also suggest a speculative hypothesis at the intersection of cognitive science and developmental biology: that bioelectrical signaling among non-excitable cells coupled by gap junctions simulates neural network-like dynamics, and underlies the information processing functions required by complex pattern formation in vivo. Understanding and learning to control the information stored in physiological networks will have transformative implications for developmental biology, regenerative medicine and synthetic bioengineering. PMID:23802040

  15. Coding polymorphism for phylogeny reconstruction.

    PubMed

    Kornet, D J; Turner, H

    1999-06-01

    The methodology of coding polymorphic taxa has received limited attention to date. A search of the taxonomic literature revealed seven types of coding methods. Apart from ignoring polymorphic characters (sometimes called the fixed-only method), two main categories can be distinguished: methods that identify the start of a new character state with the origin of an evolutionary novelty, and methods that identify the new state with the fixation of a novelty. The methods of the first category introduce soft reversals, yielding signals that support cladograms incompatible with true phylogenies. We conclude that coding the plesiomorphy is the method to be preferred, unless the ancestral state is unknown, in which case coding as ambiguous is recommended. This holds for coding polymorphism in species as well as in supraspecific taxa. In this light we remark on methods proposed by previous authors. PMID:12066713

  16. Best practices for code release

    NASA Astrophysics Data System (ADS)

    Berriman, G. Bruce

    2016-01-01

In this talk, I want to describe what I think are the best practices for releasing code and having it adopted by end users. Make sure your code is licensed, so users will know how the software can be used and modified, and place your code in a public repository (and make sure that you follow institutional policies in doing this). Yet licensing and releasing code are not enough: the code must be organized and documented so users can understand what it does, what its limitations are, and how to build and use it. I will describe what I think are best practices in developing the content to support release, including tutorials, design documents, specifications of interfaces and so on. Much of what I have learned is based on ten years of experience in supporting releases of the Montage Image Mosaic Engine.

  17. The detection and extraction of interleaved code segments

    NASA Technical Reports Server (NTRS)

    Rugaber, Spencer; Stirewalt, Kurt; Wills, Linda M.

    1995-01-01

This project is concerned with a specific difficulty that arises when trying to understand and modify computer programs. In particular, it is concerned with the phenomenon of 'interleaving', in which one section of a program accomplishes several purposes, and disentangling the code responsible for each purpose is difficult. Unraveling interleaved code involves discovering the purpose of each strand of computation, as well as understanding why the programmer decided to interleave the strands. Increased understanding improves the productivity and quality of software maintenance, enhancement, and documentation activities. It is the goal of the project to characterize the phenomenon of interleaving as a prerequisite for building tools to detect and extract interleaved code fragments.

  18. High Energy Particle Transport Code System.

    Energy Science and Technology Software Center (ESTSC)

    2003-12-17

Version 00 NMTC/JAM is an upgraded version of the code CCC-694/NMTC-JAERI97, which was developed in 1982 at JAERI and is based on the CCC-161/NMTC code system. NMTC/JAM simulates high energy nuclear reactions and nucleon-meson transport processes. The applicable energy range of NMTC/JAM was extended in principle up to 200 GeV for nucleons and mesons by introducing the high energy nuclear reaction code Jet-AA Microscopic (JAM) for the intra-nuclear cascade part. For the evaporation and fission process, a new model, GEM, can be used to describe the light nucleus production from the excited residual nucleus. According to the extension of the applicable energy, the nucleon-nucleus non-elastic, elastic and differential elastic cross section data were upgraded. In addition, particle transport in a magnetic field was implemented for beam transport calculations. Some new tally functions were added, and the format of input and output data is more user friendly. These new calculation functions and utilities provide a tool to carry out a reliable neutronics study of a large scale target system with complex geometry more accurately and easily than with the previous model. It implements an intranuclear cascade model taking account of the in-medium nuclear effects and a preequilibrium calculation model based on the exciton one. For treating the nucleon transport process, the nucleon-nucleus cross sections are revised to those derived by the systematics of Pearlstein. Moreover, the level density parameter derived by Ignatyuk is included as a new option for the particle evaporation calculation. A geometry package based on Combinatorial Geometry with a multi-array system and the importance sampling technique is implemented in the code. A tally function is also employed for obtaining such physical quantities as neutron energy spectra, heat deposition and nuclide yield without editing a history file. The code can simulate both the primary spallation reaction and the

  19. Code-modulated interferometric imaging system using phased arrays

    NASA Astrophysics Data System (ADS)

    Chauhan, Vikas; Greene, Kevin; Floyd, Brian

    2016-05-01

Millimeter-wave (mm-wave) imaging provides compelling capabilities for security screening, navigation, and biomedical applications. Traditional scanned or focal-plane mm-wave imagers are bulky and costly. In contrast, phased-array hardware developed for mass-market wireless communications and automotive radar promise to be extremely low cost. In this work, we present techniques which can allow low-cost phased-array receivers to be reconfigured or re-purposed as interferometric imagers, removing the need for custom hardware and thereby reducing cost. Since traditional phased arrays power combine incoming signals prior to digitization, orthogonal code-modulation is applied to each incoming signal using phase shifters within each front-end and two-bit codes. These code-modulated signals can then be combined and processed coherently through a shared hardware path. Once digitized, visibility functions can be recovered through squaring and code-demultiplexing operations. Provided that codes are selected such that the product of two orthogonal codes is a third unique and orthogonal code, it is possible to demultiplex complex visibility functions directly. As such, the proposed system modulates incoming signals but demodulates desired correlations. In this work, we present the operation of the system, a validation of its operation using behavioral models of a traditional phased array, and a benchmarking of the code-modulated interferometer against traditional interferometer and focal-plane arrays.
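    The property the abstract leans on, that the product of two orthogonal codes is a third unique orthogonal code, holds for Walsh-Hadamard codes: the element-wise product of codes i and j is code i XOR j. The following toy Python sketch (constant signal amplitudes, no noise, and not the authors' behavioral model) code-modulates two signals, squares the combined output, and demultiplexes the cross-correlation term:

```python
def walsh(n, length):
    """n-th Walsh code of a power-of-two length: +/-1 from the parity of n & i."""
    return [1 - 2 * (bin(n & i).count('1') % 2) for i in range(length)]

L = 8
c1, c2 = walsh(1, L), walsh(2, L)
c3 = [a * b for a, b in zip(c1, c2)]        # element-wise product
assert c3 == walsh(3, L)                    # product of codes 1 and 2 is code 1 ^ 2 = 3

# Two antenna signals with constant amplitudes, each tagged with its own code,
# then power-combined into one shared hardware path.
a1, a2 = 2.0, 3.0
combined = [a1 * x + a2 * y for x, y in zip(c1, c2)]
squared = [s * s for s in combined]         # squaring yields the cross term 2*a1*a2*c1*c2
# Demultiplex the correlation by projecting the squared output onto c3 = c1*c2;
# the a1^2 + a2^2 self-terms vanish because any nonzero Walsh code sums to zero.
vis = sum(s * c for s, c in zip(squared, c3)) / L
print(vis)  # → 12.0  (= 2 * a1 * a2)
```

Note that the system really does what the abstract says: the incoming signals are modulated, but it is the desired correlation (here 2*a1*a2) that is demodulated.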

  20. BASS Code Development

    NASA Technical Reports Server (NTRS)

    Sawyer, Scott

    2004-01-01

The BASS computational aeroacoustic code solves the fully nonlinear Euler equations in the time domain in two dimensions. The acoustic response of the stator is determined simultaneously for the first three harmonics of the convected vortical gust of the rotor. The spatial mode generation, propagation and decay characteristics are predicted by assuming the acoustic field away from the stator can be represented as a uniform flow with small harmonic perturbations superimposed. The computed field is then decomposed using a joint temporal-spatial transform to determine the wave amplitudes as a function of rotor harmonic and spatial mode order. This report details the following technical aspects of the computations and analysis: 1) the BASS computational technique; 2) the application of periodic time shifted boundary conditions; 3) the linear theory aspects unique to rotor-stator interactions; and 4) the joint spatial-temporal transform. The computational results presented herein are twofold. In each case, the acoustic response of the stator is determined simultaneously for the first three harmonics of the convected vortical gust of the rotor. The fan under consideration here, like modern fans, is cut-off at BPF, and propagating acoustic waves are only expected at 2BPF and 3BPF. In the first case, the computations showed excellent agreement with linear theory predictions. The frequency and spatial mode order of the acoustic field were computed and found consistent with linear theory. Further, the propagation of the generated modes was also correctly predicted. The upstream going waves propagated from the domain without reflection from the inflow boundary. However, reflections from the outflow boundary were noticed. The amplitude of the reflected wave was approximately 5% of the incident wave. The second set of computations was used to determine the influence of steady loading on the generated noise. 
Toward this end, the acoustic response was determined with three steady loading

  1. Applications. Self-Checking Codes--An Application of Modular Arithmetic.

    ERIC Educational Resources Information Center

    Wood, Eric F.

    1987-01-01

    Two relevant applications of mathematics and computer science are presented: International Standard Book Numbers (ISBN) that appear in textbooks and universal product codes that appear on grocery products. A computer program is included. (MNS)
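    Both applications mentioned in this record rest on the same modular-arithmetic idea: a weighted digit sum must vanish modulo a fixed base. The following Python snippet (an illustration, not the program from the article) validates an ISBN-10, whose weighted sum must be divisible by 11, and a 12-digit UPC-A, whose weighted sum must be divisible by 10:

```python
def isbn10_valid(isbn: str) -> bool:
    """ISBN-10 check: digits weighted 10..1 must sum to 0 mod 11 ('X' = 10)."""
    digits = [10 if c == 'X' else int(c) for c in isbn.replace('-', '')]
    if len(digits) != 10:
        return False
    return sum(w * d for w, d in zip(range(10, 0, -1), digits)) % 11 == 0

def upc_valid(upc: str) -> bool:
    """UPC-A check: odd positions weighted 3, sum must be 0 mod 10."""
    digits = [int(c) for c in upc]
    if len(digits) != 12:
        return False
    return sum(d * (3 if i % 2 == 0 else 1) for i, d in enumerate(digits)) % 10 == 0

print(isbn10_valid("0-306-40615-2"), upc_valid("036000291452"))  # → True True
```

A single mistyped digit changes the weighted sum by a nonzero amount modulo the base, so either check catches it, which is exactly what makes these codes self-checking.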

  2. Coded communications with nonideal interleaving

    NASA Astrophysics Data System (ADS)

    Laufer, Shaul

    1991-02-01

Burst error channels - a type of block interference channel - feature increasing capacity but decreasing cutoff rate as the memory rate increases. Despite the large capacity, there is degradation in the performance of practical coding schemes when the memory length is excessive. A short-coding error parameter (SCEP) was introduced, which expresses a bound on the average decoding-error probability for codes shorter than the block interference length. The performance of a coded slow frequency-hopping communication channel is analyzed for worst-case partial band jamming and nonideal interleaving, by deriving expressions for the capacity and cutoff rate. The capacity and cutoff rate, respectively, are shown to approach and depart from those of a memoryless channel corresponding to the transmission of a single code letter per hop. For multiaccess communications over a slot-synchronized collision channel without feedback, the channel was considered as a block interference channel with memory length equal to the number of letters transmitted in each slot. The effects of an asymmetrical background noise and a reduced collision error rate were studied, as aspects of real communications. The performance of specific convolutional and Reed-Solomon codes was examined for slow frequency-hopping systems with nonideal interleaving. An upper bound is presented for the performance of a Viterbi decoder for a convolutional code with nonideal interleaving, and a soft decision diversity combining technique is introduced.
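    The nonideal-interleaving trade-off studied above is easiest to appreciate against the ideal case: a rectangular block interleaver spreads a channel burst across codewords so that each codeword sees at most one error. A minimal Python sketch (a hypothetical 4x4 block, not taken from the paper):

```python
def interleave(bits, rows, cols):
    """Write row-by-row into a rows x cols array, read out column-by-column."""
    return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(bits, rows, cols):
    """Inverse permutation: channel symbol i came from row i % rows, column i // rows."""
    out = [None] * (rows * cols)
    for i, b in enumerate(bits):
        c, r = divmod(i, rows)
        out[r * cols + c] = b
    return out

# Four 4-symbol codewords, one per row; a 4-symbol channel burst lands in
# four different rows after deinterleaving, so each codeword sees one error.
data = list(range(16))
tx = interleave(data, rows=4, cols=4)
tx[4:8] = ['X'] * 4                      # burst of 4 consecutive channel errors
rx = deinterleave(tx, rows=4, cols=4)
codewords = [rx[i * 4:(i + 1) * 4] for i in range(4)]
print([row.count('X') for row in codewords])  # → [1, 1, 1, 1]
```

Interleaving is "nonideal" in the paper's sense when the block is shorter than the channel memory, in which case more than one burst symbol can fall into the same codeword.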

  3. ETR/ITER systems code

    SciTech Connect

    Barr, W.L.; Bathke, C.G.; Brooks, J.N.; Bulmer, R.H.; Busigin, A.; DuBois, P.F.; Fenstermacher, M.E.; Fink, J.; Finn, P.A.; Galambos, J.D.; Gohar, Y.; Gorker, G.E.; Haines, J.R.; Hassanein, A.M.; Hicks, D.R.; Ho, S.K.; Kalsi, S.S.; Kalyanam, K.M.; Kerns, J.A.; Lee, J.D.; Miller, J.R.; Miller, R.L.; Myall, J.O.; Peng, Y-K.M.; Perkins, L.J.; Spampinato, P.T.; Strickler, D.J.; Thomson, S.L.; Wagner, C.E.; Willms, R.S.; Reid, R.L.

    1988-04-01

    A tokamak systems code capable of modeling experimental test reactors has been developed and is described in this document. The code, named TETRA (for Tokamak Engineering Test Reactor Analysis), consists of a series of modules, each describing a tokamak system or component, controlled by an optimizer/driver. This code development was a national effort in that the modules were contributed by members of the fusion community and integrated into a code by the Fusion Engineering Design Center. The code has been checked out on the Cray computers at the National Magnetic Fusion Energy Computing Center and has satisfactorily simulated the Tokamak Ignition/Burn Experimental Reactor II (TIBER) design. A feature of this code is the ability to perform optimization studies through the use of a numerical software package, which iterates prescribed variables to satisfy a set of prescribed equations or constraints. This code will be used to perform sensitivity studies for the proposed International Thermonuclear Experimental Reactor (ITER). 22 figs., 29 tabs.

  4. Coding design for error correcting output codes based on perceptron

    NASA Astrophysics Data System (ADS)

    Zhou, Jin-Deng; Wang, Xiao-Dan; Zhou, Hong-Jian; Cui, Yong-Hua; Jing, Sun

    2012-05-01

It is known that error-correcting output codes (ECOC) are a common way to model multiclass classification problems, in which encoding based on the data is attracting more and more attention. We propose a method for learning ECOC with the help of a single-layered perceptron neural network. To achieve this goal, the code elements of the ECOC are mapped to the weights of the network for the given decoding strategy, and an objective function with constrained weights is used as the cost function of the network. After training, we obtain a coding matrix containing many subgroups of classes. Experimental results on artificial data and University of California Irvine (UCI) datasets, with a logistic linear classifier and a support vector machine as the binary learners, show that our scheme provides better classification performance with a shorter coding matrix than other state-of-the-art encoding strategies.
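    Whatever encoding strategy produces the matrix, the ECOC decoding side works the same way: each class owns a row of the coding matrix, each binary learner predicts one column, and the class whose codeword is nearest in Hamming distance wins. A toy Python sketch with a made-up 4-class matrix (the matrix and learner outputs are illustrative, not from the paper):

```python
def ecoc_decode(matrix, predictions):
    """Pick the class whose codeword row is nearest in Hamming distance
    to the vector of binary-learner outputs."""
    dists = [sum(p != m for p, m in zip(predictions, row)) for row in matrix]
    return dists.index(min(dists))

# Hypothetical 4-class coding matrix, one column per binary learner.
M = [
    [0, 0, 1, 1, 0, 1],
    [1, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 1],
    [1, 1, 1, 0, 0, 0],
]
# One learner misfires (column 3), yet class 2 is still recovered.
print(ecoc_decode(M, [0, 1, 0, 1, 1, 1]))  # → 2
```

The error-correcting power comes from row separation: the larger the minimum Hamming distance between rows, the more learner mistakes the decoding can absorb.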

  5. PANEL CODE FOR PLANAR CASCADES

    NASA Technical Reports Server (NTRS)

    Mcfarland, E. R.

    1994-01-01

    The Panel Code for Planar Cascades was developed as an aid for the designer of turbomachinery blade rows. The effective design of turbomachinery blade rows relies on the use of computer codes to model the flow on blade-to-blade surfaces. Most of the currently used codes model the flow as inviscid, irrotational, and compressible with solutions being obtained by finite difference or finite element numerical techniques. While these codes can yield very accurate solutions, they usually require an experienced user to manipulate input data and control parameters. Also, they often limit a designer in the types of blade geometries, cascade configurations, and flow conditions that can be considered. The Panel Code for Planar Cascades accelerates the design process and gives the designer more freedom in developing blade shapes by offering a simple blade-to-blade flow code. Panel, or integral equation, solution techniques have been used for several years by external aerodynamicists who have developed and refined them into a primary design tool of the aircraft industry. The Panel Code for Planar Cascades adapts these same techniques to provide a versatile, stable, and efficient calculation scheme for internal flow. The code calculates the compressible, inviscid, irrotational flow through a planar cascade of arbitrary blade shapes. Since the panel solution technique is for incompressible flow, a compressibility correction is introduced to account for compressible flow effects. The analysis is limited to flow conditions in the subsonic and shock-free transonic range. Input to the code consists of inlet flow conditions, blade geometry data, and simple control parameters. Output includes flow parameters at selected control points. This program is written in FORTRAN IV for batch execution and has been implemented on an IBM 370 series computer with a central memory requirement of approximately 590K of 8 bit bytes. This program was developed in 1982.

  6. State building energy codes status

    SciTech Connect

    1996-09-01

This document contains the State Building Energy Codes Status prepared by Pacific Northwest National Laboratory for the U.S. Department of Energy under Contract DE-AC06-76RL01830 and dated September 1996. The U.S. Department of Energy's Office of Codes and Standards has developed this document to provide an information resource for individuals interested in energy efficiency of buildings and the relevant building energy codes in each state and U.S. territory. This is considered to be an evolving document and will be updated twice a year. In addition, special state updates will be issued as warranted.

  7. Understanding the Code: upholding dignity.

    PubMed

    Griffith, Richard

    2015-04-01

    The Nursing and Midwifery Council, the statutory professional regulator for registered district nurses, has introduced a revised code of standards that came into effect on 31 March 2015. The Code makes clear that while district nurses can interpret the values and principles for use in community settings, the standards are not negotiable or discretionary. They must be applied, otherwise the district nurse's fitness to practice will be called into question. In the second of a series of articles analysing the legal implications of the Code on district nurse practice, the author considers the first standard, which requires district nurses to treat people as individuals and to uphold their dignity. PMID:25839879

  8. A Radiation Shielding Code for Spacecraft and Its Validation

    NASA Technical Reports Server (NTRS)

    Shinn, J. L.; Cucinotta, F. A.; Singleterry, R. C.; Wilson, J. W.; Badavi, F. F.; Badhwar, G. D.; Miller, J.; Zeitlin, C.; Heilbronn, L.; Tripathi, R. K.

    2000-01-01

    The HZETRN code, which uses a deterministic approach pioneered at NASA Langley Research Center, has been developed over the past decade to evaluate the local radiation fields within sensitive materials (electronic devices and human tissue) on spacecraft in the space environment. The code describes the interactions of shield materials with the incident galactic cosmic rays, trapped protons, or energetic protons from solar particle events in free space and low Earth orbit. The content of incident radiations is modified by atomic and nuclear reactions with the spacecraft and radiation shield materials. High-energy heavy ions are fragmented into less massive reaction products, and reaction products are produced by direct knockout of shield constituents or from de-excitation products. An overview of the computational procedures and database which describe these interactions is given. Validation of the code with recent Monte Carlo benchmarks, and laboratory and flight measurement is also included.

  9. Coding and traceability for cells, tissues and organs for transplantation.

    PubMed

    Strong, D Michael; Shinozaki, Naoshi

    2010-11-01

    Modern transplantation of cells, tissues and organs has been practiced within the last century achieving both life saving and enhancing results. Associated risks have been recognized including infectious disease transmission, malignancy, immune mediated disease and graft failure. This has resulted in establishment of government regulation, professional standard setting and establishment of vigilance and surveillance systems for early detection and prevention and to improve patient safety. The increased transportation of grafts across national boundaries has made traceability difficult and sometimes impossible. Experience during the first Gulf War with mis-identification of blood units coming from multiple countries without standardized coding and labeling has led international organizations to develop standardized nomenclature and coding for blood. Following this example, cell therapy and tissue transplant practitioners have also moved to standardization of coding systems. Establishment of an international coding system has progressed rapidly and implementation for blood has demonstrated multiple advantages. WHO has held two global consultations on human cells and tissues for transplantation, which recognized the global circulation of cells and tissues and growing commercialization and the need for means of coding to identify tissues and cells used in transplantation, are essential for full traceability. There is currently a wide diversity in the identification and coding of tissue and cell products. For tissues, with a few exceptions, product terminology has not been standardized even at the national level. Progress has been made in blood and cell therapies with a slow and steady trend towards implementation of the international code ISBT 128. Across all fields, there are now 3,700 licensed facilities in 66 countries. 
Efforts are necessary to encourage the introduction of a standardized international coding system for donation identification numbers, such as ISBT

  10. Coding and traceability for cells, tissues and organs for transplantation

    PubMed Central

    Shinozaki, Naoshi

    2010-01-01

Modern transplantation of cells, tissues and organs has been practiced within the last century achieving both life saving and enhancing results. Associated risks have been recognized including infectious disease transmission, malignancy, immune mediated disease and graft failure. This has resulted in establishment of government regulation, professional standard setting and establishment of vigilance and surveillance systems for early detection and prevention and to improve patient safety. The increased transportation of grafts across national boundaries has made traceability difficult and sometimes impossible. Experience during the first Gulf War with mis-identification of blood units coming from multiple countries without standardized coding and labeling has led international organizations to develop standardized nomenclature and coding for blood. Following this example, cell therapy and tissue transplant practitioners have also moved to standardization of coding systems. Establishment of an international coding system has progressed rapidly and implementation for blood has demonstrated multiple advantages. WHO has held two global consultations on human cells and tissues for transplantation, which recognized the global circulation of cells and tissues and growing commercialization and the need for means of coding to identify tissues and cells used in transplantation, are essential for full traceability. There is currently a wide diversity in the identification and coding of tissue and cell products. For tissues, with a few exceptions, product terminology has not been standardized even at the national level. Progress has been made in blood and cell therapies with a slow and steady trend towards implementation of the international code ISBT 128. Across all fields, there are now 3,700 licensed facilities in 66 countries. 
Efforts are necessary to encourage the introduction of a standardized international coding system for donation identification numbers, such as ISBT

  11. Fatigue analysis codes for WECS components

    SciTech Connect

    Sutherland, H.J.; Ashwill, T.D.; Naassan, K.A.

    1987-10-01

This manuscript discusses two numerical techniques, the LIFE and LIFE2 codes, that analyze the fatigue life of WECS components. The LIFE code is a PC-compatible Basic code that analyzes the fatigue life of a VAWT component. The LIFE2 code is a PC-compatible Fortran code that relaxes the rather restrictive assumptions of the LIFE code and permits the analysis of the fatigue life of all WECS components. Also, the modular format of the LIFE2 code permits the code to be revised, with minimal effort, to include additional analyses while maintaining its integrity. To illustrate the use of the codes, an example problem is presented. 10 refs.

  12. Enhancements to the STAGS computer code

    NASA Technical Reports Server (NTRS)

    Rankin, C. C.; Stehlin, P.; Brogan, F. A.

    1986-01-01

    The power of the STAGS family of programs was greatly enhanced. Members of the family include STAGS-C1 and RRSYS. As a result of improvements implemented, it is now possible to address the full collapse of a structural system, up to and beyond critical points where its resistance to the applied loads vanishes or suddenly changes. This also includes the important class of problems where a multiplicity of solutions exists at a given point (bifurcation), and where until now no solution could be obtained along any alternate (secondary) load path with any standard production finite element code.

  13. Bandwidth efficient coding for satellite communications

    NASA Astrophysics Data System (ADS)

    Lin, Shu; Costello, Daniel J., Jr.; Miller, Warner H.; Morakis, James C.; Poland, William B., Jr.

    1992-02-01

An error control coding scheme was devised to achieve large coding gain and high reliability by using coded modulation with reduced decoding complexity. To achieve a 3 to 5 dB coding gain and moderate reliability, the decoding complexity is quite modest. In fact, to achieve a 3 dB coding gain, the decoding complexity is quite simple, no matter whether trellis coded modulation or block coded modulation is used. However, to achieve coding gains exceeding 5 dB, the decoding complexity increases drastically, and the implementation of the decoder becomes very expensive and impractical. The use of coded modulation in conjunction with concatenated (or cascaded) coding is proposed. A good short bandwidth-efficient modulation code is used as the inner code and a relatively powerful Reed-Solomon code is used as the outer code. With properly chosen inner and outer codes, a concatenated coded modulation scheme not only can achieve large coding gains and high reliability with good bandwidth efficiency but also can be practically implemented. This combination of coded modulation and concatenated coding offers a way of achieving the best of three worlds: reliability and coding gain, bandwidth efficiency, and decoding complexity.
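    The concatenation principle can be illustrated with deliberately toy components: a 3x repetition inner code and a single-parity outer code, standing in for the coded-modulation inner code and Reed-Solomon outer code of the abstract. The inner decoder cleans up scattered channel errors cheaply, and the outer code watches for residual errors the inner stage missed:

```python
def rep3_encode(bits):        # inner code: repeat each bit three times
    return [b for bit in bits for b in (bit, bit, bit)]

def rep3_decode(bits):        # inner decoder: majority vote over each triple
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

def parity_encode(bits):      # outer code: append an even-parity check bit
    return bits + [sum(bits) % 2]

def parity_ok(bits):          # outer check: detect residual inner-decoder errors
    return sum(bits) % 2 == 0

msg = [1, 0, 1, 1]
tx = rep3_encode(parity_encode(msg))
tx[2] ^= 1                    # two scattered channel errors, one per triple,
tx[7] ^= 1                    # each correctable by the inner majority vote
rx = rep3_decode(tx)
print(parity_ok(rx), rx[:-1])  # → True [1, 0, 1, 1]
```

The division of labor is the same as in the scheme proposed above: a cheap inner decoder handles most channel errors, and a stronger outer code mops up, keeping the total decoding complexity low.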

  14. Facilitating Internet-Scale Code Retrieval

    ERIC Educational Resources Information Center

    Bajracharya, Sushil Krishna

    2010-01-01

    Internet-Scale code retrieval deals with the representation, storage, and access of relevant source code from a large amount of source code available on the Internet. Internet-Scale code retrieval systems support common emerging practices among software developers related to finding and reusing source code. In this dissertation we focus on some…

  15. Bandwidth efficient coding for satellite communications

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Costello, Daniel J., Jr.; Miller, Warner H.; Morakis, James C.; Poland, William B., Jr.

    1992-01-01

An error control coding scheme was devised to achieve large coding gain and high reliability by using coded modulation with reduced decoding complexity. To achieve a 3 to 5 dB coding gain and moderate reliability, the decoding complexity is quite modest. In fact, to achieve a 3 dB coding gain, the decoding complexity is quite simple, no matter whether trellis coded modulation or block coded modulation is used. However, to achieve coding gains exceeding 5 dB, the decoding complexity increases drastically, and the implementation of the decoder becomes very expensive and impractical. The use of coded modulation in conjunction with concatenated (or cascaded) coding is proposed. A good short bandwidth-efficient modulation code is used as the inner code and a relatively powerful Reed-Solomon code is used as the outer code. With properly chosen inner and outer codes, a concatenated coded modulation scheme not only can achieve large coding gains and high reliability with good bandwidth efficiency but also can be practically implemented. This combination of coded modulation and concatenated coding offers a way of achieving the best of three worlds: reliability and coding gain, bandwidth efficiency, and decoding complexity.

  16. PARAVT: Parallel Voronoi Tessellation code

    NASA Astrophysics Data System (ADS)

    Gonzalez, Roberto E.

    2016-01-01

We present a new open source code for massively parallel computation of Voronoi tessellations (VT hereafter) in large data sets. The code is aimed at astrophysical applications, where VT densities and neighbor lists are widely used. There are several serial Voronoi tessellation codes; however, no open-source parallel implementation has been available to handle the large numbers of particles/galaxies in current N-body simulations and sky surveys. Parallelization is implemented under MPI, and the VT computation uses the Qhull library. The domain decomposition takes into account consistent boundary computation between tasks and supports periodic conditions. In addition, the code computes neighbor lists, the Voronoi density and the Voronoi cell volume for each particle, and can compute the density on a regular grid.
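    The per-particle quantities PARAVT reports (neighbor lists, Voronoi densities, cell volumes) can be imitated in miniature with a brute-force discrete Voronoi assignment: label every grid cell with its nearest seed and count cells per seed. This pure-Python sketch (no Qhull, no MPI; a conceptual stand-in, not the PARAVT algorithm) estimates cell areas in 2D:

```python
def discrete_voronoi(seeds, grid_n, box=1.0):
    """Assign each cell of a grid_n x grid_n grid to its nearest seed and
    return per-seed cell counts (a proxy for Voronoi cell areas)."""
    counts = [0] * len(seeds)
    h = box / grid_n
    for i in range(grid_n):
        for j in range(grid_n):
            x, y = (i + 0.5) * h, (j + 0.5) * h
            nearest = min(range(len(seeds)),
                          key=lambda k: (x - seeds[k][0]) ** 2 + (y - seeds[k][1]) ** 2)
            counts[nearest] += 1
    return counts

# Four symmetric seeds in a unit box: each Voronoi cell covers a quarter.
seeds = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]
areas = [c * (1.0 / 64) ** 2 for c in discrete_voronoi(seeds, 64)]
print(areas)  # → [0.25, 0.25, 0.25, 0.25]
```

The Voronoi density of a particle is then simply its mass divided by its cell area (or volume in 3D), which is why smaller cells mark denser regions.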

  17. Adaptive decoding of convolutional codes

    NASA Astrophysics Data System (ADS)

    Hueske, K.; Geldmacher, J.; Götze, J.

    2007-06-01

    Convolutional codes, which are frequently used as error correction codes in digital transmission systems, are generally decoded using the Viterbi Decoder. On the one hand the Viterbi Decoder is an optimum maximum likelihood decoder, i.e. the most probable transmitted code sequence is obtained. On the other hand the mathematical complexity of the algorithm only depends on the used code, not on the number of transmission errors. To reduce the complexity of the decoding process for good transmission conditions, an alternative syndrome based decoder is presented. The reduction of complexity is realized by two different approaches, the syndrome zero sequence deactivation and the path metric equalization. The two approaches enable an easy adaptation of the decoding complexity for different transmission conditions, which results in a trade-off between decoding complexity and error correction performance.
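    The syndrome-zero deactivation idea, skipping all decoding work whenever the syndrome is zero, is easiest to show on a block code rather than a convolutional one. This Python sketch uses the (7,4) Hamming code, whose syndrome directly reads out the error position; it illustrates the shortcut only, not the authors' syndrome decoder:

```python
# Parity-check matrix of the (7,4) Hamming code: column c is the binary of c (1..7).
H = [[(c >> r) & 1 for c in range(1, 8)] for r in range(3)]

def syndrome(word):
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

def decode(word):
    """Syndrome-zero shortcut: a zero syndrome means the decoder stays idle;
    otherwise the syndrome value is the (1-based) position of a single error."""
    s = syndrome(word)
    pos = s[0] + 2 * s[1] + 4 * s[2]
    if pos:
        word = word.copy()
        word[pos - 1] ^= 1
    return word, pos == 0      # (decoded word, decoder-was-idle flag)

clean = [0, 1, 1, 0, 0, 1, 1]            # a valid codeword: syndrome is zero
corrupt = clean.copy(); corrupt[4] ^= 1  # single channel error at position 5
print(decode(clean)[1], decode(corrupt)[0] == clean)  # → True True
```

Under good transmission conditions most received blocks are error-free, so the zero-syndrome test lets the decoder skip the expensive correction path most of the time, which is the complexity/performance trade-off the abstract describes.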

  18. Seals Flow Code Development 1993

    NASA Technical Reports Server (NTRS)

    Liang, Anita D. (Compiler); Hendricks, Robert C. (Compiler)

    1994-01-01

    Seals Workshop of 1993 code releases include SPIRALI for spiral grooved cylindrical and face seal configurations; IFACE for face seals with pockets, steps, tapers, turbulence, and cavitation; GFACE for gas face seals with 'lift pad' configurations; and SCISEAL, a CFD code for research and design of seals of cylindrical configuration. GUI (graphical user interface) and code usage was discussed with hands on usage of the codes, discussions, comparisons, and industry feedback. Other highlights for the Seals Workshop-93 include environmental and customer driven seal requirements; 'what's coming'; and brush seal developments including flow visualization, numerical analysis, bench testing, T-700 engine testing, tribological pairing and ceramic configurations, and cryogenic and hot gas facility brush seal results. Also discussed are seals for hypersonic engines and dynamic results for spiral groove and smooth annular seals.

  19. The moving mesh code SHADOWFAX

    NASA Astrophysics Data System (ADS)

    Vandenbroucke, B.; De Rijcke, S.

    2016-07-01

    We introduce the moving mesh code SHADOWFAX, which can be used to evolve a mixture of gas, subject to the laws of hydrodynamics and gravity, and any collisionless fluid only subject to gravity, such as cold dark matter or stars. The code is written in C++ and its source code is made available to the scientific community under the GNU Affero General Public Licence. We outline the algorithm and the design of our implementation, and demonstrate its validity through the results of a set of basic test problems, which are also part of the public version. We also compare SHADOWFAX with a number of other publicly available codes using different hydrodynamical integration schemes, illustrating the advantages and disadvantages of the moving mesh technique.

  20. Property Control through Bar Coding.

    ERIC Educational Resources Information Center

    Kingma, Gerben J.

    1984-01-01

    A public utility company uses laser wands to read bar-coded labels on furniture and equipment. The system allows an 80 percent savings of the time required to create reports for inventory control. (MLF)

  1. Improvements to the NASAP code

    NASA Technical Reports Server (NTRS)

    Perel, D.

    1980-01-01

    The FORTRAN code NASAP was modified and improved to provide the capability of transforming CAD-generated NASTRAN input data for DESAP II and/or DESAP I. The latter programs were developed for structural optimization.

  2. Tracking Code for Microwave Instability

    SciTech Connect

    Heifets, S.; /SLAC

    2006-09-21

    To study microwave instability, a tracking code was developed. For benchmarking, results are compared with the Oide-Yokoya results [1] for a broad-band Q = 1 impedance. The results hint at two possible mechanisms determining the threshold of instability.

  3. FLYCHK Collisional-Radiative Code

    National Institute of Standards and Technology Data Gateway

    SRD 160 FLYCHK Collisional-Radiative Code (Web, free access)   FLYCHK provides a capability to generate atomic level populations and charge state distributions for low-Z to mid-Z elements under NLTE conditions.

  4. Multiple-Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Simon, M. K.

    1990-01-01

    Theoretical gain over simple multiple-phase-shift keying is at least 2 to 3 decibels. A multiple-trellis-coded modulation scheme combined with M-ary modulation is shown theoretically to yield asymptotic gains in performance over uncoded multiple-phase-shift keying, while employing symmetric multiple-phase-shift signal constellations and avoiding catastrophic codes. It is suitable for satellite and terrestrial-mobile/satellite communications or other communications requiring burst-error correction, and extends to higher-dimensional modulations such as quadrature amplitude modulation.

  5. Training course on code implementation.

    PubMed

    Allain, A; De Arango, R

    1992-01-01

    The International Baby Food Action Network (IBFAN) is a coalition of over 40 citizen groups in 70 countries. IBFAN monitors the progress worldwide of the implementation of the International Code of Marketing of Breastmilk Substitutes. The Code is intended to regulate the advertising and promotional techniques used to sell infant formula. The 1991 IBFAN report shows that 75 countries have taken some action to implement the International Code. During 1992, the IBFAN Code Documentation Center in Malaysia conducted 2 training courses to help countries draft legislation to implement and monitor compliance with the International Code. In April, government officials from 19 Asian and African countries attended the first course in Malaysia; the second course was conducted in Spanish in Guatemala and attended by officials from 15 Latin American and Caribbean countries. The resource people included representatives from NGOs in Africa, Asia, Latin America, Europe and North America with experience in Code implementation and monitoring at the national level. The main purpose of each course was to train government officials to use the International Code as a starting point for national legislation to protect breastfeeding. Participants reviewed recent information on lactation management, the advantages of breastfeeding, current trends in breastfeeding and the marketing practices of infant formula manufacturers. The participants studied the terminology contained in the International Code and terminology used by infant formula manufacturers to include breastmilk supplements such as follow-on formulas and cereal-based baby foods. Relevant World Health Assembly resolutions such as the one adopted in 1986 on the need to ban free and low-cost supplies to hospitals were examined. The legal aspects of the current Baby Friendly Hospital Initiative (BFHI) and the progress in the 12 BFHI test countries concerning the elimination of supplies were also examined. International Labor

  6. UNIX code management and distribution

    SciTech Connect

    Hung, T.; Kunz, P.F.

    1992-09-01

    We describe a code management and distribution system based on tools freely available for the UNIX systems. At the master site, version control is managed with CVS, which is a layer on top of RCS, and distribution is done via NFS mounted file systems. At remote sites, small modifications to CVS provide for interactive transactions with the CVS system at the master site such that remote developers are true peers in the code development process.

  7. Summary of Code of Ethics.

    PubMed

    Eklund, Kerri

    2016-01-01

    The Guide to the Code of Ethics for Nurses is an excellent guideline for all nurses regardless of their area of practice. I greatly enjoyed reading the revisions in place within the 2015 edition and refreshing my nursing conscience. I plan to always keep my Guide to the Code of Ethics for Nurses near in order to keep my moral compass from veering off the path of quality care. PMID:27183735

  8. Edge equilibrium code for tokamaks

    SciTech Connect

    Li, Xujing; Drozdov, Vladimir V.

    2014-01-15

    The edge equilibrium code (EEC) described in this paper is developed for simulations of the near edge plasma using the finite element method. It solves the Grad-Shafranov equation in toroidal coordinates and uses adaptive grids aligned with magnetic field lines. Hermite finite elements are chosen for the numerical scheme. A fast Newton scheme, the same as that implemented in the equilibrium and stability code (ESC), is applied here to adjust the grids.
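    The grid adjustment above relies on a Newton iteration as a building block. A minimal sketch of that idea follows; the residual function, starting point, and tolerance are hypothetical illustrations, not taken from EEC or ESC:

```python
# Minimal Newton iteration sketch; f(x) and the tolerance are
# illustrative stand-ins, not the actual EEC residual.
def newton(f, dfdx, x0, tol=1e-12, max_iter=50):
    """Solve f(x) = 0 by Newton's method starting from x0."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / dfdx(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# Example: solve x**2 - 2 = 0; the root is sqrt(2)
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```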

  9. Computer-Access-Code Matrices

    NASA Technical Reports Server (NTRS)

    Collins, Earl R., Jr.

    1990-01-01

    Authorized users respond to changing challenges with changing passwords. Scheme for controlling access to computers defeats eavesdroppers and "hackers". Based on password system of challenge and password or sign, challenge, and countersign correlated with random alphanumeric codes in matrices of two or more dimensions. Codes stored on floppy disk or plug-in card and changed frequently. For even higher security, matrices of four or more dimensions used, just as cubes compounded into hypercubes in concurrent processing.
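    A minimal sketch of a challenge/countersign lookup against a two-dimensional matrix of random alphanumeric codes, as described above; the matrix dimensions and code length are arbitrary illustrative choices, not taken from the original scheme:

```python
import secrets
import string

# Illustrative challenge/countersign scheme backed by a matrix of
# random alphanumeric codes; sizes here are arbitrary choices.
ALPHABET = string.ascii_uppercase + string.digits

def make_matrix(rows, cols, code_len=4):
    """Generate a rows x cols matrix of random alphanumeric codes."""
    return [[''.join(secrets.choice(ALPHABET) for _ in range(code_len))
             for _ in range(cols)] for _ in range(rows)]

def challenge(matrix):
    """Pick a random cell; the (row, col) pair is the challenge."""
    r = secrets.randbelow(len(matrix))
    c = secrets.randbelow(len(matrix[0]))
    return (r, c)

def respond(matrix, chal):
    """A legitimate user answers with the code stored at that cell."""
    r, c = chal
    return matrix[r][c]

def verify(matrix, chal, answer):
    return respond(matrix, chal) == answer

matrix = make_matrix(4, 4)
chal = challenge(matrix)
```

    Higher-dimensional matrices, as mentioned in the record, would simply extend the challenge to more indices.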

  10. electromagnetics, eddy current, computer codes

    Energy Science and Technology Software Center (ESTSC)

    2002-03-12

    TORO Version 4 is designed for finite element analysis of steady, transient and time-harmonic, multi-dimensional, quasi-static problems in electromagnetics. The code allows simulation of electrostatic fields, steady current flows, magnetostatics and eddy current problems in plane or axisymmetric, two-dimensional geometries. TORO is easily coupled to heat conduction and solid mechanics codes to allow multi-physics simulations to be performed.

  11. Rotating-Pump Design Code

    NASA Technical Reports Server (NTRS)

    Walker, James F.; Chen, Shu-Cheng; Scheer, Dean D.

    2006-01-01

    Pump Design (PUMPDES) is a computer program for designing a rotating pump for liquid hydrogen, liquid oxygen, liquid nitrogen, water, methane, or ethane. Using realistic properties of these fluids provided by another program called GASPAK, this code performs a station-by-station, mean-line analysis along the pump flow path, obtaining thermodynamic properties of the pumped fluid at each station and evaluating hydraulic losses along the flow path. The variables at each station are obtained under constraints that are consistent with the underlying physical principles. The code evaluates the performance of each stage and the overall pump. In addition, by judiciously choosing the givens and the unknowns, the code can perform a geometric inverse design function: that is, it can compute a pump geometry that yields the closest approximation of a given design point. The code contains two major parts: one for an axial-rotor/inducer and one for a multistage centrifugal pump. The inducer and the centrifugal pump are functionally integrated. The code can be used in designing and/or evaluating the inducer/centrifugal-pump combination or the centrifugal pump alone. The code is written in standard Fortran 77.

  12. Spaceflight Validation of Hzetrn Code

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Shinn, J. L.; Singleterry, R. C.; Badavi, F. F.; Badhwar, G. D.; Reitz, G.; Beaujean, R.; Cucinotta, F. A.

    1999-01-01

    HZETRN is being developed as a fast deterministic radiation transport code applicable to neutrons, protons, and multiply charged ions in the space environment. It was recently applied to 50 hours of IMP8 data measured during the August 4, 1972 solar event to map the hourly exposures within the human body under several shield configurations. This calculation required only 18 hours on a VAX 4000 machine. A similar calculation using the Monte Carlo method would have required two years of dedicated computer time. The code has been benchmarked against well documented and tested Monte Carlo proton transport codes with good success. The code will allow important trade studies to be made with relative ease due to its computational speed and will be useful in assessing design alternatives in an integrated system software environment. Since there are no well tested Monte Carlo codes for HZE particles, we have been engaged in flight validation of the HZETRN results. To date we have made comparisons with TEPC, CR-39, charged-particle telescopes, and Bonner spheres. This broad range of detectors allows us to test a number of functions related to differing physical processes which contribute to the complicated radiation fields within a spacecraft or the human body, functions which can be calculated by the HZETRN code system. In the present report we review these results.

  13. Leadership Class Configuration Interaction Code - Status and Opportunities

    NASA Astrophysics Data System (ADS)

    Vary, James

    2011-10-01

    With support from SciDAC-UNEDF (www.unedf.org) nuclear theorists have developed and are continuously improving a Leadership Class Configuration Interaction Code (LCCI) for forefront nuclear structure calculations. The aim of this project is to make state-of-the-art nuclear structure tools available to the entire community of researchers including graduate students. The project includes codes such as NuShellX, MFDn and BIGSTICK that run on a range of computers from laptops to leadership-class supercomputers. Codes, scripts, test cases and documentation have been assembled, are under continuous development and are scheduled for release to the entire research community in November 2011. A covering script that accesses the appropriate code and supporting files is under development. In addition, a Data Base Management System (DBMS) that records key information from large production runs and archives results of those runs has been developed (http://nuclear.physics.iastate.edu/info/) and will be released. Following an outline of the project, the code structure, capabilities, the DBMS and current efforts, I will suggest a path forward that would benefit greatly from a significant partnership between researchers who use the codes, code developers and the National Nuclear Data efforts. This research is supported in part by DOE under grant DE-FG02-87ER40371 and grant DE-FC02-09ER41582 (SciDAC-UNEDF).

  14. The Numerical Electromagnetics Code (NEC) - A Brief History

    SciTech Connect

    Burke, G J; Miller, E K; Poggio, A J

    2004-01-20

    The Numerical Electromagnetics Code, NEC as it is commonly known, continues to be one of the more widely used antenna modeling codes in existence. With several versions in use that reflect different levels of capability and availability, there are now 450 copies of NEC4 and 250 copies of NEC3 that have been distributed by Lawrence Livermore National Laboratory to a limited class of qualified recipients, and several hundred copies of NEC2 that had a recorded distribution by LLNL. These numbers do not account for numerous copies (perhaps thousands) that were acquired through other means, capitalizing on the open source code, the absence of distribution controls prior to NEC3, and the availability of versions on the Internet. In this paper we briefly review the history of the code that is concisely displayed in Figure 1. We will show how it capitalized on the research of prominent contributors in the early days of computational electromagnetics, how a combination of events led to the tri-service-supported code development program that ultimately led to NEC, and how it evolved to the present-day product. The authors apologize that space limitations do not allow us to provide a list of references or to acknowledge the numerous contributors to the code, both of which can be found in the code documents.

  15. ALEPH2 - A general purpose Monte Carlo depletion code

    SciTech Connect

    Stankovskiy, A.; Van Den Eynde, G.; Baeten, P.; Trakas, C.; Demy, P. M.; Villatte, L.

    2012-07-01

    The Monte Carlo burn-up code ALEPH has been under development at SCK-CEN since 2004. A previous version of the code implemented the coupling between the Monte Carlo transport (any version of MCNP or MCNPX) and the 'deterministic' depletion code ORIGEN-2.2, but had important deficiencies in nuclear data treatment and limitations inherent to ORIGEN-2.2. A new version of the code, ALEPH2, has several unique features making it outstanding among other depletion codes. The most important feature is full data consistency between steady-state Monte Carlo and time-dependent depletion calculations. The latest-generation general-purpose nuclear data libraries (JEFF-3.1.1, ENDF/B-VII and JENDL-4) are fully implemented, including special-purpose activation, spontaneous fission, fission product yield and radioactive decay data. The built-in depletion algorithm eliminates the uncertainties associated with obtaining the time-dependent nuclide concentrations. A predictor-corrector mechanism, calculation of nuclear heating, calculation of decay heat, and decay neutron sources are available as well. The code has been validated against the results of the REBUS experimental program. ALEPH2 has shown better agreement with measured data than other depletion codes. (authors)
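    The predictor-corrector mechanism mentioned above can be sketched for a single nuclide: deplete with beginning-of-step reaction rates, recompute the rates at the predicted end-of-step composition, then redo the step with averaged rates. The rate function below is a hypothetical stand-in for a Monte Carlo transport solve, not anything from ALEPH2:

```python
import math

# Predictor-corrector depletion sketch for one nuclide with an
# effective removal rate that depends (weakly) on composition.
def rate(n):
    """Hypothetical composition-dependent removal rate (per unit time)."""
    return 0.1 * (1.0 + 0.05 * n)

def depletion_step(n0, dt):
    lam0 = rate(n0)                      # rates from initial composition
    n_pred = n0 * math.exp(-lam0 * dt)   # predictor: deplete with lam0
    lam1 = rate(n_pred)                  # rates at predicted composition
    lam_avg = 0.5 * (lam0 + lam1)        # corrector: average the rates
    return n0 * math.exp(-lam_avg * dt)

n_end = depletion_step(1.0, dt=2.0)
```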

  16. The design of trellis codes for fading channels

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Simon, Marvin K.

    1987-01-01

    The appropriate criterion for optimum trellis coded modulation design on the additive white Gaussian noise channel is maximization of the free Euclidean distance. When trellis coded modulation is used on a Rician fading channel with interleaving/deinterleaving, the design of the code for optimum performance is guided by other factors, in particular the length of the shortest error event path, and the product of branch distances (possibly normalized by the Euclidean distance of the path) along that path. Although maximum free distance (d sub free) is still an important consideration, it plays a less significant role the more severe the fading is on the channel. These considerations lead to the definition of a new distance measure for optimization of trellis codes transmitted over Rician fading channels. If no interleaving/deinterleaving is used, then once again the design of the trellis code is guided by maximizing d sub free. It is also shown that allowing for multiple symbols per trellis branch, i.e., multiple trellis coded modulation (MTCM), provides an additional degree of freedom for designing a code to meet the above optimization criteria on the fading channel. It is here where the MTCM technique exploits its full potential.
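    The fading-channel design quantities discussed above (length of the error event and product of branch distances) can be sketched for a single error event; the 4-PSK symbol sequences below are purely illustrative:

```python
# For two trellis paths, the effective length is the number of branches
# where they differ, and the product distance is the product of the
# nonzero squared Euclidean branch distances.
def branch_metrics(path_a, path_b):
    d2 = [abs(a - b) ** 2 for a, b in zip(path_a, path_b)]
    effective_length = sum(1 for d in d2 if d > 0)
    product_distance = 1.0
    for d in d2:
        if d > 0:
            product_distance *= d
    return effective_length, product_distance

# Two 4-PSK paths (unit-circle symbols) that differ in two branches
correct = [1, 1j, -1, -1j]
error = [1, -1j, -1, 1j]
L, dp = branch_metrics(correct, error)
```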

  17. Cleanup MAC and MBA code ATP

    SciTech Connect

    Russell, V.K.

    1994-10-17

    The K Basins Materials Accounting (MAC) and Material Balance (MBA) database system had some minor code cleanup performed. This ATP describes how the code was to be tested to verify its correctness.

  18. Entanglement-assisted codeword stabilized quantum codes

    SciTech Connect

    Shin, Jeonghwan; Heo, Jun; Brun, Todd A.

    2011-12-15

    Entangled qubits can increase the capacity of quantum error-correcting codes based on stabilizer codes. In addition, by using entanglement, quantum stabilizer codes can be constructed from classical linear codes that do not satisfy the dual-containing constraint. We show that it is possible to construct both additive and nonadditive quantum codes using the codeword stabilized quantum code framework. Nonadditive codes may offer improved performance over the more common stabilizer codes. Like other entanglement-assisted codes, the encoding procedure acts only on the qubits on Alice's side, and only these qubits are assumed to pass through the channel. However, errors in the codeword stabilized quantum code framework give rise to effective Z errors on Bob's side. We use this scheme to construct entanglement-assisted nonadditive quantum codes, in particular, ((5,16,2;1)) and ((7,4,5;4)) codes.

  19. Discrete Cosine Transform Image Coding With Sliding Block Codes

    NASA Astrophysics Data System (ADS)

    Divakaran, Ajay; Pearlman, William A.

    1989-11-01

    A transform trellis coding scheme for images is presented. A two-dimensional discrete cosine transform is applied to the image, followed by a search on a trellis-structured code. This code is a sliding block code that utilizes a constrained-size reproduction alphabet. The image is divided into blocks by the transform coding. The non-stationarity of the image is counteracted by grouping these blocks into clusters through a clustering algorithm and then encoding the clusters separately. Mandela-ordered sequences are formed from each cluster, i.e., identically indexed coefficients from each block are grouped together to form one-dimensional sequences. A separate search ensues on each of these Mandela-ordered sequences. Padding sequences are used to improve the trellis search fidelity; they absorb the error caused by the building up of the trellis to full size. The simulations were carried out on a 256x256 image ('LENA'). The results are comparable to those of existing schemes. The visual quality of the image is enhanced considerably by the padding and clustering.
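    The transform stage can be illustrated with a small orthonormal 2-D DCT-II applied separably to rows and then columns; the block size and contents below are illustrative, not taken from the paper:

```python
import math

# Separable orthonormal 2-D DCT-II sketch: transform rows, then columns.
def dct_1d(x):
    n = len(x)
    out = []
    for k in range(n):
        s = sum(x[i] * math.cos(math.pi * (2 * i + 1) * k / (2 * n))
                for i in range(n))
        scale = math.sqrt(1.0 / n) if k == 0 else math.sqrt(2.0 / n)
        out.append(scale * s)
    return out

def dct_2d(block):
    rows = [dct_1d(row) for row in block]
    cols = [dct_1d(col) for col in zip(*rows)]
    return [list(r) for r in zip(*cols)]

# A smooth 4x4 block: energy concentrates in the low-frequency corner
block = [[i + j for j in range(4)] for i in range(4)]
coeffs = dct_2d(block)
```

    Because the transform is orthonormal, total energy is preserved while most of it moves into the DC coefficient, which is what makes the subsequent clustering and trellis search on the coefficients effective.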

  20. NREL Supports Development of New National Code for Hydrogen Technologies (Fact Sheet)

    SciTech Connect

    Not Available

    2010-12-01

    On December 14, 2010, the National Fire Protection Association (NFPA) issued a new national code for hydrogen technologies - NFPA 2 Hydrogen Technologies Code - which covers critical applications and operations such as hydrogen dispensing, production, and storage. The new code consolidates existing hydrogen-related NFPA codes and standards requirements into a single document and also introduces new requirements. This consolidation makes it easier for users to prepare code-compliant permit applications and to review/approve these applications. The National Renewable Energy Laboratory helped support the development of NFPA 2 on behalf of the U.S. Department of Energy Fuel Cell Technologies Program.

  1. SCDAP/RELAP5/MOD3 code development

    SciTech Connect

    Allison, C.M.; Siefken, J.L.; Coryell, E.W.

    1992-01-01

    The SCDAP/RELAP5/MOD3 computer code is designed to describe the overall reactor coolant system (RCS) thermal-hydraulic response, core damage progression, and fission product release and transport during severe accidents. The code is being developed at the Idaho National Engineering Laboratory (INEL) under the primary sponsorship of the Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission (NRC). Code development activities are currently focused on three main areas: (a) code usability, (b) early phase melt progression model improvements, and (c) advanced reactor thermal-hydraulic model extensions. This paper describes the first two activities. A companion paper describes the advanced reactor model improvements being performed under RELAP5/MOD3 funding.

  3. Analysis of quantum error-correcting codes: Symplectic lattice codes and toric codes

    NASA Astrophysics Data System (ADS)

    Harrington, James William

    Quantum information theory is concerned with identifying how quantum mechanical resources (such as entangled quantum states) can be utilized for a number of information processing tasks, including data storage, computation, communication, and cryptography. Efficient quantum algorithms and protocols have been developed for performing some tasks (e.g. , factoring large numbers, securely communicating over a public channel, and simulating quantum mechanical systems) that appear to be very difficult with just classical resources. In addition to identifying the separation between classical and quantum computational power, much of the theoretical focus in this field over the last decade has been concerned with finding novel ways of encoding quantum information that are robust against errors, which is an important step toward building practical quantum information processing devices. In this thesis I present some results on the quantum error-correcting properties of oscillator codes (also described as symplectic lattice codes) and toric codes. Any harmonic oscillator system (such as a mode of light) can be encoded with quantum information via symplectic lattice codes that are robust against shifts in the system's continuous quantum variables. I show the existence of lattice codes whose achievable rates match the one-shot coherent information over the Gaussian quantum channel. Also, I construct a family of symplectic self-dual lattices and search for optimal encodings of quantum information distributed between several oscillators. Toric codes provide encodings of quantum information into two-dimensional spin lattices that are robust against local clusters of errors and which require only local quantum operations for error correction. Numerical simulations of this system under various error models provide a calculation of the accuracy threshold for quantum memory using toric codes, which can be related to phase transitions in certain condensed matter models. 
I also present

  4. NASA Rotor 37 CFD Code Validation: Glenn-HT Code

    NASA Technical Reports Server (NTRS)

    Ameri, Ali A.

    2010-01-01

    In order to advance the goals of NASA aeronautics programs, it is necessary to continuously evaluate and improve the computational tools used for research and design at NASA. One such code is the Glenn-HT code which is used at NASA Glenn Research Center (GRC) for turbomachinery computations. Although the code has been thoroughly validated for turbine heat transfer computations, it has not been utilized for compressors. In this work, Glenn-HT was used to compute the flow in a transonic compressor and comparisons were made to experimental data. The results presented here are in good agreement with this data. Most of the measures of performance are well within the measurement uncertainties and the exit profiles of interest agree with the experimental measurements.

  5. New multilevel codes over GF(q)

    NASA Technical Reports Server (NTRS)

    Wu, Jiantian; Costello, Daniel J., Jr.

    1992-01-01

    Set partitioning is applied to multi-dimensional signal spaces over GF(q), particularly GF^(q-1)(q) and GF^q(q), and it is shown how to construct both multi-level block codes and multi-level trellis codes over GF(q). Two classes of multi-level (n, k, d) block codes over GF(q) with block length n, number of information symbols k, and minimum distance d_min >= d are presented. These two classes of codes use Reed-Solomon codes as component codes. They can be easily decoded as block length q-1 Reed-Solomon codes or block length q or q+1 extended Reed-Solomon codes using multi-stage decoding. Many of these codes have larger distances than comparable q-ary block codes. Low-rate q-ary convolutional codes, word-error-correcting convolutional codes, and binary-to-q-ary convolutional codes can also be used to construct multi-level trellis codes over GF(q) or binary-to-q-ary trellis codes, some of which have better performance than the above block codes. All of the new codes have simple decoding algorithms based on hard-decision multi-stage decoding.

  6. Recent developments of the CEM2K and LAQGSM codes.

    SciTech Connect

    Mashnik, S. G.; Gudima, K. K.; Sierk, A. J.

    2002-01-01

    Recent developments of the Cascade-Exciton Model (CEM) of nuclear reactions are briefly described. The improved cascade-exciton model as implemented in the code CEM97 [1] differs from the CEM95 version [4] by incorporating new approximations for the elementary cross sections used in the cascade, using more precise values for nuclear masses and pairing energies, using corrected systematics for the level-density parameters, and several other refinements. Algorithms used in many subroutines have been improved, decreasing the computing time by up to a factor of 6 for heavy targets. A number of further recent improvements and changes to CEM97, motivated by new data on isotope production measured at GSI, will be presented. This leads us to CEM2k [2], a new version of the CEM code. CEM2k has a longer cascade stage, less preequilibrium emission, and evaporation from more highly excited compound nuclei compared to earlier versions. CEM2k also has other improvements and allows us to better model neutron, radionuclide, and gas production in Accelerator Transmutation of nuclear Wastes (ATW) spallation targets. The improved CEM97 code was recently used both to study fundamental nuclear physics problems, such as the role of nuclear medium effects in the transport of pi mesons in nuclei [5] and fission processes at intermediate energies [6], and was incorporated in the well-known transport code MCNPX (LANL) to solve applied problems. The CEM95 version [4] of the CEM was incorporated in the MARS (FNAL) and CALOR95 (ORNL) transport codes, and its preequilibrium part was incorporated in many other transport codes such as GEANT4 (CERN, see, e.g., [7]), HETC-3STEP (JAERI), HADRON (IHEP, Protvino), CASCADE (JINR, Dubna), SONET (RPCPI, Minsk), etc. The latest version of the CEM code, CEM2k, is still under development. The increased accuracy and predictive power of the code CEM2k will be shown by several examples. Further necessary work will be outlined.

  7. Blurring in bar code signals

    NASA Astrophysics Data System (ADS)

    Tang, Hong

    1997-10-01

    When a bar code symbol is passed over a scanner, it is struck across by a fast moving laser beam. The laser light is scattered by the bar code. The total scattered power is modulated by the reflectivity of the bars and spaces in the symbol. A fraction of the scattered light is collected and focused onto a photodetector that converts the light variation into an electronic signal. The electronic signal is then digitized for analysis by a computer. The scanning and detection process can be modeled by a convolution of the laser beam profile and the bar code reflectivity function. The switching between states in the digitized bar code signal, which represents transitions from a space to a bar or vice versa, is determined by a zero-crossing point in the second derivative of the analog signal. The laser profile acts like a smoothing function. It blurs the analog electronic signal. If the width of the laser profile is less than the minimum width of bars and spaces in the bar code reflectivity function, the transition point is not affected by the location of its neighboring edges. If the laser profile is wider than the minimum width in the bar code, the transition point can be shifted due to the locations of its neighboring edges. The behavior of the shift of transition is analyzed here for all cases in a UPC symbol. It is found that the amount of shift in the transition point is almost the same for several different cases within the depth of field of the scanner. The knowledge of the behavior of transition point shift can be used to accurately compensate printing errors in an over-printed bar code. The modulation transfer function (MTF) of bar code scanning is the Fourier transform of the marginal function of the scanning laser beam. The MTF through focus for a scanning system is presented. 
By using an aperture with central obscuration in the laser focusing system, the high frequency resolution of bar code scanning can be enhanced and the depth of field of the scanner can
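    The scanning model described above (detected signal = reflectivity convolved with the beam profile; edges at zero crossings of the second derivative) can be sketched numerically. The widths, sample counts, and threshold below are illustrative assumptions:

```python
import math

# Convolve a bar/space reflectivity step with a Gaussian beam profile
# and locate the edge at the zero crossing of the second derivative.
def gaussian(width, half_support):
    xs = range(-half_support, half_support + 1)
    g = [math.exp(-(x / width) ** 2) for x in xs]
    total = sum(g)
    return [v / total for v in g]

def convolve(signal, kernel):
    half = len(kernel) // 2
    n = len(signal)
    out = []
    for i in range(n):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = min(max(i + k - half, 0), n - 1)  # clamp at the ends
            acc += w * signal[j]
        out.append(acc)
    return out

# Reflectivity: space (1.0) then bar (0.0), transition at index 50
reflectivity = [1.0] * 50 + [0.0] * 50
blurred = convolve(reflectivity, gaussian(width=4.0, half_support=12))

# Second difference; the sign change locates the transition point
d2 = [blurred[i - 1] - 2 * blurred[i] + blurred[i + 1]
      for i in range(1, len(blurred) - 1)]
edge = None
for i in range(1, len(d2)):
    if d2[i - 1] < -1e-9 and d2[i] >= 0.0:
        edge = i + 1  # shift back to signal indexing
        break
```

    With a beam profile narrower than the minimum bar width, the detected edge stays at the true transition; widening the Gaussian in this sketch shows the edge-shift effect the record analyzes.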

  8. International assessment of PCA codes

    SciTech Connect

    Neymotin, L.; Lui, C.; Glynn, J.; Archarya, S.

    1993-11-01

    Over the past three years (1991-1993), an extensive international exercise for intercomparison of a group of six Probabilistic Consequence Assessment (PCA) codes was undertaken. The exercise was jointly sponsored by the Commission of European Communities (CEC) and OECD Nuclear Energy Agency. This exercise was a logical continuation of a similar effort undertaken by OECD/NEA/CSNI in 1979-1981. The PCA codes are currently used by different countries for predicting radiological health and economic consequences of severe accidents at nuclear power plants (and certain types of non-reactor nuclear facilities) resulting in releases of radioactive materials into the atmosphere. The codes participating in the exercise were: ARANO (Finland), CONDOR (UK), COSYMA (CEC), LENA (Sweden), MACCS (USA), and OSCAAR (Japan). In parallel with this inter-code comparison effort, two separate groups performed a similar set of calculations using two of the participating codes, MACCS and COSYMA. Results of the intercode and inter-MACCS comparisons are presented in this paper. The MACCS group included four participants: GREECE: Institute of Nuclear Technology and Radiation Protection, NCSR Demokritos; ITALY: ENEL, ENEA/DISP, and ENEA/NUC-RIN; SPAIN: Universidad Politecnica de Madrid (UPM) and Consejo de Seguridad Nuclear; USA: Brookhaven National Laboratory, US NRC and DOE.

  9. Driver Code for Adaptive Optics

    NASA Technical Reports Server (NTRS)

    Rao, Shanti

    2007-01-01

    A special-purpose computer code for a deformable-mirror adaptive-optics control system transmits pixel-registered control from (1) a personal computer running software that generates the control data to (2) a circuit board with 128 digital-to-analog converters (DACs) that generate voltages to drive the deformable-mirror actuators. This program reads control-voltage codes from a text file, then sends them, via the computer's parallel port, to a circuit board with four AD5535 (or equivalent) chips. Whereas a similar prior computer program was capable of transmitting data to only one chip at a time, this program can send data to four chips simultaneously. This program is in the form of C-language code that can be compiled and linked into an adaptive-optics software system. The program as supplied includes source code for integration into the adaptive-optics software, documentation, and a component that provides a demonstration of loading DAC codes from a text file. On a standard Windows desktop computer, the software can update 128 channels in 10 ms. On Real-Time Linux with a digital I/O card, the software can update 1024 channels (8 boards in parallel) every 8 ms.
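    The text-file loading step might look like the following sketch. The 128-channel count comes from the record above; the whitespace-separated integer file format and the 14-bit code range are assumptions for illustration, not documented properties of the original C driver:

```python
import io

# Sketch: load one integer DAC code per channel from a text stream
# and validate before sending to the hardware (not shown).
NUM_CHANNELS = 128        # from the record above
MAX_CODE = 2 ** 14 - 1    # assumed DAC resolution (illustrative)

def load_dac_codes(stream):
    codes = [int(tok) for tok in stream.read().split()]
    if len(codes) != NUM_CHANNELS:
        raise ValueError(f"expected {NUM_CHANNELS} codes, got {len(codes)}")
    for c in codes:
        if not 0 <= c <= MAX_CODE:
            raise ValueError(f"code {c} out of range")
    return codes

text = " ".join(str(i * 100) for i in range(NUM_CHANNELS))
codes = load_dac_codes(io.StringIO(text))
```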

  10. AEST: Adaptive Eigenvalue Stability Code

    NASA Astrophysics Data System (ADS)

    Zheng, L.-J.; Kotschenreuther, M.; Waelbroeck, F.; van Dam, J. W.; Berk, H.

    2002-11-01

    An adaptive eigenvalue linear stability code is developed. The aim is, on the one hand, to include non-ideal MHD effects in the global MHD stability calculation for both low- and high-n modes and, on the other hand, to resolve the numerical difficulty involving the MHD singularity on the rational surfaces at marginal stability. Our code follows parts of the philosophy of DCON by abandoning relaxation methods based on radial finite-element expansion in favor of an efficient shooting procedure with adaptive gridding. The δ W criterion is replaced by the shooting procedure and a subsequent matrix eigenvalue problem. Since the technique of expanding a general solution into a summation of independent solutions is employed, the rank of the matrices involved is only a few hundred. This makes it easier to solve the eigenvalue problem with non-ideal MHD effects, such as FLR or even full kinetic effects, as well as plasma rotation, taken into account. To include kinetic effects, the approach of solving for the distribution function as a local eigenvalue ω problem, as in the GS2 code, will be employed in the future. Comparison of the ideal MHD version of the code with DCON, PEST, and GATO will be discussed. The non-ideal MHD version of the code will be employed to study, as an application, transport barrier physics in tokamak discharges.

  11. A genetic scale of reading frame coding.

    PubMed

    Michel, Christian J

    2014-08-21

    The reading frame coding (RFC) of codes (sets) of trinucleotides is a genetic concept which has been largely ignored during the last 50 years. A first objective is the definition of a new and simple statistical parameter PrRFC for analysing the probability (efficiency) of reading frame coding (RFC) of any trinucleotide code. A second objective is to reveal different classes and subclasses of trinucleotide codes involved in reading frame coding: the circular codes of 20 trinucleotides and the bijective genetic codes of 20 trinucleotides coding the 20 amino acids. This approach allows us to propose a genetic scale of reading frame coding which ranges from 1/3 with the random codes (RFC probability identical in the three frames) to 1 with the comma-free circular codes (RFC probability maximal in the reading frame and null in the two shifted frames). This genetic scale shows, in particular, the reading frame coding probabilities of the 12,964,440 circular codes (PrRFC=83.2% on average), the 216 C(3) self-complementary circular codes (PrRFC=84.1% on average) including the code X identified in eukaryotic and prokaryotic genes (PrRFC=81.3%) and the 339,738,624 bijective genetic codes (PrRFC=61.5% on average) including the 52 codes without permuted trinucleotides (PrRFC=66.0% on average). In addition, the reading frame coding probabilities of each trinucleotide code coding an amino acid with the universal genetic code are also determined. The four amino acids Gly, Lys, Phe and Pro are coded by codes (not circular) with RFC probabilities equal to 2/3, 1/2, 1/2 and 2/3, respectively. The amino acid Leu is coded by a circular code (not comma-free) with an RFC probability equal to 18/19. The 15 other amino acids are coded by comma-free circular codes, i.e. with RFC probabilities equal to 1. The identification of coding properties in some classes of trinucleotide codes studied here may bring new insights into the origin and evolution of the genetic code. PMID:24698943
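
    The comma-free condition at the top of this scale can be checked mechanically: a trinucleotide code is comma-free if no codeword appears at a shifted offset inside any concatenation of two codewords, so the reading frame is always recoverable. A minimal sketch, not code from the paper:

```python
from itertools import product

def is_comma_free(code):
    """Return True if the trinucleotide code is comma-free: no codeword
    occurs in a shifted frame (offset 1 or 2) of any concatenation of
    two codewords."""
    for w1, w2 in product(code, repeat=2):
        pair = w1 + w2  # six letters; in-frame words sit at offsets 0 and 3
        for shift in (1, 2):
            if pair[shift:shift + 3] in code:
                return False
    return True

# {"AAA"} is not comma-free: "AAA" + "AAA" contains "AAA" in both
# shifted frames, so the reading frame cannot be recovered.
```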

  12. An algebraic hypothesis about the primeval genetic code architecture.

    PubMed

    Sánchez, Robersy; Grau, Ricardo

    2009-09-01

    A plausible architecture of an ancient genetic code is derived from an extended base-triplet vector space over the Galois field of the extended base alphabet {D,A,C,G,U}, where symbol D represents one or more hypothetical bases with unspecific pairings. We hypothesized that the high degeneration of a primeval genetic code with five bases and the gradual origin and improvement of a primeval DNA repair system could make possible the transition from ancient to modern genetic codes. Our results suggest that the Watson-Crick base pairings G≡C and A=U and the non-specific base pairing of the hypothetical ancestral base D used to define the sum and product operations are enough features to determine the coding constraints of the primeval and the modern genetic code, as well as the transition from the former to the latter. Geometrical and algebraic properties of this vector space reveal that the present codon assignment of the standard genetic code could be induced from a primeval codon assignment. In addition, the Fourier spectrum of the extended DNA genome sequences derived from the multiple sequence alignment suggests that the so-called period-3 property of the present coding DNA sequences could also exist in the ancient coding DNA sequences. The phylogenetic analyses achieved with metrics defined in the N-dimensional vector space (B(3))(N) of DNA sequences and with the new evolutionary model presented here also suggest that an ancient DNA coding sequence with five or more bases does not contradict the expected evolutionary history. PMID:19607845

  13. Some partial-unit-memory convolutional codes

    NASA Technical Reports Server (NTRS)

    Abdel-Ghaffar, K.; Mceliece, R. J.; Solomon, G.

    1991-01-01

    The results of a study on a class of error correcting codes called partial unit memory (PUM) codes are presented. This class of codes, though not entirely new, has until now remained relatively unexplored. The possibility of using the well developed theory of block codes to construct a large family of promising PUM codes is shown. The performance of several specific PUM codes is compared with that of the Voyager standard (2, 1, 6) convolutional code. It was found that these codes can outperform the Voyager code with little or no increase in decoder complexity. This suggests that there may very well be PUM codes that can be used for deep space telemetry that offer both increased performance and decreased implementational complexity over current coding systems.

  14. Turbo codes-based image transmission for channels with multiple types of distortion.

    PubMed

    Yao, Lei; Cao, Lei

    2008-11-01

    Product codes are generally used for progressive image transmission when random errors and packet loss (or burst errors) co-exist. However, the optimal rate allocation considering both component codes gives rise to high optimization complexity. In addition, the decoding performance may degrade quickly when the channel varies beyond the design point. In this paper, we propose a new unequal error protection (UEP) scheme for progressive image transmission by using rate-compatible punctured Turbo codes (RCPT) and cyclic redundancy check (CRC) codes only. By carefully interleaving each coded frame, packet loss can be converted into randomly punctured bits in a Turbo code. Therefore, error control in noisy channels with different types of errors is equivalent to dealing with random bit errors only, with reduced Turbo code rates. A genetic algorithm-based method is presented to further reduce the optimization complexity. The proposed method not only gives a better performance than product codes in given channel conditions but is also more robust to channel variation. Finally, to break down the error floor of Turbo decoding, we further extend the above RCPT/CRC protection to a product code scheme by adding a Reed-Solomon (RS) code across the frames. The associated rate allocation is discussed and further improvement is demonstrated. PMID:18854248
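
    The central trick, turning a lost packet into scattered punctured bits, can be sketched with a simple column interleaver. The frame length and packet count below are arbitrary illustrative choices, not the parameters of the proposed scheme:

```python
def packetize(bits, n_packets):
    """Column-interleave a coded frame: packet j carries bits
    j, j + n_packets, j + 2*n_packets, ..."""
    return [bits[j::n_packets] for j in range(n_packets)]

def erased_positions(n_bits, n_packets, lost):
    """Bit positions erased in the deinterleaved frame when packet
    `lost` is dropped: isolated, evenly spaced bits, not a burst."""
    return list(range(lost, n_bits, n_packets))

# Losing packet 1 of 4 in a 20-bit frame erases positions
# 1, 5, 9, 13, 17 -- single scattered bits a Turbo decoder can
# treat as punctured, rather than a 5-bit burst.
positions = erased_positions(20, 4, 1)
```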

  15. An Euler code calculation of blade-vortex interaction noise

    NASA Technical Reports Server (NTRS)

    Hardin, J. C.; Lamkin, S. L.

    1987-01-01

    An Euler code has been developed for calculation of noise radiation due to the interaction of a distributed vortex with a Joukowski airfoil. The time-dependent incompressible flow field is first determined and then integrated to yield the resulting sound production through use of the elegant low-frequency Green's function approach. This code has several interesting numerical features involved in the vortex motion and in continuous satisfaction of the Kutta condition. In addition, it removes the limitations on Reynolds number and is much more efficient than an earlier Navier-Stokes code. Results indicate that the noise production is due to the deceleration and subsequent acceleration of the vortex as it approaches and passes the airfoil. Predicted acoustic levels and frequencies agree with measured data although a precise comparison would require the strength, size, and position of the incoming vortex to be known.

  16. FLOWTRAN-TF code description

    SciTech Connect

    Flach, G.P.

    1991-09-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss of Coolant Accident (LOCA). This report provides a brief description of the physical models in the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit. This document is viewed as an interim report and should ultimately be superseded by a comprehensive user/programmer manual. In general, only high level discussions of governing equations and constitutive laws are presented. Numerical implementation of these models, code architecture and user information are not generally covered. A companion document describing code benchmarking is available.

  17. FLOWTRAN-TF code benchmarking

    SciTech Connect

    Flach, G.P.

    1990-12-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss of Coolant Accident (LOCA). A description of the code is given by Flach et al. (1990). This report provides benchmarking results for the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit (Smith et al., 1990a; 1990b). Individual constitutive relations are benchmarked in Sections 2 through 5, while in Sections 6 and 7 integral code benchmarking results are presented. An overall assessment of FLOWTRAN-TF for its intended use in computing the ECS power limit completes the document.

  18. Code-multiplexed optical scanner

    NASA Astrophysics Data System (ADS)

    Riza, Nabeel A.; Arain, Muzammil A.

    2003-03-01

    A three-dimensional (3-D) optical-scanning technique is proposed based on spatial optical phase code activation on an input beam. This code-multiplexed optical scanner (C-MOS) relies on holographically stored 3-D beam-forming information. In a proof-of-concept C-MOS experiment using a photorefractive crystal as the holographic medium, eight beams are generated, representing a basic 3-D voxel element addressed via a binary-code matrix of the Hadamard type. The experiment demonstrates the C-MOS features of no moving parts, beam-forming flexibility, and large centimeter-size apertures. A novel application of the C-MOS as an optical security lock is highlighted.
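
    The binary Hadamard-type code matrix mentioned here can be illustrated with the standard Sylvester construction, whose rows form mutually orthogonal codes. The 8-by-8 size below is chosen only to match the eight beams reported; it is a sketch, not the experiment's actual matrix:

```python
def hadamard(n):
    """Sylvester construction of an n-by-n Hadamard matrix (n a power
    of two) with +1/-1 entries; distinct rows are orthogonal."""
    if n == 1:
        return [[1]]
    half = hadamard(n // 2)
    return ([row + row for row in half] +
            [row + [-x for x in row] for row in half])

H8 = hadamard(8)  # eight mutually orthogonal 8-element codes
```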

  19. Verification of FANTASTIC integrated code

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1987-01-01

    FANTASTIC is an acronym for Failure Analysis Nonlinear Thermal and Structural Integrated Code. This program was developed by Failure Analysis Associates, Palo Alto, Calif., for MSFC to improve the accuracy of solid rocket motor nozzle analysis. FANTASTIC has three modules: FACT - thermochemical analysis; FAHT - heat transfer analysis; and FAST - structural analysis. All modules have keywords for data input. Work is in progress for the verification of the FAHT module, which is done by using data for various problems with known solutions as inputs to the FAHT module. The information obtained is used to identify problem areas of the code and is passed on to the developer for debugging purposes. Failure Analysis Associates has revised the first version of the FANTASTIC code, and a new, improved version has been released to the Thermal Systems Branch.

  20. Pulse code modulated signal synchronizer

    NASA Technical Reports Server (NTRS)

    Kobayashi, H. S. (Inventor)

    1974-01-01

    A bit synchronizer for split-phase PCM transmission is reported that includes three loop circuits which receive incoming phase-coded PCM signals. In the first loop, called the Q-loop, a generated phase-coded PCM signal is multiplied with the incoming signals, and the frequency and phase of the generated signal are nulled to those of the incoming subcarrier signal. In the second loop, called the B-loop, a circuit multiplies a generated signal with incoming signals to null the phase of the generated signal in a bit-phase-locked relationship to the incoming signal. In a third loop, called the I-loop, a phase-coded PCM signal is multiplied with the incoming signals to decode the bit information from the PCM signal. A counter provides timing of the generated signals and timing of sample intervals for each bit period.

  1. A coded tracking telemetry system

    USGS Publications Warehouse

    Howey, P.W.; Seegar, W.S.; Fuller, M.R.; Titus, K.

    1989-01-01

    We describe the general characteristics of an automated radio telemetry system designed to operate for prolonged periods on a single frequency. Each transmitter sends a unique coded signal to a receiving system that decodes and records only the appropriate, pre-programmed codes. A record of the time of each reception is stored on diskettes in a micro-computer. This system enables continuous monitoring of infrequent signals (e.g. one per minute or one per hour), thus extending operating life or allowing size reduction of the transmitter, compared to conventional wildlife telemetry. Furthermore, when using unique codes transmitted on a single frequency, biologists can monitor many individuals without exceeding the radio frequency allocations for wildlife.

  2. FLOWTRAN-TF code description

    SciTech Connect

    Flach, G.P.

    1990-12-01

    FLOWTRAN-TF is a two-component (air-water), two-phase thermal-hydraulics code designed for performing accident analyses of SRS reactor fuel assemblies during the Emergency Cooling System (ECS) phase of a Double Ended Guillotine Break (DEGB) Loss of Coolant Accident (LOCA). This report provides a brief description of the physical models in the version of FLOWTRAN-TF used to compute the Recommended K-Reactor Restart ECS Power Limit. This document is viewed as an interim report and should ultimately be superseded by a comprehensive user/programmer manual. In general, only high level discussions of governing equations and constitutive laws are presented. Numerical implementation of these models, code architecture and user information are not generally covered. A companion document describing code benchmarking is available.

  3. ASME Code Efforts Supporting HTGRs

    SciTech Connect

    D.K. Morton

    2010-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  4. ASME Code Efforts Supporting HTGRs

    SciTech Connect

    D.K. Morton

    2011-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  5. ASME Code Efforts Supporting HTGRs

    SciTech Connect

    D.K. Morton

    2012-09-01

    In 1999, an international collaborative initiative for the development of advanced (Generation IV) reactors was started. The idea behind this effort was to bring nuclear energy closer to the needs of sustainability, to increase proliferation resistance, and to support concepts able to produce energy (both electricity and process heat) at competitive costs. The U.S. Department of Energy has supported this effort by pursuing the development of the Next Generation Nuclear Plant, a high temperature gas-cooled reactor. This support has included research and development of pertinent data, initial regulatory discussions, and engineering support of various codes and standards development. This report discusses the various applicable American Society of Mechanical Engineers (ASME) codes and standards that are being developed to support these high temperature gas-cooled reactors during construction and operation. ASME is aggressively pursuing these codes and standards to support an international effort to build the next generation of advanced reactors so that all can benefit.

  6. Exceptional error minimization in putative primordial genetic codes

    PubMed Central

    2009-01-01

    Background The standard genetic code is redundant and has a highly non-random structure. Codons for the same amino acids typically differ only by the nucleotide in the third position, whereas similar amino acids are encoded, mostly, by codon series that differ by a single base substitution in the third or the first position. As a result, the code is highly albeit not optimally robust to errors of translation, a property that has been interpreted either as a product of selection directed at the minimization of errors or as a non-adaptive by-product of evolution of the code driven by other forces. Results We investigated the error-minimization properties of putative primordial codes that consisted of 16 supercodons, with the third base being completely redundant, using a previously derived cost function and the error minimization percentage as the measure of a code's robustness to mistranslation. It is shown that, when the 16-supercodon table is populated with 10 putative primordial amino acids, inferred from the results of abiotic synthesis experiments and other evidence independent of the code's evolution, and with minimal assumptions used to assign the remaining supercodons, the resulting 2-letter codes are nearly optimal in terms of the error minimization level. Conclusion The results of the computational experiments with putative primordial genetic codes that contained only two meaningful letters in all codons and encoded 10 to 16 amino acids indicate that such codes are likely to have been nearly optimal with respect to the minimization of translation errors. This near-optimality could be the outcome of extensive early selection during the co-evolution of the code with the primordial, error-prone translation system, or a result of a unique, accidental event. Under this hypothesis, the subsequent expansion of the code resulted in a decrease of the error minimization level that became sustainable owing to the evolution of a high-fidelity translation system

  7. Effects of bar coding on a pharmacy stock replenishment system.

    PubMed

    Chester, M I; Zilz, D A

    1989-07-01

    A bar-code stock ordering system installed in the ambulatory-care pharmacy and sterile products area of a hospital pharmacy was compared with a manual paper system to quantify overall time demands and determine the error rate associated with each system. The bar-code system was implemented in the ambulatory-care pharmacy in November 1987 and in the sterile products area in January 1988. It consists of a Trakker 9440 transaction manager with a digital scanner; labels are printed with a dot matrix printer. Electronic scanning of bar-code labels and entry of the amount required using the key-pad on the transaction manager replaced use of a preprinted form for ordering items. With the bar-code system, ordering information is transferred electronically via cable to the pharmacy inventory computer; with the manual system, this information was input by a stockroom technician. To compare the systems, the work of technicians in the ambulatory-care pharmacy and sterile products area was evaluated before and after implementation of the bar-code system. The time requirements for information gathering and data transfer were recorded by direct observation; the prevalence of errors under each system was determined by comparing unprocessed ordering information with the corresponding computer-generated "pick lists" (itemized lists including the amount of each product ordered). Time consumed in extra trips to the stockroom to replace out-of-stock items was self-reported. Significantly less time was required to order stock and transfer data to the pharmacy inventory computer with the bar-code system than with the manual system.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:2757044

  8. An Analysis of Syndrome Coding

    NASA Astrophysics Data System (ADS)

    Amiruzzaman, Md; Abdullah-Al-Wadud, M.; Chung, Yoojin

    In this paper, a detailed analysis of BCH syndrome coding for covert-channel data-hiding methods is presented. The technique examined is a syndrome-coding algorithm with a coset-based approach. The results show that the method offers greater flexibility in choosing cosets and introduces less modification distortion during data hiding. The method is presented in clear mathematical terms; because it rests on algebraic equations, the analysis also shows that it computes quickly and finds exact roots identifying the bit positions to modify.
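
    The paper's BCH construction is not reproduced here, but the basic mechanics of syndrome coding can be sketched with the classic Hamming (7,4) matrix-embedding scheme, a generic illustration rather than the authors' exact method: three message bits are hidden in seven cover bits by flipping at most one bit so that the cover's syndrome equals the message.

```python
def syndrome(bits):
    """Syndrome of 7 bits under the Hamming (7,4) parity-check matrix
    whose i-th column is the binary representation of i + 1."""
    s = 0
    for i, b in enumerate(bits):
        if b:
            s ^= i + 1
    return s

def embed(cover, message):
    """Hide a 3-bit message (0..7) by flipping at most one of the 7
    cover bits so that the stego syndrome equals the message."""
    stego = list(cover)
    flip = syndrome(cover) ^ message
    if flip:                 # flip == 0 means the cover already matches
        stego[flip - 1] ^= 1
    return stego

cover = [1, 0, 1, 1, 0, 0, 1]
stego = embed(cover, 0b101)  # recipient extracts with syndrome(stego)
```

    The recipient needs only the parity-check matrix, and the embedding distortion is at most one bit per seven, which is the appeal of syndrome-based hiding in general.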

  9. COLAcode: COmoving Lagrangian Acceleration code

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin V.

    2016-02-01

    COLAcode is a serial particle-mesh-based N-body code illustrating the COLA (COmoving Lagrangian Acceleration) method; it solves for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). It differs from standard N-body codes by trading accuracy at small scales to gain computational speed without sacrificing accuracy at large scales. This is useful for generating the large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing; such catalogs are needed to perform detailed error analysis for ongoing and future surveys of LSS.

  10. Radio Losses for Concatenated Codes

    NASA Astrophysics Data System (ADS)

    Shambayati, S.

    2002-07-01

    The advent of higher-powered spacecraft amplifiers and better ground receivers capable of tracking spacecraft carrier signals with narrower loop bandwidths requires better understanding of the carrier-tracking loss (radio loss) mechanism of the concatenated codes used for deep-space missions. In this article, we present results of simulations performed for a (7,1/2), Reed-Solomon (255,223), interleaver depth-5 concatenated code in order to shed some light on this issue. Through these simulations, we obtained the performance of this code over an additive white Gaussian noise (AWGN) channel (the baseline performance) in terms of both its frame-error rate (FER) and its bit-error rate at the output of the Reed-Solomon decoder (RS-BER). After obtaining these results, we curve-fitted the baseline performance curves for FER and RS-BER and calculated the high-rate radio losses for this code for an FER of 10^(-4) and its corresponding baseline RS-BER of 2.1 x 10^(-6) for a carrier loop signal-to-noise ratio (SNR) of 14.8 dB. This calculation revealed that even though over the AWGN channel the FER value and the RS-BER value correspond to each other (i.e., these values are obtained by the same bit SNR value), the RS-BER value has higher high-rate losses than does the FER value. Furthermore, this calculation contradicted the previous assumption that at high data rates concatenated codes have the same radio losses as their constituent convolutional codes. Our results showed much higher losses for the FER and the RS-BER (by as much as 2 dB) than for the corresponding baseline BER of the convolutional code. Further simulations were performed to investigate the effects of changes in the data rate on the code's radio losses. It was observed that as the data rate increased the radio losses for both the FER and the RS-BER approached their respective calculated high-rate values.
Furthermore, these simulations showed that a simple two-parameter function could model the increase in the

  11. Sensor Authentication: Embedded Processor Code

    SciTech Connect

    Svoboda, John

    2012-09-25

    Described is the C code running on the embedded Microchip 32-bit PIC32MX575F256H located on the INL-developed noise analysis circuit board. The code performs the following functions: controls the noise analysis circuit board preamplifier voltage gains of 1, 10, 100, 000; initializes the analog-to-digital conversion hardware, input channel selection, Fast Fourier Transform (FFT) function, USB communications interface, and internal memory allocations; initiates high-resolution 4096-point 200 kHz data acquisition; computes the complex 2048-point FFT and FFT magnitude; services the host command set; transfers raw data to the host; transfers the FFT result to the host; and performs communication error checking.

  12. Signal Processing Expert Code (SPEC)

    SciTech Connect

    Ames, H.S.

    1985-12-01

    The purpose of this paper is to describe a prototype expert system called SPEC, which was developed to demonstrate the utility of providing an intelligent interface for users of SIG, a general-purpose signal processing code. The expert system is written in NIL, runs on a VAX 11/750, and consists of a backward-chaining inference engine and an English-like parser. The inference engine uses knowledge encoded as rules about the formats of SIG commands and about how to perform frequency analyses using SIG. The system demonstrated that expert systems can be used to control existing codes.

  13. Hybrid codes: Methods and applications

    SciTech Connect

    Winske, D.; Omidi, N.

    1991-01-01

    In this chapter we discuss "hybrid" algorithms used in the study of low-frequency electromagnetic phenomena, where one or more ion species are treated kinetically via standard PIC methods used in particle codes and the electrons are treated as a single charge-neutralizing massless fluid. Other types of hybrid models are possible, as discussed in Winske and Quest, but hybrid codes with particle ions and massless fluid electrons have become the most common for simulating space plasma physics phenomena in the last decade, as we discuss in this paper.

  14. The Impact of Codes of Conduct on Stakeholders

    ERIC Educational Resources Information Center

    Newman, Wayne R.

    2015-01-01

    The purpose of this study was to determine how an urban school district's code of conduct aligned with actual school/class behaviors, and how stakeholders perceived the ability of this document to achieve its number one goal: safe and productive learning environments. Twenty participants including students, teachers, parents, and administrators…

  15. 21 CFR 206.10 - Code imprint required.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL IMPRINTING OF SOLID ORAL DOSAGE FORM DRUG PRODUCTS FOR HUMAN USE § 206.10 Code imprint required... delivered for introduction into interstate commerce unless it is clearly marked or imprinted with a...

  16. 21 CFR 206.10 - Code imprint required.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL IMPRINTING OF SOLID ORAL DOSAGE FORM DRUG PRODUCTS FOR HUMAN USE § 206.10 Code imprint required... delivered for introduction into interstate commerce unless it is clearly marked or imprinted with a...

  17. 21 CFR 206.10 - Code imprint required.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL IMPRINTING OF SOLID ORAL DOSAGE FORM DRUG PRODUCTS FOR HUMAN USE § 206.10 Code imprint required... delivered for introduction into interstate commerce unless it is clearly marked or imprinted with a...

  18. Summary of papers on current and anticipated uses of thermal-hydraulic codes

    SciTech Connect

    Caruso, R.

    1997-07-01

    The author reviews a range of recent papers which discuss possible uses and future development needs for thermal/hydraulic codes in the nuclear industry. From this review, eight common recommendations are extracted. They are: improve the user interface so that more people can use the code, so that models are easier and less expensive to prepare and maintain, and so that the results are scrutable; design the code so that it can easily be coupled to other codes, such as core physics, containment, fission product behaviour during severe accidents; improve the numerical methods to make the code more robust and especially faster running, particularly for low pressure transients; ensure that future code development includes assessment of code uncertainties as integral part of code verification and validation; provide extensive user guidelines or structure the code so that the "user effect" is minimized; include the capability to model multiple fluids (gas and liquid phase); design the code in a modular fashion so that new models can be added easily; provide the ability to include detailed or simplified component models; build on work previously done with other codes (RETRAN, RELAP, TRAC, CATHARE) and other code validation efforts (CSAU, CSNI SET and IET matrices).

  19. Selected commensal-related bacteria and Toll-like receptor 3 agonist combinatorial codes synergistically induce interleukin-12 production by dendritic cells to trigger a T helper type 1 polarizing programme

    PubMed Central

    Baba, Nobuyasu; Samson, Sandrine; Bourdet-Sicard, Raphaëlle; Rubio, Manuel; Sarfati, Marika

    2009-01-01

    Enteric infections remain a major health problem causing millions of deaths in developing countries. The interplay among the host intestinal epithelium, the mucosa-associated immune system and microbiota performs an essential role in gut homeostasis and protection against infectious diseases. Dendritic cells (DCs) play a key role in orchestrating protective immunity and tolerance in the gut. The mechanisms by which DCs adapt their responses and discriminate between virulent microbes and trillions of innocuous bacteria remain ill-defined. Here we investigated the effect of cross-talk between commensal-related bacteria (CB) and Toll-like receptor (TLR) agonists on DC activation and the outcome of the in vitro T helper response. Human monocyte-derived DCs were exposed to eight different Gram-positive or Gram-negative CB strains prior to activation with five different TLR agonists. The key polarizing cytokines interleukin (IL)-12p70, IL-10, IL-1β and IL-6 were quantified and the fate of naïve T-cell differentiation was evaluated. We identified a unique combination of Lactobacillus casei and TLR3 signals that acted in synergy to selectively increase IL-12p70 secretion. Exposure to poly(I:C) converted L. casei-treated DCs into potent promoters of T helper type 1 (Th1) responses. We propose that DCs can integrate harmless and dangerous non-self signals delivered by viral products, to mount robust Th1 responses. Thus, in vivo DC targeting with selective probiotics may improve strategies for the management of enteric diseases. PMID:19740313

  20. Soft decision decoding of block codes

    NASA Technical Reports Server (NTRS)

    Baumert, L. D.; Mceliece, R. J.

    1978-01-01

The performance of certain block codes on a Gaussian channel is evaluated. The BCH codes are markedly superior to the convolutional codes currently used for deep space missions. A general decoding technique of Solomon is used to derive the results, which provide a basis for a simple, almost optimum procedure for decoding these codes.

  1. On the Grammar of Code-Switching.

    ERIC Educational Resources Information Center

    Bhatt, Rakesh M.

    1996-01-01

    Explores an Optimality-Theoretic approach to account for observed cross-linguistic patterns of code switching that assumes that code switching strives for well-formedness. Optimization of well-formedness in code switching is shown to follow from (violable) ranked constraints. An argument is advanced that code-switching patterns emerge from…

  2. An Interactive Concatenated Turbo Coding System

    NASA Technical Reports Server (NTRS)

    Liu, Ye; Tang, Heng; Lin, Shu; Fossorier, Marc

    1999-01-01

This paper presents a concatenated turbo coding system in which a Reed-Solomon outer code is concatenated with a binary turbo inner code. In the proposed system, the outer code decoder and the inner turbo code decoder interact to achieve both good bit-error and frame-error performance. The outer code decoder helps the inner turbo code decoder to terminate its decoding iterations while the inner turbo code decoder provides soft-output information to the outer code decoder to carry out reliability-based soft-decision decoding. In the case that the outer code decoding fails, the outer code decoder instructs the inner code decoder to continue its decoding iterations until the outer code decoding is successful or a preset maximum number of decoding iterations is reached. This interaction between the outer and inner code decoders reduces decoding delay. Also presented in the paper are an effective criterion for stopping the iteration process of the inner code decoder and a new reliability-based decoding algorithm for nonbinary codes.
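The outer/inner decoder interaction described in this abstract can be sketched as a control loop. The decoder functions below are hypothetical stubs (a real system would run an actual turbo iteration and a Reed-Solomon decoder); only the stopping logic follows the scheme described in the paper.

```python
# Control-flow sketch of outer-decoder-driven early stopping.
# The decoders are hypothetical stubs, not the paper's algorithms.

def inner_turbo_iteration(soft_llrs):
    """Stub: one inner (turbo) iteration refines the soft bit estimates."""
    return [llr * 1.5 for llr in soft_llrs]   # pretend reliabilities grow

def outer_rs_decode(soft_llrs, threshold=4.0):
    """Stub: outer decode 'succeeds' once every bit is reliable enough."""
    hard = [1 if llr > 0 else 0 for llr in soft_llrs]
    ok = all(abs(llr) >= threshold for llr in soft_llrs)
    return ok, hard

def decode_concatenated(soft_llrs, max_iters=10):
    """Iterate the inner decoder until the outer decode succeeds
    or a preset maximum number of iterations is reached."""
    for iteration in range(1, max_iters + 1):
        soft_llrs = inner_turbo_iteration(soft_llrs)
        ok, hard = outer_rs_decode(soft_llrs)
        if ok:                      # outer success terminates inner iterations
            return hard, iteration
    return hard, max_iters          # give up at the iteration cap
```

Frames with reliable channel samples stop after few inner iterations, which is the source of the decoding-delay reduction the abstract mentions.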

  3. Trace-shortened Reed-Solomon codes

    NASA Technical Reports Server (NTRS)

    Mceliece, R. J.; Solomon, G.

    1994-01-01

Reed-Solomon (RS) codes have been part of standard NASA telecommunications systems for many years. RS codes are character-oriented error-correcting codes, and their principal use in space applications has been as outer codes in concatenated coding systems. However, for a given character size, say m bits, RS codes are limited to a length of, at most, 2^m. It is known in theory that longer character-oriented codes would be superior to RS codes in concatenation applications, but until recently no practical class of 'long' character-oriented codes had been discovered. In 1992, however, Solomon discovered an extensive class of such codes, which are now called trace-shortened Reed-Solomon (TSRS) codes. In this article, we will continue the study of TSRS codes. Our main result is a formula for the dimension of any TSRS code, as a function of its error-correcting power. Using this formula, we will give several examples of TSRS codes, some of which look very promising as candidate outer codes in high-performance coded telecommunications systems.

  4. 7 CFR 201.28 - Code designation.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Code designation. 201.28 Section 201.28 Agriculture... REGULATIONS Labeling Vegetable Seeds § 201.28 Code designation. The code designation used in lieu of the full... as may be designated by him for the purpose. When used, the code designation shall appear on...

  5. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 4 2010-07-01 2010-07-01 true Offense codes. 635.19 Section 635.19 National... INVESTIGATIONS LAW ENFORCEMENT REPORTING Offense Reporting § 635.19 Offense codes. (a) The offense code describes, as nearly as possible, the complaint or offense by using an alphanumeric code. Appendix C of AR...

  6. Constructions of Asymmetric Quantum Alternant Codes

    NASA Astrophysics Data System (ADS)

    Fan, Jihao; Chen, Hanwu; Xu, Juan

    2016-01-01

Asymmetric quantum error-correcting codes (AQCs) have been proposed to deal with the significant asymmetry in many quantum channels, and they may offer more flexibility than general quantum error-correcting codes (QECs). In this paper, we construct AQCs based on Alternant codes. Firstly, we propose a new subclass of Alternant codes and combine them with BCH codes to construct AQCs. Then we construct AQCs based on a series of nested pairs of subclasses of Alternant codes, such as nested Goppa codes. As an illustrative example, we get three [[55, 6, 19/4

  7. Convolutional coding combined with continuous phase modulation

    NASA Technical Reports Server (NTRS)

    Pizzi, S. V.; Wilson, S. G.

    1985-01-01

    Background theory and specific coding designs for combined coding/modulation schemes utilizing convolutional codes and continuous-phase modulation (CPM) are presented. In this paper the case of r = 1/2 coding onto a 4-ary CPM is emphasized, with short-constraint length codes presented for continuous-phase FSK, double-raised-cosine, and triple-raised-cosine modulation. Coding buys several decibels of coding gain over the Gaussian channel, with an attendant increase of bandwidth. Performance comparisons in the power-bandwidth tradeoff with other approaches are made.
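The rate-1/2, short-constraint-length codes discussed in this abstract can be illustrated with a minimal encoder. The (7, 5) octal generator pair below is a standard textbook choice, not necessarily one of the designs tabulated in the paper, and the CPM modulation stage is omitted.

```python
def conv_encode(bits, g1=0b111, g2=0b101, k=3):
    """Rate-1/2 convolutional encoder with constraint length k = 3 and
    generators 7 and 5 (octal). Each input bit produces two output bits,
    each the parity of the bits selected by one generator's taps."""
    state = 0
    out = []
    for b in bits + [0] * (k - 1):               # flush with zero tail bits
        state = ((state << 1) | b) & ((1 << k) - 1)
        out.append(bin(state & g1).count("1") % 2)   # parity of taps g1
        out.append(bin(state & g2).count("1") % 2)   # parity of taps g2
    return out
```

A single 1 followed by the flush bits yields the encoder's impulse response, 11 10 11 for this generator pair.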

  8. Entanglement-assisted quantum convolutional coding

    SciTech Connect

    Wilde, Mark M.; Brun, Todd A.

    2010-04-15

    We show how to protect a stream of quantum information from decoherence induced by a noisy quantum communication channel. We exploit preshared entanglement and a convolutional coding structure to develop a theory of entanglement-assisted quantum convolutional coding. Our construction produces a Calderbank-Shor-Steane (CSS) entanglement-assisted quantum convolutional code from two arbitrary classical binary convolutional codes. The rate and error-correcting properties of the classical convolutional codes directly determine the corresponding properties of the resulting entanglement-assisted quantum convolutional code. We explain how to encode our CSS entanglement-assisted quantum convolutional codes starting from a stream of information qubits, ancilla qubits, and shared entangled bits.

  9. The analog linear interpolation approach for Monte Carlo simulation of PGNAA: The CEARPGA code

    NASA Astrophysics Data System (ADS)

    Zhang, Wenchao; Gardner, Robin P.

    2004-01-01

The analog linear interpolation approach (ALI) has been developed and implemented to eliminate the big-weight problem in the Monte Carlo simulation code CEARPGA. The CEARPGA code was previously developed to generate elemental library spectra for use with the Monte Carlo library least-squares (MCLLS) approach in prompt gamma-ray neutron activation analysis (PGNAA). In addition, some other improvements to this code have been introduced, including (1) adopting the latest photon cross-section data, (2) using an improved detector response function, (3) adding the neutron activation backgrounds, (4) generating the individual natural background libraries, (5) adding the tracking of annihilation photons from pair production interactions outside of the detector and (6) adopting a general geometry package. The simulated results from the new CEARPGA code are compared with those calculated from the previous CEARPGA code and the MCNP code, and with experimental data. The new CEARPGA code is found to give the best results.

  10. Accumulate-Repeat-Accumulate-Accumulate Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush; Dolinar, Samuel; Thorpe, Jeremy

    2007-01-01

Accumulate-repeat-accumulate-accumulate (ARAA) codes have been proposed, inspired by the recently proposed accumulate-repeat-accumulate (ARA) codes. These are error-correcting codes suitable for use in a variety of wireless data-communication systems that include noisy channels. ARAA codes can be regarded as serial turbo-like codes or as a subclass of low-density parity-check (LDPC) codes, and, like ARA codes, they have projected graph or protograph representations; these characteristics make it possible to design high-speed iterative decoders that utilize belief-propagation algorithms. The objective in proposing ARAA codes as a subclass of ARA codes was to enhance the error-floor performance of ARA codes while maintaining simple encoding structures and low maximum variable node degree.
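The component chain named in the title can be sketched with two primitives: an accumulator (a rate-1 running XOR, the 1/(1+D) convolutional code) and a repeater. The sketch below shows only the serial chain; real ARAA encoders also interleave between stages, and the repetition factor q = 3 is an assumed illustrative value.

```python
def accumulate(bits):
    """Accumulator (rate-1, 1/(1+D)): running XOR of the input stream."""
    acc, out = 0, []
    for b in bits:
        acc ^= b
        out.append(acc)
    return out

def repeat(bits, q=3):
    """Repetition stage: each bit is sent q times (q = 3 assumed here)."""
    return [b for b in bits for _ in range(q)]

def araa_encode(bits, q=3):
    """Toy serial chain: accumulate -> repeat -> accumulate -> accumulate.
    Interleavers between the stages are omitted to keep the sketch minimal."""
    return accumulate(accumulate(repeat(accumulate(bits), q)))
```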

  11. Multichannel error correction code decoder

    NASA Technical Reports Server (NTRS)

    Wagner, Paul K.; Ivancic, William D.

    1993-01-01

    A brief overview of a processing satellite for a mesh very-small-aperture (VSAT) communications network is provided. The multichannel error correction code (ECC) decoder system, the uplink signal generation and link simulation equipment, and the time-shared decoder are described. The testing is discussed. Applications of the time-shared decoder are recommended.

  12. Corrections to the Vienna Code

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Since the publication of the Vienna Code, several errors have been noticed. Most are minor punctuation or cross-referencing errors, or, in the Appendices, inconsistencies in abbreviation, but there was one important omission from Art. 37, the misspelling of two specific epithets and the transpositio...

  13. GOES satellite time code dissemination

    NASA Technical Reports Server (NTRS)

    Beehler, R. E.

    1983-01-01

    The GOES time code system, the performance achieved to date, and some potential improvements in the future are discussed. The disseminated time code is originated from a triply redundant set of atomic standards, time code generators and related equipment maintained by NBS at NOAA's Wallops Island, VA satellite control facility. It is relayed by two GOES satellites located at 75 W and 135 W longitude on a continuous basis to users within North and South America (with overlapping coverage) and well out into the Atlantic and Pacific ocean areas. Downlink frequencies are near 468 MHz. The signals from both satellites are monitored and controlled from the NBS labs at Boulder, CO with additional monitoring input from geographically separated receivers in Washington, D.C. and Hawaii. Performance experience with the received time codes for periods ranging from several years to one day is discussed. Results are also presented for simultaneous, common-view reception by co-located receivers and by receivers separated by several thousand kilometers.

  14. Testing of the CONTAIN code

    SciTech Connect

    Sciacca, F.W.; Bergeron, K.D.; Murata, K.K.; Rexroth, P.E.

    1984-04-01

    CONTAIN is a large computer code intended for use in the analysis of severe nuclear power plant accidents. Many tests have been conducted on CONTAIN to assess its adequacy for dealing with nuclear-accident problems. This report describes the CONTAIN test program and summarizes the results obtained to date. These results are presented so that users may be aware of the features of CONTAIN that have been checked and of the areas where problems have been identified. In addition, this report provides information needed by users to repeat tests of interest in their specific work areas. The test efforts have identified a substantial number of problems in the coding or logic of the CONTAIN code. Most of these problems have been corrected. These corrections have been included in the most recent versions of the code. CONTAIN can accurately treat most of the phenomena expected to occur in containment atmospheres. Some problems identified by the test program, involving pool-related phenomena, have prompted the development of a substantially new system of models for pool phenomena. When completed, this new system will be subjected to intense testing of the type described here.

  15. Consulting Teacher Code of Ethics.

    ERIC Educational Resources Information Center

    Alnasrawi, Susan C.; Gill, Janet A.

    1981-01-01

    The code of ethics for consulting teachers, developed by Vermont's Consulting Teacher Program (a model for training special educators to act as consultants to regular educators) addresses qualifications, procedures, and the consulting teacher's relationship with students, parents, and other staff. (CL)

  16. Three-dimensional stellarator codes

    PubMed Central

    Garabedian, P. R.

    2002-01-01

    Three-dimensional computer codes have been used to develop quasisymmetric stellarators with modular coils that are promising candidates for a magnetic fusion reactor. The mathematics of plasma confinement raises serious questions about the numerical calculations. Convergence studies have been performed to assess the best configurations. Comparisons with recent data from large stellarator experiments serve to validate the theory. PMID:12140367

  17. Multiplier Architecture for Coding Circuits

    NASA Technical Reports Server (NTRS)

    Wang, C. C.; Truong, T. K.; Shao, H. M.; Deutsch, L. J.

    1986-01-01

    Multipliers based on new algorithm for Galois-field (GF) arithmetic regular and expandable. Pipeline structures used for computing both multiplications and inverses. Designs suitable for implementation in very-large-scale integrated (VLSI) circuits. This general type of inverter and multiplier architecture especially useful in performing finite-field arithmetic of Reed-Solomon error-correcting codes and of some cryptographic algorithms.
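The finite-field arithmetic such circuits implement can be sketched in software. The shift-and-add multiplier below works in GF(2^8) with the primitive polynomial 0x11D, a common choice for Reed-Solomon codes; the paper's specific VLSI algorithm is not reproduced, only the underlying field operations.

```python
def gf256_mul(a, b, poly=0x11D):
    """Multiply in GF(2^8) by carry-less shift-and-add, reducing modulo
    the primitive polynomial x^8 + x^4 + x^3 + x^2 + 1 (0x11D)."""
    result = 0
    while b:
        if b & 1:
            result ^= a          # 'addition' is XOR in characteristic 2
        b >>= 1
        a <<= 1
        if a & 0x100:            # degree reached 8: reduce by the polynomial
            a ^= poly
    return result

def gf256_inv(a, poly=0x11D):
    """Inverse via Fermat: a^(2^8 - 2) = a^254 for nonzero a,
    computed by right-to-left square-and-multiply."""
    r, base, e = 1, a, 254
    while e:
        if e & 1:
            r = gf256_mul(r, base, poly)
        base = gf256_mul(base, base, poly)
        e >>= 1
    return r
```

The same shift/XOR structure is what makes these multipliers regular and pipeline-friendly in hardware.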

  18. QR Codes: Taking Collections Further

    ERIC Educational Resources Information Center

    Ahearn, Caitlin

    2014-01-01

    With some thought and direction, QR (quick response) codes are a great tool to use in school libraries to enhance access to information. From March through April 2013, Caitlin Ahearn interned at Sanborn Regional High School (SRHS) under the supervision of Pam Harland. As a result of Harland's un-Deweying of the nonfiction collection at SRHS,…

  19. Overview of CODE V development

    NASA Astrophysics Data System (ADS)

    Harris, Thomas I.

    1991-01-01

This paper is part of a session that is aimed at briefly describing some of today's optical design software packages, with emphasis on the program's philosophy and technology. CODE V is the ongoing result of a development process that began in the 1960's; it is now the result of many people's efforts. This paper summarizes the roots of the program, some of its history, and the dominant philosophies and technologies that have contributed to its usefulness and some that drive its continued development. ROOTS OF CODE V: Conceived in the early 60's, at a time when there was skepticism that "automatic design" could design lenses equal to or better than "hand" methods. The concepts underlying CODE V and its predecessors were based on ten years of experience and exposure to the problems of a group of lens designers in a design-for-manufacture environment. The basic challenge was to show that lens design could be done better, easier, and faster by high-quality computer-assisted design tools. The earliest development was for our own use as an engineering services organization: an in-house tool for custom design. As a tool it had to make us efficient in providing lens design and engineering services as a self-sustaining business. PHILOSOPHY OF OPTIMIZATION IN CODE V: Error function formation. Based on experience as a designer, we felt very strongly that there should be a clear separation of

  20. The NESSUS finite element code

    NASA Technical Reports Server (NTRS)

    Dias, J. B.; Nagiegaal, J. C.; Nakazawa, S.

    1987-01-01

    The objective of this development is to provide a new analysis tool which integrates the structural modeling versatility of a modern finite element code with the latest advances in the area of probabilistic modeling and structural reliability. Version 2.0 of the NESSUS finite element code was released last February, and is currently being exercised on a set of problems which are representative of typical Space Shuttle Main Engine (SSME) applications. NESSUS 2.0 allows linear elastostatic and eigenvalue analysis of structures with uncertain geometry, material properties and boundary conditions, which are subjected to a random mechanical and thermal loading environment. The NESSUS finite element code is a key component in a broader software system consisting of five major modules. NESSUS/EXPERT is an expert system under development at Southwest Research Institute, with the objective of centralizing all component-specific knowledge useful for conducting probabilistic analysis of typical Space Shuttle Main Engine (SSME) components. NESSUS/FEM contains the finite element code used for the structural analysis and parameter sensitivity evaluation of these components. The task of parametrizing a finite element mesh in terms of the random variables present is facilitated with the use of the probabilistic data preprocessor in NESSUS/PRE. An external database file is used for managing the bulk of the data generated by NESSUS/FEM.

  1. Generating Constant Weight Binary Codes

    ERIC Educational Resources Information Center

    Knight, D.G.

    2008-01-01

    The determination of bounds for A(n, d, w), the maximum possible number of binary vectors of length n, weight w, and pairwise Hamming distance no less than d, is a classic problem in coding theory. Such sets of vectors have many applications. A description is given of how the problem can be used in a first-year undergraduate computational…
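The quantity A(n, d, w) described above can be explored with a short greedy search of the kind suitable for such an undergraduate exercise. Greedy scanning only gives a lower bound on A(n, d, w) in general, though for the two tiny cases checked below it matches the known values.

```python
from itertools import combinations

def greedy_constant_weight(n, d, w):
    """Greedy lower bound for A(n, d, w): scan all weight-w binary vectors
    of length n in lexicographic order of their support, keeping each one
    whose Hamming distance to every kept vector is at least d."""
    kept = []
    for ones in combinations(range(n), w):
        v = sum(1 << i for i in ones)          # vector as an integer bitmask
        if all(bin(v ^ u).count("1") >= d for u in kept):
            kept.append(v)
    return kept
```

For example, all six weight-2 vectors of length 4 are pairwise at distance at least 2, so the greedy search keeps them all, matching A(4, 2, 2) = 6.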

  2. Reusable State Machine Code Generator

    NASA Astrophysics Data System (ADS)

    Hoffstadt, A. A.; Reyes, C.; Sommer, H.; Andolfato, L.

    2010-12-01

The State Machine model is frequently used to represent the behaviour of a system, allowing one to express and execute this behaviour in a deterministic way. A graphical representation such as a UML State Chart diagram tames the complexity of the system, thus facilitating changes to the model and communication between developers and domain experts. We present a reusable state machine code generator, developed by the Universidad Técnica Federico Santa María and the European Southern Observatory. The generator itself is based on the open source project architecture, and uses UML State Chart models as input. This allows for a modular design and a clean separation between generator and generated code. The generated state machine code has well-defined interfaces that are independent of the implementation artefacts such as the middle-ware. This allows using the generator in the substantially different observatory software of the Atacama Large Millimeter Array and the ESO Very Large Telescope. A project-specific mapping layer for event and transition notification connects the state machine code to its environment, which can be the Common Software of these projects, or any other project. This approach even makes it possible to automatically create tests for a generated state machine, using techniques from software testing such as path coverage.
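A minimal sketch of the kind of table-driven state machine such a generator might emit is shown below; the states, events, and actions are hypothetical, and the real generator's mapping-layer and middleware interfaces are not modeled.

```python
class StateMachine:
    """Table-driven state machine: a dictionary maps (state, event)
    pairs to (next_state, action) pairs."""

    def __init__(self, initial, transitions):
        self.state = initial
        self.transitions = transitions   # (state, event) -> (next_state, action)

    def dispatch(self, event):
        key = (self.state, event)
        if key not in self.transitions:
            return False                 # event not accepted in this state
        next_state, action = self.transitions[key]
        if action:
            action()                     # run the transition's action
        self.state = next_state
        return True

# Hypothetical example: a two-state on/off controller.
log = []
sm = StateMachine("OFF", {
    ("OFF", "power"): ("ON",  lambda: log.append("started")),
    ("ON",  "power"): ("OFF", lambda: log.append("stopped")),
})
```

Because the behaviour lives in the transition table rather than in hand-written branching, a generator can emit it mechanically from a UML State Chart model, and tests can enumerate paths through the table.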

  3. FORTRAN Static Source Code Analyzer

    NASA Technical Reports Server (NTRS)

    Merwarth, P.

    1984-01-01

FORTRAN Static Source Code Analyzer program, SAP (DEC VAX version), automatically gathers statistics on occurrences of statements and structures within a FORTRAN program and provides reports of those statistics. Provisions are made for weighting each statistic and providing an overall figure of complexity.

  4. FORTRAN Static Source Code Analyzer

    NASA Technical Reports Server (NTRS)

    Merwarth, P.

    1982-01-01

    FORTRAN Static Source Code Analyzer program (SAP) automatically gathers and reports statistics on occurrences of statements and structures within FORTRAN program. Provisions are made for weighting each statistic, providing user with overall figure of complexity. Statistics, as well as figures of complexity, are gathered on module-by-module basis. Overall summed statistics are accumulated for complete input source file.
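The statistic-gathering and weighting scheme described in these two SAP records can be sketched as follows. The keyword list, weights, and matching rules are hypothetical illustrations, not SAP's actual tables; the sketch only counts statement-initial keywords in fixed-form source.

```python
import re
from collections import Counter

STATEMENT_KEYWORDS = ("IF", "DO", "GOTO", "CALL", "RETURN")
WEIGHTS = {"IF": 2.0, "DO": 2.0, "GOTO": 3.0, "CALL": 1.5, "RETURN": 1.0}

def gather_statistics(source):
    """Count occurrences of selected FORTRAN statement keywords,
    skipping fixed-form comment lines (C, !, or * in column 1)."""
    counts = Counter()
    for raw in source.upper().splitlines():
        if raw[:1] in ("C", "!", "*"):
            continue
        line = raw.strip()
        for kw in STATEMENT_KEYWORDS:
            # Allow an optional numeric statement label before the keyword.
            if re.match(rf"(\d+\s+)?{kw}\b", line):
                counts[kw] += 1
    return counts

def complexity(counts, weights=WEIGHTS):
    """Overall figure of complexity: a weighted sum of the statistics."""
    return sum(weights.get(kw, 1.0) * n for kw, n in counts.items())
```

Running the same counters per subroutine would give the module-by-module statistics the 1982 record describes, with the weighted sum as the overall complexity figure.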

  5. AEDS Property Classification Code Manual.

    ERIC Educational Resources Information Center

    Association for Educational Data Systems, Washington, DC.

    The control and inventory of property items using data processing machines requires a form of numerical description or code which will allow a maximum of description in a minimum of space on the data card. An adaptation of a standard industrial classification system is given to cover any expendable warehouse item or non-expendable piece of…

  6. Authentication codes that permit arbitration

    SciTech Connect

    Simmons, G.J.

    1987-01-01

Objective of authentication is to detect attempted deceptions in a communications channel. Traditionally this has been restricted to providing the authorized receiver with a capability of detecting unauthentic messages. The known codes have all left open the possibility for either the transmitter to disavow a message that he actually sent to the receiver, i.e., an authentic message, or else for the receiver to falsely attribute a message of his own devising to the transmitter. Of course the party being deceived would know that he was the victim of a deception by the other, but would be unable to "prove" this to a third party. Ideally, authentication should provide a means to detect attempted deceptions by insiders (the transmitter or receiver) as well as outsiders (the opponent). It has been an open question whether it was possible to devise authentication codes that would permit a third party, an arbiter, to decide (in probability) whether the transmitter or the receiver was cheating in the event of a dispute. We answer this question in the affirmative, first by constructing an example of an authentication code that permits the receiver to detect outsider deceptions as well as permitting a designated arbiter to detect insider deceptions, and then by generalizing this construction to an infinite class of such codes.
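Simmons' arbitrable construction is not reproduced here; the sketch below only illustrates the basic two-party authentication-code setting that arbitrated codes extend. With a shared key (a, b) over GF(p) and tag t = a + b*m (mod p), an opponent who substitutes a message without knowing the key succeeds with probability only 1/p. The prime and key values are toy parameters.

```python
P = 1009   # a small prime; toy parameter for illustration only

def tag(message, key):
    """Authentication tag t = a + b*m (mod p) under shared key (a, b)."""
    a, b = key
    return (a + b * message) % P

def verify(message, t, key):
    """Receiver accepts only if the tag matches the shared-key computation."""
    return tag(message, key) == t
```

Because transmitter and receiver share the same key here, neither can be held accountable by a third party; resolving that dispute is exactly what the arbitrated codes of the paper add.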

  7. Tri-Coding of Information.

    ERIC Educational Resources Information Center

    Simpson, Timothy J.

Paivio's Dual Coding Theory has received widespread recognition for its connection between visual and aural channels of internal information processing. The use of only two channels, however, cannot satisfactorily explain the effects witnessed every day. This paper presents a study suggesting the presence of a third, kinesthetic channel, currently…

  8. Dress Codes and Gang Activity.

    ERIC Educational Resources Information Center

    Gluckman, Ivan B.

    1996-01-01

    Concern with school violence and efforts to reduce gang visibility at school have led to controversy about students' constitutional rights to freedom of expression. This document outlines legal precedents and offers guidelines for developing a sound school policy on dress codes. It answers the following questions: (1) Are gang clothing and symbols…

  9. Final Technical Report: Hydrogen Codes and Standards Outreach

    SciTech Connect

    Hall, Karen I.

    2007-05-12

    This project contributed significantly to the development of new codes and standards, both domestically and internationally. The NHA collaborated with codes and standards development organizations to identify technical areas of expertise that would be required to produce the codes and standards that industry and DOE felt were required to facilitate commercialization of hydrogen and fuel cell technologies and infrastructure. NHA staff participated directly in technical committees and working groups where issues could be discussed with the appropriate industry groups. In other cases, the NHA recommended specific industry experts to serve on technical committees and working groups where the need for this specific industry expertise would be on-going, and where this approach was likely to contribute to timely completion of the effort. The project also facilitated dialog between codes and standards development organizations, hydrogen and fuel cell experts, the government and national labs, researchers, code officials, industry associations, as well as the public regarding the timeframes for needed codes and standards, industry consensus on technical issues, procedures for implementing changes, and general principles of hydrogen safety. The project facilitated hands-on learning, as participants in several NHA workshops and technical meetings were able to experience hydrogen vehicles, witness hydrogen refueling demonstrations, see metal hydride storage cartridges in operation, and view other hydrogen energy products.

  10. Soft decision decoding of block codes

    NASA Technical Reports Server (NTRS)

    Baumert, L. D.; Mceliece, R. J.

    1978-01-01

    Using a general decoding technique of Solomon we evaluate the performance of certain block codes on a Gaussian channel. Quadratic residue codes of lengths 48 and 80 as well as BCH codes of length 128 and rates 1/2 and 1/3 are considered. All four of these codes perform quite favorably with respect to the constraint-length 7 rate 1/2 convolutional code presently used on NASA's Mariner-class spacecraft.
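Soft-decision decoding of the kind evaluated in this record can be illustrated with a maximum-correlation decoder: on a Gaussian channel, the maximum-likelihood choice is the codeword (mapped 0 -> +1, 1 -> -1) whose antipodal image correlates best with the received real-valued samples. The toy (3, 1) repetition codebook below is for illustration only; it is not one of the codes studied in the paper.

```python
def soft_decision_decode(received, codebook):
    """Pick the codeword whose antipodal image (0 -> +1, 1 -> -1) has the
    highest correlation with the received real-valued channel samples."""
    def correlation(codeword):
        return sum(r * (1 - 2 * b) for r, b in zip(received, codeword))
    return max(codebook, key=correlation)

# Toy codebook: the (3, 1) binary repetition code.
codebook = [(0, 0, 0), (1, 1, 1)]
```

For the samples [-0.1, -0.1, 2.0], hard decisions give bits 1, 1, 0 and a majority vote of 1, but the soft decoder returns (0, 0, 0) because the single strong +2.0 sample outweighs the two weak negative ones; this extra reliability information is the source of soft decoding's gain.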

  11. Reed-solomon Code Synchronization Revisited

    NASA Technical Reports Server (NTRS)

    Deutsch, L. J.

    1985-01-01

A concatenated coding system consisting of an inner (7, 1/2) convolutional code and an outer (255, 223) Reed-Solomon code was recommended by the Consultative Committee for Space Data Systems for cross-supported space missions. The Reed-Solomon code that was chosen makes use of the Berlekamp encoding algorithm. Some peculiarities of this code that could give rise to synchronization problems are examined. Suggestions are given to alleviate these problems.

  12. Fluid Film Bearing Code Development

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The next generation of rocket engine turbopumps is being developed by industry through Government-directed contracts. These turbopumps will use fluid film bearings because they eliminate the life and shaft-speed limitations of rolling-element bearings, increase turbopump design flexibility, and reduce the need for turbopump overhauls and maintenance. The design of the fluid film bearings for these turbopumps, however, requires sophisticated analysis tools to model the complex physical behavior characteristic of fluid film bearings operating at high speeds with low viscosity fluids. State-of-the-art analysis and design tools are being developed at the Texas A&M University under a grant guided by the NASA Lewis Research Center. The latest version of the code, HYDROFLEXT, is a thermohydrodynamic bulk flow analysis with fluid compressibility, full inertia, and fully developed turbulence models. It can predict the static and dynamic force response of rigid and flexible pad hydrodynamic bearings and of rigid and tilting pad hydrostatic bearings. The Texas A&M code is a comprehensive analysis tool, incorporating key fluid phenomenon pertinent to bearings that operate at high speeds with low-viscosity fluids typical of those used in rocket engine turbopumps. Specifically, the energy equation was implemented into the code to enable fluid properties to vary with temperature and pressure. This is particularly important for cryogenic fluids because their properties are sensitive to temperature as well as pressure. As shown in the figure, predicted bearing mass flow rates vary significantly depending on the fluid model used. Because cryogens are semicompressible fluids and the bearing dynamic characteristics are highly sensitive to fluid compressibility, fluid compressibility effects are also modeled. The code contains fluid properties for liquid hydrogen, liquid oxygen, and liquid nitrogen as well as for water and air. 
Other fluids can be handled by the code provided that the

  13. The Da Vinci code dynamically de-coded.

    PubMed

    Cohen, Mariam

    2005-01-01

The novel The Da Vinci Code, by Dan Brown, has been on best-seller lists for over two years. An examination of Brown's previous novels reveals a well-designed plot line shared by all four novels that not only makes them good "thrillers" but also creates a mythological structure to the novels that draws on common unconscious fantasies in the same way that fairy tales do. One aspect of this mythological structure is the use of evil conspiracies (and benign ones as well) for the protagonist to overcome. In addition, The Da Vinci Code presents a religious theme involving legends about Mary Magdalene. This theme touches on the role of a feminine aspect to divinity in allowing for an erotic connection with the divine. PMID:16448349

  14. The neuronal code(s) of the cerebellum.

    PubMed

    Heck, Detlef H; De Zeeuw, Chris I; Jaeger, Dieter; Khodakhah, Kamran; Person, Abigail L

    2013-11-01

    Understanding how neurons encode information in sequences of action potentials is of fundamental importance to neuroscience. The cerebellum is widely recognized for its involvement in the coordination of movements, which requires muscle activation patterns to be controlled with millisecond precision. Understanding how cerebellar neurons accomplish such high temporal precision is critical to understanding cerebellar function. Inhibitory Purkinje cells, the only output neurons of the cerebellar cortex, and their postsynaptic target neurons in the cerebellar nuclei, fire action potentials at high, sustained frequencies, suggesting spike rate modulation as a possible code. Yet, millisecond precise spatiotemporal spike activity patterns in Purkinje cells and inferior olivary neurons have also been observed. These results and ongoing studies suggest that the neuronal code used by cerebellar neurons may span a wide time scale from millisecond precision to slow rate modulations, likely depending on the behavioral context. PMID:24198351

  15. The chromatin regulatory code: Beyond a histone code

    NASA Astrophysics Data System (ADS)

    Lesne, A.

    2006-03-01

    In this commentary on the contribution by Arndt Benecke in this issue, I discuss why the notion of “chromatin code” introduced and elaborated in this paper is to be preferred to that of “histone code”. Speaking of a code as regards nucleosome conformation and histone tail post-translational modifications only makes sense within the chromatin fiber, where their physico-chemical features can be translated into regulatory programs at the genome level, by means of a complex, multi-level interplay with the fiber architecture and dynamics settled in the course of Evolution. In particular, this chromatin code presumably exploits allosteric transitions of the chromatin fiber. The chromatin structure dependence of its translation suggests two alternative modes of transcription initiation regulation, also proposed in the paper by A. Benecke in this issue for interpreting strikingly bimodal micro-array data.

  16. HETC radiation transport code development for cosmic ray shielding applications in space.

    PubMed

    Townsend, L W; Miller, T M; Gabriel, Tony A

    2005-01-01

    In order to facilitate three-dimensional analyses of space radiation shielding scenarios for future space missions, the Monte Carlo radiation transport code HETC is being extended to include transport of energetic heavy ions, such as are found in the galactic cosmic ray spectrum in space. Recently, an event generator capable of providing nuclear interaction data for use in HETC was developed and incorporated into the code. The event generator predicts the interaction product yields and production angles and energies using nuclear models and Monte Carlo techniques. Testing and validation of the extended transport code has begun. In this work, the current status of code modifications, which enable energetic heavy ions and their nuclear reaction products to be transported through thick shielding, are described. Also, initial results of code testing against available laboratory beam data for energetic heavy ions interacting in thick targets are presented. PMID:16604614

  17. Modeling Guidelines for Code Generation in the Railway Signaling Context

    NASA Technical Reports Server (NTRS)

    Ferrari, Alessio; Bacherini, Stefano; Fantechi, Alessandro; Zingoni, Niccolo

    2009-01-01

    Modeling guidelines constitute one of the fundamental cornerstones for Model Based Development. Their relevance is essential when dealing with code generation in the safety-critical domain. This article presents the experience of a railway signaling systems manufacturer on this issue. Introduction of Model-Based Development (MBD) and code generation in the industrial safety-critical sector created a crucial paradigm shift in the development process of dependable systems. While traditional software development focuses on the code, with MBD practices the focus shifts to model abstractions. The change has fundamental implications for safety-critical systems, which still need to guarantee a high degree of confidence also at code level. Usage of the Simulink/Stateflow platform for modeling, which is a de facto standard in control software development, does not ensure by itself production of high-quality dependable code. This issue has been addressed by companies through the definition of modeling rules imposing restrictions on the usage of design tools components, in order to enable production of qualified code. The MAAB Control Algorithm Modeling Guidelines (MathWorks Automotive Advisory Board)[3] is a well-established set of publicly available rules for modeling with Simulink/Stateflow. This set of recommendations has been developed by a group of OEMs and suppliers of the automotive sector with the objective of enforcing and easing the usage of the MathWorks tools within the automotive industry. The guidelines have been published in 2001 and afterwards revisited in 2007 in order to integrate some additional rules developed by the Japanese division of MAAB [5]. The scope of the current edition of the guidelines ranges from model maintainability and readability to code generation issues. The rules are conceived as a reference baseline and therefore they need to be tailored to comply with the characteristics of each industrial context. Customization of these

  18. Amino acid codes in mitochondria as possible clues to primitive codes

    NASA Technical Reports Server (NTRS)

    Jukes, T. H.

    1981-01-01

    Differences between mitochondrial codes and the universal code indicate that an evolutionary simplification has taken place, rather than a return to a more primitive code. However, these differences make it evident that the universal code is not the only code possible, and therefore earlier codes may have differed markedly from the present code. The present universal code is probably a 'frozen accident.' The change in CUN codons from leucine to threonine (Neurospora vs. yeast mitochondria) indicates that neutral or near-neutral changes occurred in the corresponding proteins when this code change took place, caused presumably by a mutation in a tRNA gene.

  19. Quantum generalized Reed-Solomon codes: Unified framework for quantum maximum-distance-separable codes

    NASA Astrophysics Data System (ADS)

    Li, Zhuo; Xing, Li-Juan; Wang, Xin-Mei

    2008-01-01

    We construct a family of quantum maximum-distance-separable (MDS) codes from classical generalized Reed-Solomon codes and derive the necessary and sufficient condition under which these quantum codes exist. We also give code bounds and show how to construct them analytically. We find that existing quantum MDS codes can be unified under these codes in the sense that when a quantum MDS code exists, then a quantum code of this type with the same parameters also exists. Thus, as far as is known at present, they are the most important family of quantum MDS codes.
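    The MDS property d = n − k + 1 that these quantum codes inherit from classical generalized Reed-Solomon (GRS) codes can be checked by brute force for a small classical example. The prime field GF(7), the evaluation points, and the column multipliers below are illustrative choices, not parameters from the paper.

```python
import itertools

# Classical generalized Reed-Solomon code over GF(p): codewords are
# (v_0*f(a_0), ..., v_{n-1}*f(a_{n-1})) for all polynomials f, deg(f) < k.
p = 7                     # prime field GF(7) (illustrative choice)
a = [1, 2, 3, 4, 5, 6]    # distinct evaluation points
v = [1, 1, 2, 3, 1, 5]    # nonzero column multipliers (arbitrary here)
n, k = len(a), 3

def encode(msg):          # msg = polynomial coefficients, low degree first
    return [v[i] * sum(c * pow(a[i], j, p) for j, c in enumerate(msg)) % p
            for i in range(n)]

# Brute-force the minimum Hamming weight over all nonzero messages:
# a nonzero poly of degree < k has at most k-1 roots, so weight >= n-k+1.
weights = [sum(s != 0 for s in encode(m))
           for m in itertools.product(range(p), repeat=k)
           if any(m)]
print("min distance:", min(weights), " n-k+1 =", n - k + 1)  # MDS: equal
```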

  20. Genetic coding and gene expression - new Quadruplet genetic coding model

    NASA Astrophysics Data System (ADS)

    Shankar Singh, Rama

    2012-07-01

    Successful demonstration of the human genome project has opened the door not only for developing personalized medicine and cures for genetic diseases, but it may also answer the complex and difficult question of the origin of life. It may lead to making the 21st century a century of Biological Sciences as well. Based on the central dogma of Biology, genetic codons in conjunction with tRNA play a key role in translating the RNA bases into a sequence of amino acids, leading to a synthesized protein. This is the most critical step in synthesizing the right protein needed for personalized medicine and curing genetic diseases. So far, only triplet codons involving three bases of RNA, transcribed from DNA bases, have been used. Since this approach has several inconsistencies and limitations, even the promise of personalized medicine has not been realized. The new Quadruplet genetic coding model proposed and developed here involves all four RNA bases which in conjunction with tRNA will synthesize the right protein. The transcription and translation process used will be the same, but the Quadruplet codons will help overcome most of the inconsistencies and limitations of the triplet codes. Details of this new Quadruplet genetic coding model and its subsequent potential applications including relevance to the origin of life will be presented.
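    The codon-space arithmetic behind the proposal is simple to state; this is a back-of-the-envelope illustration of the counting, not an endorsement of the model:

```python
# Codon-space arithmetic: the four RNA bases (A, C, G, U) read in
# groups of three (standard code) versus groups of four (proposed model).
bases = 4
triplet_codons = bases ** 3       # 64 codons in the standard code
quadruplet_codons = bases ** 4    # 256 codons in a quadruplet code
amino_acids = 20

print(triplet_codons, quadruplet_codons)     # 64 256
print(triplet_codons / amino_acids,          # average codons available
      quadruplet_codons / amino_acids)       # per amino acid: 3.2 vs 12.8
```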

  1. Coding and transmission of subband coded images on the Internet

    NASA Astrophysics Data System (ADS)

    Wah, Benjamin W.; Su, Xiao

    2001-09-01

    Subband-coded images can be transmitted over the Internet using either the TCP or the UDP protocol. Delivery by TCP gives superior decoding quality but with very long delays when the network is unreliable, whereas delivery by UDP has negligible delays but with degraded quality when packets are lost. Although images are delivered currently over the Internet by TCP, we study in this paper the use of UDP to deliver multi-description reconstruction-based subband-coded images. First, in order to facilitate recovery from UDP packet losses, we propose a joint sender-receiver approach for designing optimized reconstruction-based subband transform (ORB-ST) in multi-description coding (MDC). Second, we carefully evaluate the delay-quality trade-offs between the TCP delivery of single-description coded (SDC) images and the UDP and combined TCP/UDP delivery of MDC images. Experimental results show that our proposed ORB-ST performs well in real Internet tests, and UDP and combined TCP/UDP delivery of MDC images provide a range of attractive alternatives to TCP delivery.
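    A minimal sketch of the multi-description idea (a generic even/odd polyphase split, not the paper's optimized reconstruction-based subband transform): if the two descriptions travel in separate UDP packets and one is lost, the receiver can still reconstruct an approximation from the surviving one.

```python
import numpy as np

# Toy two-description coding: split samples into two descriptions sent
# in separate UDP packets; if one is lost, interpolate the missing samples.
signal = np.sin(np.linspace(0, 2 * np.pi, 64))
desc_even, desc_odd = signal[0::2], signal[1::2]   # the two descriptions

# Simulate losing the odd description: estimate each odd sample as the
# average of its two even-indexed neighbors.
recovered = np.empty_like(signal)
recovered[0::2] = desc_even
recovered[1::2] = 0.5 * (desc_even + np.roll(desc_even, -1))

# Interior error only (the wrap-around estimate at the end is meaningless).
err_lost = float(np.max(np.abs(recovered - signal)[:-2]))
print("max interior error with one description lost:", round(err_lost, 3))
```

With both descriptions the reconstruction is exact; with one lost, quality degrades gracefully instead of failing, which is the trade-off the paper exploits for UDP delivery.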

  2. Box codes of lengths 48 and 72

    NASA Technical Reports Server (NTRS)

    Solomon, G.; Jin, Y.

    1993-01-01

    A self-dual code length 48, dimension 24, with Hamming distance essentially equal to 12 is constructed here. There are only six code words of weight eight. All the other code words have weights that are multiples of four and have a minimum weight equal to 12. This code may be encoded systematically and arises from a strict binary representation of the (8,4;5) Reed-Solomon (RS) code over GF (64). The code may be considered as six interrelated (8,7;2) codes. The Mattson-Solomon representation of the cyclic decomposition of these codes and their parity sums are used to detect an odd number of errors in any of the six codes. These may then be used in a correction algorithm for hard or soft decision decoding. A (72,36;15) box code was constructed from a (63,35;8) cyclic code. The theoretical justification is presented herein. A second (72,36;15) code is constructed from an inner (63,27;16) Bose Chaudhuri Hocquenghem (BCH) code and expanded to length 72 using box code algorithms for extension. This code was simulated and verified to have a minimum distance of 15 with even weight words congruent to zero modulo four. The decoding for hard and soft decision is still more complex than the first code constructed above. Finally, an (8,4;5) RS code over GF (512) in the binary representation of the (72,36;15) box code gives rise to a (72,36;16*) code with nine words of weight eight, and all the rest have weights greater than or equal to 16.
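    Weight-distribution claims like those above can be verified by enumerating codewords. As a small stand-in for the length-48 and length-72 constructions, here is the brute-force weight enumerator of the self-dual [8,4,4] extended Hamming code (a standard textbook example, not a code from the paper):

```python
import itertools
import numpy as np

# Brute-force weight distribution of a small binary linear code:
# the self-dual [8,4,4] extended Hamming code, generator G = [I | A].
G = np.array([[1, 0, 0, 0, 0, 1, 1, 1],
              [0, 1, 0, 0, 1, 0, 1, 1],
              [0, 0, 1, 0, 1, 1, 0, 1],
              [0, 0, 0, 1, 1, 1, 1, 0]])

weights = {}
for m in itertools.product([0, 1], repeat=4):
    w = int(np.sum(np.dot(m, G) % 2))          # Hamming weight of codeword
    weights[w] = weights.get(w, 0) + 1

print(sorted(weights.items()))  # weight enumerator 1 + 14z^4 + z^8
```

All nonzero weights are multiples of four, the same doubly-even structure the abstract reports for its length-48 code.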

  3. Box codes of lengths 48 and 72

    NASA Astrophysics Data System (ADS)

    Solomon, G.; Jin, Y.

    1993-11-01

    A self-dual code length 48, dimension 24, with Hamming distance essentially equal to 12 is constructed here. There are only six code words of weight eight. All the other code words have weights that are multiples of four and have a minimum weight equal to 12. This code may be encoded systematically and arises from a strict binary representation of the (8,4;5) Reed-Solomon (RS) code over GF (64). The code may be considered as six interrelated (8,7;2) codes. The Mattson-Solomon representation of the cyclic decomposition of these codes and their parity sums are used to detect an odd number of errors in any of the six codes. These may then be used in a correction algorithm for hard or soft decision decoding. A (72,36;15) box code was constructed from a (63,35;8) cyclic code. The theoretical justification is presented herein. A second (72,36;15) code is constructed from an inner (63,27;16) Bose Chaudhuri Hocquenghem (BCH) code and expanded to length 72 using box code algorithms for extension. This code was simulated and verified to have a minimum distance of 15 with even weight words congruent to zero modulo four. The decoding for hard and soft decision is still more complex than the first code constructed above. Finally, an (8,4;5) RS code over GF (512) in the binary representation of the (72,36;15) box code gives rise to a (72,36;16*) code with nine words of weight eight, and all the rest have weights greater than or equal to 16.

  4. The Mystery Behind the Code: Differentiated Instruction with Quick Response Codes in Secondary Physical Education

    ERIC Educational Resources Information Center

    Adkins, Megan; Wajciechowski, Misti R.; Scantling, Ed

    2013-01-01

    Quick response codes, better known as QR codes, are small barcodes scanned to receive information about a specific topic. This article explains QR code technology and the utility of QR codes in the delivery of physical education instruction. Consideration is given to how QR codes can be used to accommodate learners of varying ability levels as…

  5. Optical code division multiplexed fiber Bragg grating sensing networks

    NASA Astrophysics Data System (ADS)

    Triana, Cristian; Varón, Margarita; Pastor, Daniel

    2015-09-01

    We present the application of Optical Code Division Multiplexing (OCDM) techniques in order to enhance the spectral operation and detection capability of fiber Bragg grating (FBG) sensor networks even under overlapping conditions. In this paper, Optical Orthogonal Codes (OOC) are used to design FBG sensors composed of more than one reflection band. Simulation of the interaction between the encoded Gaussian-shaped sensors is presented. Signal decoding is performed in the electrical domain without requiring additional optical components by means of the autocorrelation product between the reflected spectrum and each sensor-codeword. Results illustrate the accuracy and distinction capability of the method.
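    A toy version of the correlation-based decoding can be sketched numerically. The band positions, widths, and wavelength shifts below are invented for illustration; real systems use optical orthogonal codes chosen for guaranteed auto- and cross-correlation bounds.

```python
import numpy as np

# Toy correlation decoding of code-division-multiplexed FBG sensors.
wl = np.arange(0.0, 100.0, 0.1)                 # wavelength grid (a.u.)
code_a, code_b = [10, 40, 70], [25, 55, 90]     # reflection-band centers

def spectrum(centers, shift=0.0, width=0.5):
    # Multi-band Gaussian reflection spectrum, displaced by `shift`.
    return sum(np.exp(-((wl - c - shift) / width) ** 2) for c in centers)

# Measured spectrum: both sensors overlap, each strained differently.
measured = spectrum(code_a, shift=2.0) + spectrum(code_b, shift=-1.0)

def decode(code):
    # Correlate the measured spectrum with the sensor's codeword template
    # over candidate shifts; the peak estimates that sensor's shift.
    shifts = np.arange(-5, 5, 0.1)
    corr = [float(np.dot(measured, spectrum(code, s))) for s in shifts]
    return float(shifts[int(np.argmax(corr))])

print("sensor A shift:", round(decode(code_a), 1))  # close to +2.0
print("sensor B shift:", round(decode(code_b), 1))  # close to -1.0
```

Because the two codewords share no band positions within the search range, each correlation peak isolates its own sensor even though the raw spectra overlap.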

  6. CODE's multi-GNSS orbit and clock solution - status 2016

    NASA Astrophysics Data System (ADS)

    Prange, Lars; Orliac, Etienne; Dach, Rolf; Arnold, Daniel; Beutler, Gerhard; Schaer, Stefan; Jäggi, Adrian

    2016-04-01

    The Center for Orbit Determination in Europe (CODE) is contributing as a global analysis center to the International GNSS Service (IGS). Since 2012 CODE also contributes to the Multi-GNSS-EXperiment (MGEX) of the IGS. The list of satellite systems included in the CODE MGEX (COM) orbit and clock solution has been extended step-by-step in recent years. Today, it includes five satellite systems, namely GPS, GLONASS, Galileo, BeiDou, and QZSS. The COM orbit and clock products are regularly updated at the IGS MGEX products directory of the CDDIS data center and at the ftp server of the AIUB. CODE's experimental MGEX solution is subject to frequent updates and improvements. The introduction of an improved solar radiation pressure (SRP) model in early 2015 significantly improved the orbits and clock corrections of satellites with elongated bodies (in particular GLONASS, Galileo, and QZSS) as long as the satellite's attitude is maintained by yaw-steering. Currently we focus on improving the orbits of QZSS and BeiDou satellites, while moving in the orbit normal mode. The COM orbits are validated by computing orbit misclosures at the day boundaries and by SLR residuals. The COM clocks are validated using the Allan deviations and linear fits through the time series of epoch-wise clock corrections. We present the current status of the COM products and the validation results.

  7. Biological Information Transfer Beyond the Genetic Code: The Sugar Code

    NASA Astrophysics Data System (ADS)

    Gabius, H.-J.

    In the era of genetic engineering, cloning, and genome sequencing the focus of research on the genetic code has received an even further accentuation in the public eye. In attempting, however, to understand intra- and intercellular recognition processes comprehensively, the two biochemical dimensions established by nucleic acids and proteins are not sufficient to satisfactorily explain all molecular events in, for example, cell adhesion or routing. The consideration of further code systems is essential to bridge this gap. A third biochemical alphabet forming code words with an information storage capacity second to no other substance class in rather small units (words, sentences) is established by monosaccharides (letters). As hardware oligosaccharides surpass peptides by more than seven orders of magnitude in the theoretical ability to build isomers, when the total of conceivable hexamers is calculated. In addition to the sequence complexity, the use of magnetic resonance spectroscopy and molecular modeling has been instrumental in discovering that even small glycans can often reside in not only one but several distinct low-energy conformations (keys). Intriguingly, conformers can display notably different capacities to fit snugly into the binding site of nonhomologous receptors (locks). This process, experimentally verified for two classes of lectins, is termed "differential conformer selection." It adds potential for shifts of the conformer equilibrium to modulate ligand properties dynamically and reversibly to the well-known changes in sequence (including anomeric positioning and linkage points) and in pattern of substitution, for example, by sulfation. In the intimate interplay with sugar receptors (lectins, enzymes, and antibodies) the message of coding units of the sugar code is deciphered. Their recognition will trigger postbinding signaling and the intended biological response. Knowledge about the driving forces for the molecular rendezvous, i

  8. Maximal dinucleotide and trinucleotide circular codes.

    PubMed

    Michel, Christian J; Pellegrini, Marco; Pirillo, Giuseppe

    2016-01-21

    We determine here the number and the list of maximal dinucleotide and trinucleotide circular codes. We prove that there is no maximal dinucleotide circular code having strictly less than 6 elements (maximum size of dinucleotide circular codes). On the other hand, a computer calculation shows that there are maximal trinucleotide circular codes with less than 20 elements (maximum size of trinucleotide circular codes). More precisely, there are maximal trinucleotide circular codes with 14, 15, 16, 17, 18 and 19 elements and no maximal trinucleotide circular code having less than 14 elements. We give the same information for the maximal self-complementary dinucleotide and trinucleotide circular codes. The amino acid distribution of maximal trinucleotide circular codes is also determined. PMID:26382231
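    Whether a given trinucleotide set is a circular code can be tested mechanically. The sketch below uses the graph-theoretic criterion of Fimmel, Michel and Strüngmann (a trinucleotide code is circular if and only if its associated graph is acyclic); it is a minimal illustration, not the enumeration machinery of the paper.

```python
# Graph criterion for trinucleotide circular codes: for each codon
# n1n2n3, add edges n1 -> n2n3 and n1n2 -> n3; the code is circular
# iff the resulting directed graph has no cycle.
def is_circular(code):
    edges = {}
    for c in code:
        edges.setdefault(c[0], set()).add(c[1:])   # n1 -> n2n3
        edges.setdefault(c[:2], set()).add(c[2])   # n1n2 -> n3

    WHITE, GRAY, BLACK = 0, 1, 2                   # DFS cycle detection
    color = {}

    def dfs(u):
        color[u] = GRAY
        for w in edges.get(u, ()):
            if color.get(w, WHITE) == GRAY:
                return False                       # back edge: cycle found
            if color.get(w, WHITE) == WHITE and not dfs(w):
                return False
        color[u] = BLACK
        return True

    return all(color.get(u, WHITE) != WHITE or dfs(u) for u in list(edges))

print(is_circular({"AAA"}))          # False: a periodic codon forces a cycle
print(is_circular({"ACG", "TAC"}))   # True
```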

  9. Modulation transfer function of bar code scanning

    NASA Astrophysics Data System (ADS)

    Tang, Hong; Milster, Tom D.

    1998-09-01

    Bar code scanners are ubiquitous in supermarkets. As a bar code is passed over a scanner, a laser beam scans across the bar code. The scattered light is modulated by the reflectivity of the bars and spaces of the bar code. The bar code scanning process can be described as a 1D convolution of the scanning laser profile and the bar code reflectivity function. The modulation transfer function (MTF) of bar code scanning is the Fourier transform of the marginal profile of the laser beam. The properties of the MTF of bar code scanning are similar to those of an incoherent imaging system. Measurements of the MTF of bar code scanning at one focus position are presented.  The experimental results are then discussed.
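    The convolution model in this abstract is easy to reproduce numerically. The Gaussian spot size and bar pitch below are illustrative values, not the paper's measurements:

```python
import numpy as np

# Scanned signal = 1-D convolution of the laser spot profile with the
# bar code reflectivity; the scanning MTF is the Fourier magnitude of
# the beam's marginal profile.
x = np.linspace(-5, 5, 1001)                 # scan position (mm)
dx = x[1] - x[0]
beam = np.exp(-2 * (x / 0.5) ** 2)           # Gaussian spot, 1/e^2 radius 0.5 mm
beam /= beam.sum() * dx                      # unit-area profile

bars = (np.sin(2 * np.pi * x) > 0).astype(float)    # 1 mm-pitch bars/spaces
signal = np.convolve(bars, beam, mode="same") * dx  # scanned signal

mtf = np.abs(np.fft.rfft(beam))
mtf /= mtf[0]                                # normalize so MTF(0) = 1
freqs = np.fft.rfftfreq(len(x), d=dx)        # spatial frequency (cycles/mm)
mtf_1 = float(np.interp(1.0, freqs, mtf))    # contrast at the bar fundamental

center = signal[250:750]                     # interior, away from edges
modulation = float((center.max() - center.min()) /
                   (center.max() + center.min()))
print("MTF at 1 cycle/mm:", round(mtf_1, 2))
print("square-wave modulation:", round(modulation, 2))
```

The measured square-wave modulation comes out close to (4/π) times the MTF at the fundamental frequency, since the Gaussian MTF suppresses the higher harmonics of the bar pattern almost completely.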

  10. 76 FR 12847 - Change of Address; Requests for Exemption From the Bar Code Label Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-09

    ... From the Bar Code Label Requirements AGENCY: Food and Drug Administration, HHS. ACTION: Final rule...)(2) to read as follows: Sec. 201.25 Bar code label requirements. * * * * * (d) * * * (2) Requests for... (requests involving a drug product) or to the Office of Compliance and Biologics Quality (HFM-600),...

  11. Enhanced motion coding in MC-EZBC

    NASA Astrophysics Data System (ADS)

    Chen, Junhua; Zhang, Wenjun; Wang, Yingkun

    2005-07-01

    Since hierarchical variable size block matching and bidirectional motion compensation are used in the motion-compensated embedded zero block coding (MC-EZBC), the motion information consists of motion vector quadtree map and motion vectors. In the conventional motion coding scheme, the quadtree structure is coded directly, the motion vector modes are coded with Huffman codes, and the motion vector differences are coded by an m-ary arithmetic coder with 0-order models. In this paper we propose a new motion coding scheme which uses an extension of the CABAC algorithm and new context modeling for quadtree structure coding and mode coding. In addition, we use a new scalable motion coding method which scales the motion vector quadtrees according to the rate-distortion slope of the tree nodes. Experimental results show that the new coding scheme increases the efficiency of the motion coding by more than 25%. The performance of the system is improved accordingly, especially in low bit rates. Moreover, with the scalable motion coding, the subjective and objective coding performance is further enhanced in low bit rate scenarios.

  12. BWR Core Heat Transfer Code System.

    Energy Science and Technology Software Center (ESTSC)

    1999-04-27

    Version 00 MOXY is used for the thermal analysis of a planar section of a boiling water reactor (BWR) fuel element during a loss-of-coolant accident (LOCA). The code employs models that describe heat transfer by conduction, convection, and thermal radiation, and heat generation by metal-water reaction and fission product decay. Models are included for considering fuel-rod swelling and rupture, energy transport across the fuel-to-cladding gap, and the thermal response of the canister. MOXY requires that time-dependent data during the blowdown process for the power normalized to the steady-state power, for the heat-transfer coefficient, and for the fluid temperature be provided as input. Internal models provide these parameters during the heatup and emergency cooling phases.

  13. MORSE Monte Carlo radiation transport code system

    SciTech Connect

    Emmett, M.B.

    1983-02-01

    This report is an addendum to the MORSE report, ORNL-4972, originally published in 1975. This addendum contains descriptions of several modifications to the MORSE Monte Carlo Code, replacement pages containing corrections, Part II of the report which was previously unpublished, and a new Table of Contents. The modifications include a Klein Nishina estimator for gamma rays. Use of such an estimator required changing the cross section routines to process pair production and Compton scattering cross sections directly from ENDF tapes and writing a new version of subroutine RELCOL. Another modification is the use of free form input for the SAMBO analysis data. This required changing subroutine SCORIN and adding new subroutine RFRE. References are updated, and errors in the original report have been corrected. (WHK)

  14. Experience with advanced nodal codes at YAEC

    SciTech Connect

    Cacciapouti, R.J.

    1990-01-01

    Yankee Atomic Electric Company (YAEC) has been performing reload licensing analysis since 1969. The basic pressurized water reactor (PWR) methodology involves the use of LEOPARD for cross-section generation, PDQ for radial power distributions and integral control rod worth, and SIMULATE for axial power distributions and differential control rod worth. In 1980, YAEC began performing reload licensing analysis for the Vermont Yankee boiling water reactor (BWR). The basic BWR methodology involves the use of CASMO for cross-section generation and SIMULATE for three-dimensional power distributions. In 1986, YAEC began investigating the use of CASMO-3 for cross-section generation and the advanced nodal code SIMULATE-3 for power distribution analysis. Based on the evaluation, the CASMO-3/SIMULATE-3 methodology satisfied all requirements. After careful consideration, the cost of implementing the new methodology is expected to be offset by reduced computing costs, improved engineering productivity, and fuel-cycle performance gains.

  15. A large scale code resolution service network in the Internet of Things.

    PubMed

    Yu, Haining; Zhang, Hongli; Fang, Binxing; Yu, Xiangzhan

    2012-01-01

    In the Internet of Things a code resolution service provides a discovery mechanism for a requester to obtain the information resources associated with a particular product code immediately. In large scale application scenarios a code resolution service faces some serious issues involving heterogeneity, big data and data ownership. A code resolution service network is required to address these issues. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services and a SkipNet-based code resolution service named SkipNet-OCRS, which not only inherits DHT's advantages, but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. Analysis shows that integrating SkipNet-OCRS into our resolution service network meets the proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS. PMID:23202207

  16. A Large Scale Code Resolution Service Network in the Internet of Things

    PubMed Central

    Yu, Haining; Zhang, Hongli; Fang, Binxing; Yu, Xiangzhan

    2012-01-01

    In the Internet of Things a code resolution service provides a discovery mechanism for a requester to obtain the information resources associated with a particular product code immediately. In large scale application scenarios a code resolution service faces some serious issues involving heterogeneity, big data and data ownership. A code resolution service network is required to address these issues. Firstly, a list of requirements for the network architecture and code resolution services is proposed. Secondly, in order to eliminate code resolution conflicts and code resolution overloads, a code structure is presented to create a uniform namespace for code resolution records. Thirdly, we propose a loosely coupled distributed network consisting of heterogeneous, independent, collaborating code resolution services and a SkipNet-based code resolution service named SkipNet-OCRS, which not only inherits DHT's advantages, but also supports administrative control and autonomy. For the external behaviors of SkipNet-OCRS, a novel external behavior mode named QRRA mode is proposed to enhance security and reduce requester complexity. For the internal behaviors of SkipNet-OCRS, an improved query algorithm is proposed to increase query efficiency. Analysis shows that integrating SkipNet-OCRS into our resolution service network meets the proposed requirements. Finally, simulation experiments verify the excellent performance of SkipNet-OCRS. PMID:23202207

  17. Overview of HZETRN and BRNTRN Space Radiation Shielding Codes

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Cucinotta, F. A.; Shinn, J. L.; Simonsen, L. C.; Badavi, F. F.

    1997-01-01

    The NASA Radiation Health Program has supported basic research over the last decade in radiation physics to develop ionizing radiation transport codes and corresponding data bases for the protection of astronauts from galactic and solar cosmic rays on future deep space missions. The codes describe the interactions of the incident radiations with shield materials, in which the radiation field is modified by atomic and nuclear reactions: high-energy heavy ions are fragmented into less massive reaction products, and secondary radiations are produced by direct knockout of shield constituents or as de-excitation products of the reactions. This defines the radiation fields to which specific devices are subjected onboard a spacecraft. Similar reactions occur in the device itself, which is the initiating event for the device response. An overview of the computational procedures and data base with some applications to photonic and data processing devices will be given.

  18. Visual analysis of code security

    SciTech Connect

    Goodall, John R; Radwan, Hassan; Halseth, Lenny

    2010-01-01

    To help increase the confidence that software is secure, researchers and vendors have developed different kinds of automated software security analysis tools. These tools analyze software for weaknesses and vulnerabilities, but the individual tools catch different vulnerabilities and produce voluminous data with many false positives. This paper describes a system that brings together the results of disparate software analysis tools into a visual environment to support the triage and exploration of code vulnerabilities. Our system allows software developers to explore vulnerability results to uncover hidden trends, triage the most important code weaknesses, and show who is responsible for introducing software vulnerabilities. By correlating and normalizing multiple software analysis tools' data, the overall vulnerability detection coverage of software is increased. A visual overview and powerful interaction allows the user to focus attention on the most pressing vulnerabilities within huge volumes of data, and streamlines the secure software development workflow through integration with development tools.
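    The correlation-and-normalization step described above can be sketched as a simple merge keyed on location and weakness type. The tool names, file paths, and record fields below are invented for illustration, not the paper's actual schema:

```python
# Hypothetical normalization of findings from two static-analysis tools
# into one record set for triage; all identifiers here are invented.
findings_tool_a = [
    {"file": "auth.c", "line": 42, "cwe": "CWE-89", "severity": 3},
    {"file": "net.c", "line": 10, "cwe": "CWE-120", "severity": 4},
]
findings_tool_b = [
    {"file": "auth.c", "line": 42, "cwe": "CWE-89", "severity": 5},
]

merged = {}
for tool, findings in [("A", findings_tool_a), ("B", findings_tool_b)]:
    for f in findings:
        key = (f["file"], f["line"], f["cwe"])       # normalization key
        rec = merged.setdefault(key, {"tools": [], "severity": 0})
        rec["tools"].append(tool)
        rec["severity"] = max(rec["severity"], f["severity"])

# Findings flagged by several tools are less likely to be false positives,
# so sort them to the top of the triage queue.
triage = sorted(merged.items(),
                key=lambda kv: (len(kv[1]["tools"]), kv[1]["severity"]),
                reverse=True)
print(triage[0][0])  # the finding confirmed by both tools ranks first
```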

  19. GeoPhysical Analysis Code

    Energy Science and Technology Software Center (ESTSC)

    2011-05-21

    GPAC is a code that integrates open source libraries for element formulations, linear algebra, and I/O with two main LLNL-written components: (i) a set of standard finite element physics solvers for resolving Darcy fluid flow, explicit mechanics, implicit mechanics, and fluid-mediated fracturing, including resolution of contact both implicitly and explicitly, and (ii) an MPI-based parallelization implementation for use on generic HPC distributed memory architectures. The resultant code can be used alone for linearly elastic problems and problems involving hydraulic fracturing, where the mesh topology is dynamically changed. The key application domain is for low-rate stimulation and fracture control in subsurface reservoirs (e.g., enhanced geothermal sites and unconventional shale gas stimulation). GPAC also has interfaces to call external libraries for, e.g., material models and equations of state; however, LLNL-developed EOS and material models will not be part of the current release.

  20. Dopamine reward prediction error coding

    PubMed Central

    Schultz, Wolfram

    2016-01-01

    Reward prediction errors consist of the differences between received and predicted rewards. They are crucial for basic forms of learning about rewards and make us strive for more rewards—an evolutionary beneficial trait. Most dopamine neurons in the midbrain of humans, monkeys, and rodents signal a reward prediction error; they are activated by more reward than predicted (positive prediction error), remain at baseline activity for fully predicted rewards, and show depressed activity with less reward than predicted (negative prediction error). The dopamine signal increases nonlinearly with reward value and codes formal economic utility. Drugs of addiction generate, hijack, and amplify the dopamine reward signal and induce exaggerated, uncontrolled dopamine effects on neuronal plasticity. The striatum, amygdala, and frontal cortex also show reward prediction error coding, but only in subpopulations of neurons. Thus, the important concept of reward prediction errors is implemented in neuronal hardware. PMID:27069377
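    The core definition in this abstract, received reward minus predicted reward, drives simple trial-by-trial learning. Here is a minimal Rescorla-Wagner-style sketch; the learning rate and reward values are invented for illustration:

```python
# Reward prediction error: delta = received reward - predicted reward.
alpha = 0.2          # learning rate (illustrative)
value = 0.0          # current reward prediction

for reward in [1.0] * 20:            # the same reward delivered repeatedly
    delta = reward - value           # prediction error
    value += alpha * delta           # prediction update

print(round(value, 2))               # prediction converges toward 1.0 -> 0.99
print(round(1.0 - value, 2))         # fully predicted reward: error ~ 0 -> 0.01
print(round(0.0 - value, 2))         # omitted reward: negative error -> -0.99
```

The three printed values mirror the abstract's description of dopamine responses: near-baseline activity for a fully predicted reward, and a depressed (negative-error) response when an expected reward is omitted.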