Sample records for investigators independently coded

  1. Independent peer review of nuclear safety computer codes

    SciTech Connect

    Boyack, B.E.; Jenks, R.P.

    1993-01-01

    A structured process of independent computer code peer review has been developed to assist the US Nuclear Regulatory Commission (NRC) and the US Department of Energy in their nuclear safety missions. This paper focuses on the process that evolved during recent reviews of NRC codes.

  3. Independent accident investigation: a modern safety tool.

    PubMed

    Stoop, John A

    2004-07-26

    Historically, safety has been subjected to a fragmented approach. In the past, every department has had its own responsibility towards safety, focusing either on working conditions, internal safety, external safety, rescue and emergency, public order or security. They each issued policy documents, which in their time were leading statements for elaboration and regulation. They also addressed safety issues with tools of various nature, often specifically developed within their domain. Due to a series of major accidents and disasters, the focus of attention is shifting from complying with quantitative risk standards towards intervention in primary operational processes, coping with systemic deficiencies and a more integrated assessment of safety in its societal context. In The Netherlands recognition of the importance of independent investigations has led to an expansion of this philosophy from the transport sector to other sectors. The philosophy now covers transport, industry, defense, natural disaster, environment and health and other major occurrences such as explosions, fires, and collapse of buildings or structures. In 2003 a multi-sector covering law will establish an independent safety board in The Netherlands. At a European level, mandatory investigation agencies are recognized as indispensable safety instruments for aviation, railways and the maritime sector, for which EU Directives are in place or being progressed [Transport accident and incident investigation in the European Union, European Transport Safety Council, ISBN 90-76024-10-3, Brussel, 2001]. Due to a series of major events, attention has been drawn to the consequences of disasters, highlighting the involvement of rescue and emergency services. They also have become subjected to investigative efforts, which in return, puts demands on investigation methodology. This paper comments on an evolutionary development in safety thinking and of safety boards, highlighting some consequences for strategic perspectives in a further development of independent accident investigation. PMID:15231346

  4. An Investigation of Different String Coding Methods.

    ERIC Educational Resources Information Center

    Goyal, Pankaj

    1984-01-01

    Investigates techniques for automatic coding of English language strings which involve titles drawn from bibliographic files, but do not require prior knowledge of source. Coding methods (basic, maximum entropy principle), results of test using 6,260 titles from British National Bibliography, and variations in code element ordering are…

  5. Independent Coding of Wind Direction in Cockroach Giant Interneurons

    E-print Network

    Libersat, Frederic

    The giant interneurons, located in the most posterior ganglion of the nerve cord, encode the direction of wind stimuli detected by the cockroach cercal system; this study examines how wind direction is coded independently across the giant interneurons.

  6. Investigation of Near Shannon Limit Coding Schemes

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Kim, J.; Mo, Fan

    1999-01-01

    Turbo codes can deliver performance that is very close to the Shannon limit. This report investigates algorithms for convolutional turbo codes and block turbo codes. Both coding schemes can achieve performance near the Shannon limit. The performance of the schemes is obtained using computer simulations. There are three sections in this report. The first section is the introduction. The fundamental knowledge about coding, block coding and convolutional coding is discussed. In the second section, the basic concepts of convolutional turbo codes are introduced and the performance of turbo codes, especially high rate turbo codes, is provided from the simulation results. After introducing all the parameters that help turbo codes achieve such a good performance, it is concluded that the output weight distribution should be the main consideration in designing turbo codes. Based on the output weight distribution, the performance bounds for turbo codes are given. Then, the relationships between the output weight distribution and factors such as the generator polynomial, the interleaver and the puncturing pattern are examined. A criterion for the best selection of system components is provided. The puncturing pattern algorithm is discussed in detail. Different puncturing patterns are compared for each high rate. For most of the high rate codes, the puncturing pattern does not show any significant effect on the code performance if a pseudo-random interleaver is used in the system. For some special rate codes with poor performance, an alternative puncturing algorithm is designed which restores their performance close to the Shannon limit. Finally, in section three, for iterative decoding of block codes, the method of building a trellis for block codes, the structure of the iterative decoding system and the calculation of extrinsic values are discussed.
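
    Performance "near the Shannon limit" can be made concrete with the standard capacity bound for a rate-R code on an additive white Gaussian noise channel; the sketch below works out that arithmetic (the unconstrained-input bound, not the report's simulation setup). For rate 1/2 it gives roughly 0 dB, the usual reference point for turbo-code simulations.

    ```python
    import math

    def shannon_limit_ebn0_db(rate: float) -> float:
        """Minimum Eb/N0 (dB) for reliable transmission at a given code rate
        over an AWGN channel, from C = 0.5*log2(1 + 2*R*Eb/N0) with C = R."""
        ebn0 = (2 ** (2 * rate) - 1) / (2 * rate)
        return 10 * math.log10(ebn0)

    # Rate-1/2 turbo codes are commonly benchmarked against this bound (~0 dB).
    for r in (1 / 3, 1 / 2, 3 / 4, 7 / 8):
        print(f"rate {r:.3f}: Eb/N0 limit = {shannon_limit_ebn0_db(r):+.2f} dB")
    ```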

  7. Benchmark testing and independent verification of the VS2DT computer code

    SciTech Connect

    McCord, J.T. [Sandia National Labs., Albuquerque, NM (United States). Environmental Risk Assessment and Risk Management Dept.; Goodrich, M.T. [IT Corp., Albuquerque, NM (United States)

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation.

  8. CODE OF PRACTICE ON INVESTIGATIONS

    E-print Network

    Mottram, Nigel

    Table-of-contents fragments only: sections on medical records and other records relating to care and education, investigations of criminal or other wrongful conduct, biological and medical investigations, and adverse events in clinical and other settings.

  9. Species independence of mutual information in coding and noncoding DNA

    Microsoft Academic Search

    Ivo Grosse; Hanspeter Herzel; Sergey V. Buldyrev; H. Eugene Stanley

    2000-01-01

    We explore if there exist universal statistical patterns that are different in coding and noncoding DNA and can be found in all living organisms, regardless of their phylogenetic origin. We find that (i) the mutual information function I has a significantly different functional form in coding and noncoding DNA. We further find that (ii) the probability distributions of the average…
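
    For illustration, the mutual information function I(k) discussed here measures the statistical dependence between nucleotides separated by k positions. A minimal plug-in estimator on a toy sequence is sketched below (a generic estimate, not the paper's averaging procedure).

    ```python
    from collections import Counter
    from math import log2

    def mutual_information(seq: str, k: int) -> float:
        """Estimate I(k): mutual information between symbols k apart in seq."""
        pairs = [(seq[i], seq[i + k]) for i in range(len(seq) - k)]
        n = len(pairs)
        pxy = Counter(pairs)
        px = Counter(a for a, _ in pairs)
        py = Counter(b for _, b in pairs)
        return sum((c / n) * log2((c / n) / ((px[a] / n) * (py[b] / n)))
                   for (a, b), c in pxy.items())

    # Periodic toy sequence: symbols 4 apart are identical, so I(4) ~ 2 bits.
    print(mutual_information("ATGCATGCATGCATGC" * 10, k=4))
    ```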

  10. Implementation of context independent code on a new array processor: The Super-65

    NASA Technical Reports Server (NTRS)

    Colbert, R. O.; Bowhill, S. A.

    1981-01-01

    The feasibility of rewriting standard uniprocessor programs into code which contains no context-dependent branches is explored. Context independent code (CIC) would contain no branches that might require different processing elements to branch different ways. In order to investigate the possibilities and restrictions of CIC, several programs were recoded into CIC and a four-element array processor was built. This processor (the Super-65) consisted of three 6502 microprocessors and the Apple II microcomputer. The results obtained were somewhat dependent upon the specific architecture of the Super-65 but within bounds, the throughput of the array processor was found to increase linearly with the number of processing elements (PEs). The slope of throughput versus PEs is highly dependent on the program and varied from 0.33 to 1.00 for the sample programs.

  11. Coding Gene Single Nucleotide Polymorphism and QTL Detection for Physiological Reproductive Traits in Brook Charr, Salvelinus fontinalis

    E-print Network

    Bernatchez, Louis

    QTL detection for physiological reproductive traits in brook charr (Salvelinus fontinalis) by Christopher Sauvage and co-workers, using an F2 interstrain hybrid progeny (n = 171), 256 coding-gene single nucleotide polymorphisms (SNPs), and linkage maps.

  12. Two independent transcription initiation codes overlap on vertebrate core promoters

    NASA Astrophysics Data System (ADS)

    Haberle, Vanja; Li, Nan; Hadzhiev, Yavor; Plessy, Charles; Previti, Christopher; Nepal, Chirag; Gehrig, Jochen; Dong, Xianjun; Akalin, Altuna; Suzuki, Ana Maria; van Ijcken, Wilfred F. J.; Armant, Olivier; Ferg, Marco; Strähle, Uwe; Carninci, Piero; Müller, Ferenc; Lenhard, Boris

    2014-03-01

    A core promoter is a stretch of DNA surrounding the transcription start site (TSS) that integrates regulatory inputs and recruits general transcription factors to initiate transcription. The nature and causative relationship of the DNA sequence and chromatin signals that govern the selection of most TSSs by RNA polymerase II remain unresolved. Maternal to zygotic transition represents the most marked change of the transcriptome repertoire in the vertebrate life cycle. Early embryonic development in zebrafish is characterized by a series of transcriptionally silent cell cycles regulated by inherited maternal gene products: zygotic genome activation commences at the tenth cell cycle, marking the mid-blastula transition. This transition provides a unique opportunity to study the rules of TSS selection and the hierarchy of events linking transcription initiation with key chromatin modifications. We analysed TSS usage during zebrafish early embryonic development at high resolution using cap analysis of gene expression, and determined the positions of H3K4me3-marked promoter-associated nucleosomes. Here we show that the transition from the maternal to zygotic transcriptome is characterized by a switch between two fundamentally different modes of defining transcription initiation, which drive the dynamic change of TSS usage and promoter shape. A maternal-specific TSS selection, which requires an A/T-rich (W-box) motif, is replaced with a zygotic TSS selection grammar characterized by broader patterns of dinucleotide enrichments, precisely aligned with the first downstream (+1) nucleosome. The developmental dynamics of the H3K4me3-marked nucleosomes reveal their DNA-sequence-associated positioning at promoters before zygotic transcription and subsequent transcription-independent adjustment to the final position downstream of the zygotic TSS. The two TSS-defining grammars coexist, often physically overlapping, in core promoters of constitutively expressed genes to enable their expression in the two regulatory environments. The dissection of overlapping core promoter determinants represents a framework for future studies of promoter structure and function across different regulatory contexts.

  13. RBMK coupled neutronics/thermal-hydraulics analyses by two independent code systems

    SciTech Connect

    Parisi, C.; D'Auria, F. [Univ. of Pisa, Dept. of Mechanical, Nuclear and Production Engineering, via Diotisalvi, 2, 56100 Pisa (Italy); Malofeev, V. [Kurchatov Inst., Kurchatov Square 1, Moscow 123182 (Russian Federation); Ivanov, B.; Ivanov, K. [Pennsylvania State Univ., RDFMG, 230 Reber Building, Univ. Park, PA 16802 (United States)

    2006-07-01

    This paper presents the coupled neutronics/thermal-hydraulics activities carried out in the framework of the part B of the TACIS project R2.03/97, 'Software development for accident analysis of RBMK reactors in Russia'. Two independent code systems were assembled, one from the Russian side and the other from the Western side, for studying RBMK core transients. The Russian code system relies on the use of code UNK for neutron data libraries generation and the three-dimensional neutron kinetics thermal-hydraulics coupled codes BARS-KORSAR for plant transient analyses. The Western code system is instead based on the lattice physics code HELIOS and on the RELAP5-3D C code. Several activities were performed for testing code system's capabilities: the neutron data libraries were calculated and verified by precise Monte Carlo calculations, the coupled codes' steady state results were compared with plant detectors' data, and calculations of several transients were compared. Finally, both code systems proved to have all the capabilities for addressing reliable safety analyses of RBMK reactors. (authors)

  14. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1993-01-01

    The first year's effort on NASA Grant NAG5-2006 was an investigation to characterize typical errors resulting from the EOS downlink. The analysis methods developed for this effort were used on test data from a March 1992 White Sands Terminal Test. The effectiveness of a concatenated coding scheme of a Reed Solomon outer code and a convolutional inner code versus a Reed Solomon only code scheme has been investigated as well as the effectiveness of a Periodic Convolutional Interleaver in dispersing errors of certain types. The work effort consisted of development of software that allows simulation studies with the appropriate coding schemes plus either simulated data with errors or actual data with errors. The software program is entitled Communication Link Error Analysis (CLEAN) and models downlink errors, forward error correcting schemes, and interleavers.
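
    The interleaver's role in such a concatenated link is to spread channel bursts across many Reed-Solomon codewords. A minimal block-interleaver sketch is shown below (illustrative only; CLEAN models a periodic convolutional interleaver and the full coding chain, which is not reproduced here).

    ```python
    def interleave(symbols, rows, cols):
        """Write row-by-row, read column-by-column to disperse burst errors."""
        assert len(symbols) == rows * cols
        table = [symbols[r * cols:(r + 1) * cols] for r in range(rows)]
        return [table[r][c] for c in range(cols) for r in range(rows)]

    def deinterleave(symbols, rows, cols):
        """Inverse operation: a channel burst ends up spread over many rows."""
        return interleave(symbols, cols, rows)

    data = list(range(12))
    assert deinterleave(interleave(data, 3, 4), 3, 4) == data
    ```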

  15. PWS: an efficient code system for solving space-independent nuclear reactor dynamics

    Microsoft Academic Search

    A. E Aboanber; Y. M Hamada

    2002-01-01

    The reactor kinetics equations are reduced to a differential equation in matrix form convenient for explicit power series solution involving no approximations beyond the usual space-independent assumption. The coefficients of the series have been obtained from a straightforward recurrence relation. Numerical evaluation is performed by PWS (power series solution) code, written in Visual FORTRAN for a personal computer. The results…
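
    The power-series idea can be illustrated on the space-independent point-kinetics equations written as dy/dt = A·y, whose series coefficients satisfy the recurrence c_{k+1} = A·c_k/(k+1). Below is a minimal sketch with one delayed-neutron group and illustrative constants (not the PWS code or its actual recurrence).

    ```python
    import numpy as np

    # One delayed-neutron group, constant reactivity (illustrative values).
    rho, beta, Lam, lam = 0.001, 0.0065, 1e-4, 0.08
    A = np.array([[(rho - beta) / Lam, lam],
                  [beta / Lam,        -lam]])

    def step(y, dt, terms=20):
        """Advance dy/dt = A y by dt with the truncated power series
        y(dt) = sum_k c_k dt^k, where c_{k+1} = A c_k / (k+1)."""
        c, out = y.copy(), y.copy()
        for k in range(terms):
            c = A @ c * dt / (k + 1)
            out = out + c
        return out

    y = np.array([1.0, beta / (Lam * lam)])   # neutron density, equilibrium precursors
    for _ in range(100):
        y = step(y, dt=1e-3)
    print("neutron density after 0.1 s:", y[0])
    ```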

  16. Signal-independent timescale analysis (SITA) and its application for neural coding during reaching and walking

    PubMed Central

    Zacksenhouse, Miriam; Lebedev, Mikhail A.; Nicolelis, Miguel A. L.

    2014-01-01

    What are the relevant timescales of neural encoding in the brain? This question is commonly investigated with respect to well-defined stimuli or actions. However, neurons often encode multiple signals, including hidden or internal, which are not experimentally controlled, and thus excluded from such analysis. Here we consider all rate modulations as the signal, and define the rate-modulations signal-to-noise ratio (RM-SNR) as the ratio between the variance of the rate and the variance of the neuronal noise. As the bin-width increases, RM-SNR increases while the update rate decreases. This tradeoff is captured by the ratio of RM-SNR to bin-width, and its variations with the bin-width reveal the timescales of neural activity. Theoretical analysis and simulations elucidate how the interactions between the recovery properties of the unit and the spectral content of the encoded signals shape this ratio and determine the timescales of neural coding. The resulting signal-independent timescale analysis (SITA) is applied to investigate timescales of neural activity recorded from the motor cortex of monkeys during: (i) reaching experiments with Brain-Machine Interface (BMI), and (ii) locomotion experiments at different speeds. Interestingly, the timescales during BMI experiments did not change significantly with the control mode or training. During locomotion, the analysis identified units whose timescale varied consistently with the experimentally controlled speed of walking, though the specific timescale reflected also the recovery properties of the unit. Thus, the proposed method, SITA, characterizes the timescales of neural encoding and how they are affected by the motor task, while accounting for all rate modulations. PMID:25191263

  17. Proof-of-principle experiment of reference-frame-independent quantum key distribution with phase coding

    E-print Network

    Wen-Ye Liang; Shuang Wang; Hong-Wei Li; Zhen-Qiang Yin; Wei Chen; Yao Yao; Jing-Zheng Huang; Guang-Can Guo; Zheng-Fu Han

    2014-05-09

    We have demonstrated a proof-of-principle experiment of reference-frame-independent phase coding quantum key distribution (RFI-QKD) over an 80-km optical fiber. After considering the finite-key bound, we still achieve a distance of 50 km. In this scenario, the phases of the basis states are related by a slowly time-varying transformation. Furthermore, we developed and realized a new decoy state method for RFI-QKD systems with weak coherent sources to counteract the photon-number-splitting attack. With the help of a reference-frame-independent protocol and a Michelson interferometer with Faraday rotator mirrors, our system is rendered immune to the slow phase changes of the interferometer and the polarization disturbances of the channel, making the procedure very robust.

  18. Proof-of-principle experiment of reference-frame-independent quantum key distribution with phase coding

    PubMed Central

    Liang, Wen-Ye; Wang, Shuang; Li, Hong-Wei; Yin, Zhen-Qiang; Chen, Wei; Yao, Yao; Huang, Jing-Zheng; Guo, Guang-Can; Han, Zheng-Fu

    2014-01-01

    We have demonstrated a proof-of-principle experiment of reference-frame-independent phase coding quantum key distribution (RFI-QKD) over an 80-km optical fiber. After considering the finite-key bound, we still achieve a distance of 50 km. In this scenario, the phases of the basis states are related by a slowly time-varying transformation. Furthermore, we developed and realized a new decoy state method for RFI-QKD systems with weak coherent sources to counteract the photon-number-splitting attack. With the help of a reference-frame-independent protocol and a Michelson interferometer with Faraday rotator mirrors, our system is rendered immune to the slow phase changes of the interferometer and the polarization disturbances of the channel, making the procedure very robust. PMID:24402550

  19. The investigation of bandwidth efficient coding and modulation techniques

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The New Mexico State University Center for Space Telemetering and Telecommunications systems has been, and is currently, engaged in the investigation of trellis-coded modulation (TCM) communication systems. In particular, TCM utilizing M-ary phase shift keying is being studied. The study of carrier synchronization in a TCM environment, or in MPSK systems in general, has been one of the two main thrusts of this grant. This study has involved both theoretical modelling and software simulation of the carrier synchronization problem.

  20. "Independent Duckweed Investigations to Review the Scientific Method"

    NSDL National Science Digital Library

    Elleen Hutcheson (Rogers High School)

    2005-04-01

    This inquiry activity allows students to apply the scientific method to an investigation involving duckweed. This inquiry activity was developed by a K-12 science teacher in the American Physiological Society's 2005 Frontiers in Physiology Program. The NSES Standards addressed by this activity are current as of the year of development. For more information on the Frontiers in Physiology Program, please visit www.frontiersinphys.org.

  1. A 2.9 ps equivalent resolution interpolating time counter based on multiple independent coding lines

    NASA Astrophysics Data System (ADS)

    Szplet, R.; Jachna, Z.; Kwiatkowski, P.; Rozyc, K.

    2013-03-01

    We present the design, operation and test results of a time counter that has an equivalent resolution of 2.9 ps, a measurement uncertainty at the level of 6 ps, and a measurement range of 10 s. The time counter has been implemented in a general-purpose reprogrammable device Spartan-6 (Xilinx). To obtain both high precision and wide measurement range the counting of periods of a reference clock is combined with a two-stage interpolation within a single period of the clock signal. The interpolation involves a four-phase clock in the first interpolation stage (FIS) and an equivalent coding line (ECL) in the second interpolation stage (SIS). The ECL is created as a compound of independent discrete time coding lines (TCL). The number of TCLs used to create the virtual ECL has an effect on its resolution. We tested ECLs made from up to 16 TCLs, but the idea may be extended to a larger number of lines. In the presented time counter the coarse resolution of the counting method equal to 2 ns (period of the 500 MHz reference clock) is firstly improved fourfold in the FIS and next even more than 400 times in the SIS. The proposed solution allows us to overcome the technological limitation in achievable resolution and improve the precision of conversion of integrated interpolators based on tapped delay lines.

  2. RELAP5/MOD3 code manual: Summaries and reviews of independent code assessment reports. Volume 7, Revision 1

    SciTech Connect

    Moore, R.L.; Sloan, S.M.; Schultz, R.R.; Wilson, G.E. [Lockheed Idaho Technologies Co., Idaho Falls, ID (United States)

    1996-10-01

    Summaries of RELAP5/MOD3 code assessments, a listing of the assessment matrix, and a chronology of the various versions of the code are given. Results from these code assessments have been used to formulate a compilation of some of the strengths and weaknesses of the code. These results are documented in the report. Volume 7 was designed to be updated periodically and to include the results of the latest code assessments as they become available. Consequently, users of Volume 7 should ensure that they have the latest revision available.

  3. Enabling Handicapped Nonreaders to Independently Obtain Information: Initial Development of an Inexpensive Bar Code Reader System.

    ERIC Educational Resources Information Center

    VanBiervliet, Alan

    A project to develop and evaluate a bar code reader system as a self-directed information and instructional aid for handicapped nonreaders is described. The bar code technology involves passing a light sensitive pen or laser over a printed code with bars which correspond to coded numbers. A system would consist of a portable device which could…

  4. Investigation of Navier-Stokes Code Verification and Design Optimization

    NASA Technical Reports Server (NTRS)

    Vaidyanathan, Rajkumar

    2004-01-01

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between these concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite rate chemistry, and the k-ε turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi-objective optimization study is carried out using a geometric mean approach. Following this, sensitivity analyses with the aid of a variance-based non-parametric approach and partial correlation coefficients are conducted using data available from surrogate models of the objectives and the multi-objective optima to identify the contribution of the design variables to the objective variability and to analyze the variability of the design variables and the objectives. In summary, the present dissertation offers insight into an improved coarse to fine grid extrapolation technique for Navier-Stokes computations and also suggests tools for a designer to conduct a design optimization study and related sensitivity analyses for a given design problem.
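
    The surrogate-model step can be illustrated in one variable: fit a quadratic response surface to a few expensive evaluations, then optimize the fit analytically. The sketch below is a generic RSM example with made-up data, not the dissertation's injector model.

    ```python
    import numpy as np

    # Pretend these came from expensive CFD runs: design variable x vs. objective f(x).
    x = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
    f = np.array([1.80, 1.35, 1.22, 1.30, 1.75])

    # Fit a quadratic response surface f ~ c0 + c1*x + c2*x^2 by least squares.
    c2, c1, c0 = np.polyfit(x, f, deg=2)
    x_opt = -c1 / (2 * c2)            # stationary point of the surrogate
    print(f"surrogate minimum near x = {x_opt:.3f}, "
          f"predicted f = {np.polyval([c2, c1, c0], x_opt):.3f}")
    ```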

  5. Geoelectric variations related to earthquakes observed during a 3-year independent investigation

    NASA Astrophysics Data System (ADS)

    Tselentis, G.-Akis; Ifantis, Apostolos

    We present observations of the geoelectric field prior to some earthquakes. The data were collected during a three year (1992-1994) independent experimental investigation of VAN at the University of Patras Seismological Center. The recorded signals were: a) Gradual Variations of the Electric Field (GVEF), b) Periodic Variation of the Electric Field (PVEF), and c) Seismic Electric Signals (SES).

  6. An experimental investigation of a class of resistance-type, direction-independent wind turbines

    Microsoft Academic Search

    S. Sivasegaram

    1978-01-01

    The resistance-type, direction-independent wind turbine is suitable for the generation of power on a small scale in developing countries. So far, all work on this class of wind turbine seems to be restricted to the Savonius rotor. The present paper reports the findings of an experimental investigation of an entire class of wind turbines which includes the conventional Savonius rotor.

  7. Approaches to Learning at Work: Investigating Work Motivation, Perceived Workload, and Choice Independence

    ERIC Educational Resources Information Center

    Kyndt, Eva; Raes, Elisabeth; Dochy, Filip; Janssens, Els

    2013-01-01

    Learning and development are taking up a central role in the human resource policies of organizations because of their crucial contribution to the competitiveness of those organizations. The present study investigates the relationship of work motivation, perceived workload, and choice independence with employees' approaches to learning at…

  8. Investigation of Error Concealment Using Different Transform Codings and Multiple Description Codings

    NASA Astrophysics Data System (ADS)

    Farzamnia, Ali; Syed-Yusof, Sharifah K.; Fisal, Norsheila; Abu-Bakar, Syed A. R.

    2012-05-01

    There has been increasing use of Multiple Description Coding (MDC) for error concealment in non-ideal channels, and many MDC schemes have been proposed to date. This paper describes attempts to conceal errors and reconstruct lost descriptions by combining MDC with the lapped orthogonal transform (LOT). In this work, the LOT and other transform codings (DCT and wavelet) are used to decorrelate the image pixels in the transform domain; the LOT performs better at low bit rates than the DCT and wavelet transforms. The results show that the MSE of the proposed methods is significantly lower than that of the DCT- and wavelet-based schemes, and the PSNR values of the reconstructed images are high. The subjective quality of the reconstructed images is very good and clear. Furthermore, the standard deviations of the reconstructed images are very small, especially in low-capacity channels.
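
    A minimal illustration of the MDC idea and of concealment when one description is lost is sketched below, using a simple odd/even row split with row averaging assumed purely for illustration; the paper's LOT-based scheme is not reproduced.

    ```python
    import numpy as np

    def mdc_split(image):
        """Simplest two-description MDC: even rows form one description,
        odd rows the other, so either half alone yields a coarse image."""
        return image[0::2, :], image[1::2, :]

    def conceal_lost_odd_rows(even_rows, full_shape):
        """Error concealment when the odd-row description is lost: rebuild the
        missing rows by averaging neighbouring even rows (the last row wraps
        around, which is acceptable for a sketch)."""
        out = np.zeros(full_shape)
        out[0::2, :] = even_rows
        out[1::2, :] = (even_rows + np.roll(even_rows, -1, axis=0)) / 2.0
        return out

    img = np.arange(64, dtype=float).reshape(8, 8)
    even, odd = mdc_split(img)
    print(conceal_lost_odd_rows(even, img.shape)[1, :3])   # ~ original row 1
    ```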

  9. What's in the Mix? Combining Coding and Conversation Analysis to Investigate Chat-Based Problem Solving

    ERIC Educational Resources Information Center

    Zemel, Alan; Xhafa, Fatos; Cakir, Murat

    2007-01-01

    Coding interactional data for statistical analysis presents theoretical, methodological and practical challenges. Coding schemes rely on categories that are decided by their relevance to the analytical problem under investigation. We suggest that (1) endogenous and publicly displayed concerns of participants provide for the observable organisation…

  10. An Early Underwater Artificial Vision Model in Ocean Investigations via Independent Component Analysis

    PubMed Central

    Nian, Rui; Liu, Fang; He, Bo

    2013-01-01

    Underwater vision is one of the dominant senses and has shown great prospects in ocean investigations. In this paper, a hierarchical Independent Component Analysis (ICA) framework has been established to explore and understand the functional roles of the higher order statistical structures towards the visual stimulus in the underwater artificial vision system. The model is inspired by characteristics such as the modality, the redundancy reduction, the sparseness and the independence in the early human vision system, which seems to respectively capture the Gabor-like basis functions, the shape contours or the complicated textures in the multiple layer implementations. The simulation results have shown good performance in the effectiveness and the consistence of the approach proposed for the underwater images collected by autonomous underwater vehicles (AUVs). PMID:23863855
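
    As a pointer to the ICA building block underlying such a framework, the sketch below runs scikit-learn's FastICA on a toy linear mixture of sparse sources (library and parameters assumed for illustration; the paper's hierarchical, multi-layer model is not reproduced).

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(0)
    # Toy stand-in for whitened image-patch data: three sparse (Laplacian)
    # sources mixed linearly, which is the generative model ICA assumes.
    S = rng.laplace(size=(2000, 3))
    A = rng.standard_normal((3, 3))
    X = S @ A.T                                # observed mixtures

    ica = FastICA(n_components=3, max_iter=1000, random_state=0)
    S_est = ica.fit_transform(X)               # recovered independent components
    print(ica.mixing_.shape)                   # (3, 3): learned mixing (basis) matrix
    ```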

  11. Two independent retrotransposon insertions at the same site within the coding region of BTK.

    PubMed

    Conley, Mary Ellen; Partain, Julie D; Norland, Shannon M; Shurtleff, Sheila A; Kazazian, Haig H

    2005-03-01

    Insertion of endogenous retrotransposon sequences accounts for approximately 0.2% of disease causing mutations. These insertions are mediated by the reverse transcriptase and endonuclease activity of long interspersed nucleotide (LINE-1) elements. The factors that control the target site selection in insertional mutagenesis are not well understood. In our analysis of 199 unrelated families with proven mutations in BTK, the gene responsible for X-linked agammaglobulinemia, we identified two families with retrotransposon insertions at exactly the same nucleotide within the coding region of BTK. Both insertions, an SVA element and an AluY sequence, occurred 12 bp before the end of exon 9. Both had the typical hallmarks of a retrotransposon insertion including target site duplication and a long poly A tail. The insertion site is flanked by AluSx sequences 1 kb upstream and 1 kb downstream and an unusual 60 bp sequence consisting of only As and Ts is located in intron 9, 60 bp downstream of the insertion. The occurrence of two retrotransposon sequences at exactly the same site suggests that this site is vulnerable to insertional mutagenesis. A better understanding of the factors that make this site vulnerable will shed light on the mechanisms of LINE-1 mediated insertional mutagenesis. PMID:15712380

  12. Investigation of liquid crystal Fabry-Perot tunable filters: design, fabrication, and polarization independence.

    PubMed

    Isaacs, Sivan; Placido, Frank; Abdulhalim, Ibrahim

    2014-10-10

    Liquid crystal Fabry-Perot tunable filters are investigated in detail, with special attention to their manufacturability, design, tolerances, and polarization independence. The calculations were performed both numerically and analytically using the 4×4 propagation matrix method. A simplified analytic expression for the propagation matrix is derived for the case of nematic LC in the homogeneous geometry. At normal incidence, it is shown that one can use the 2×2 Abeles matrix method; however, at oblique incidence, the 4×4 matrix method is needed. The effects of dephasing originating from wedge or noncollimated light beams are investigated. Because the indium tin oxide layer, which serves as an electrode, is absorbing, its location within the mirror multilayered stack is very important. The optimum location is found to be within the stack and not on its top or bottom. Finally, we give more detailed experimental results of our polarization-independent configuration that uses polarization diversity with a Wollaston prism. PMID:25322437

  13. Investigation of the Use of Erasures in a Concatenated Coding Scheme

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Marriott, Philip J.

    1997-01-01

    A new method for declaring erasures in a concatenated coding scheme is investigated. This method is used with the rate 1/2 K = 7 convolutional code and the (255, 223) Reed Solomon code. Errors and erasures Reed Solomon decoding is used. The erasure method proposed uses a soft output Viterbi algorithm and information provided by decoded Reed Solomon codewords in a deinterleaving frame. The results show that a gain of 0.3 dB is possible using a minimum amount of decoding trials.
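
    The gain from erasures follows from the Reed-Solomon errors-and-erasures bound: an (n, k) code with minimum distance d = n - k + 1 decodes successfully when 2·errors + erasures < d. A minimal check for the (255, 223) code used here is sketched below (the bound only, not the SOVA-based erasure-declaration method).

    ```python
    def rs_decodable(errors: int, erasures: int, n: int = 255, k: int = 223) -> bool:
        """Errors-and-erasures decoding condition for an (n, k) Reed-Solomon code."""
        d_min = n - k + 1                      # 33 for (255, 223)
        return 2 * errors + erasures < d_min

    print(rs_decodable(errors=16, erasures=0))   # True: up to 16 errors alone
    print(rs_decodable(errors=10, erasures=12))  # True: erasures cost half as much
    print(rs_decodable(errors=17, erasures=0))   # False
    ```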

  14. A model to investigate the mechanisms underlying the emergence and development of independent sitting.

    PubMed

    O'Brien, Kathleen M; Zhang, Jing; Walley, Philip R; Rhoads, Jeffrey F; Haddad, Jeffrey M; Claxton, Laura J

    2014-11-28

    When infants first begin to sit independently, they are highly unstable and unable to maintain upright sitting posture for more than a few seconds. Over the course of 3 months, the sitting ability of infants drastically improves. To investigate the mechanisms controlling the development of sitting posture, a single-degree-of-freedom inverted pendulum model was developed. Passive muscle properties were modeled with a stiffness and damping term, while active neurological control was modeled with a time-delayed proportional-integral-derivative (PID) controller. The findings of the simulations suggest that infants primarily utilize passive muscle stiffness to remain upright when they first begin to sit. This passive control mechanism allows the infant to remain upright so that active feedback control mechanisms can develop. The emergence of active control mechanisms allows infants to integrate sensory information into their movements so that they can exhibit more adaptive sitting. PMID:25442426
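
    Below is a minimal sketch of the kind of model described: a single-degree-of-freedom inverted pendulum stabilized by passive stiffness and damping plus a time-delayed PID torque. All constants are illustrative, not those of the published model.

    ```python
    import numpy as np

    # Illustrative constants: trunk as a point-mass pendulum plus control gains.
    m, L, g, dt = 5.0, 0.25, 9.81, 0.001           # kg, m, m/s^2, s
    I = m * L ** 2                                 # moment of inertia
    k_passive, b_passive = 8.0, 1.0                # passive muscle stiffness / damping
    Kp, Ki, Kd, delay_s = 12.0, 0.5, 2.0, 0.08     # delayed PID gains, 80 ms delay

    delay_steps = int(delay_s / dt)
    theta, omega, integ = 0.05, 0.0, 0.0           # small initial lean (rad)
    history = [theta] * (delay_steps + 1)          # buffer of past lean angles

    for _ in range(5000):                          # simulate 5 s
        th_delayed = history[-(delay_steps + 1)]   # feedback sees an old measurement
        integ += th_delayed * dt
        torque = (-k_passive * theta - b_passive * omega        # passive muscle
                  - Kp * th_delayed - Ki * integ - Kd * omega)  # delayed active PID
        alpha = (m * g * L * np.sin(theta) + torque) / I        # gravity destabilizes
        omega += alpha * dt
        theta += omega * dt
        history.append(theta)

    print(f"lean angle after 5 s: {theta:.4f} rad")
    ```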

  15. ALS beamlines for independent investigators: A summary of the capabilities and characteristics of beamlines at the ALS

    SciTech Connect

    Not Available

    1992-08-01

    There are two modes of conducting research at the ALS: to work as a member of a participating research team (PRT), or to work as an independent investigator. PRTs are responsible for building beamlines, end stations, and, in some cases, insertion devices. Thus, PRT members have privileged access to the ALS. Independent investigators will use beamline facilities made available by PRTs. The purpose of this handbook is to describe these facilities.

  16. Investigation of blood mRNA biomarkers for suicidality in an independent sample

    PubMed Central

    Mullins, N; Hodgson, K; Tansey, K E; Perroud, N; Maier, W; Mors, O; Rietschel, M; Hauser, J; Henigsberg, N; Souery, D; Aitchison, K; Farmer, A; McGuffin, P; Breen, G; Uher, R; Lewis, C M

    2014-01-01

    Changes in the blood expression levels of SAT1, PTEN, MAP3K3 and MARCKS genes have been reported as biomarkers of high versus low suicidality state (Le-Niculescu et al.). Here, we investigate these expression biomarkers in the Genome-Based Therapeutic Drugs for Depression (GENDEP) study, of patients with major depressive disorder on a 12-week antidepressant treatment. Blood gene expression levels were available at baseline and week 8 for patients who experienced suicidal ideation during the study (n=20) versus those who did not (n=37). The analysis is well powered to detect the effect sizes reported in the original paper. Within either group, there was no significant change in the expression of these four genes over the course of the study, despite increasing suicidal ideation or initiation of antidepressant treatment. Comparison of the groups showed that the gene expression did not differ between patients with or without treatment-related suicidality. This independent study does not support the validity of the proposed biomarkers. PMID:25350297

  17. Flight Investigation of Prescribed Simultaneous Independent Surface Excitations for Real-Time Parameter Identification

    NASA Technical Reports Server (NTRS)

    Moes, Timothy R.; Smith, Mark S.; Morelli, Eugene A.

    2003-01-01

    Near real-time stability and control derivative extraction is required to support flight demonstration of Intelligent Flight Control System (IFCS) concepts being developed by NASA, academia, and industry. Traditionally, flight maneuvers would be designed and flown to obtain stability and control derivative estimates using a postflight analysis technique. The goal of the IFCS concept is to be able to modify the control laws in real time for an aircraft that has been damaged in flight. In some IFCS implementations, real-time parameter identification (PID) of the stability and control derivatives of the damaged aircraft is necessary for successfully reconfiguring the control system. This report investigates the usefulness of Prescribed Simultaneous Independent Surface Excitations (PreSISE) to provide data for rapidly obtaining estimates of the stability and control derivatives. Flight test data were analyzed using both equation-error and output-error PID techniques. The equation-error PID technique is known as Fourier Transform Regression (FTR) and is a frequency-domain real-time implementation. Selected results were compared with a time-domain output-error technique. The real-time equation-error technique combined with the PreSISE maneuvers provided excellent derivative estimation in the longitudinal axis. However, the PreSISE maneuvers as presently defined were not adequate for accurate estimation of the lateral-directional derivatives.
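
    The equation-error, frequency-domain idea behind Fourier Transform Regression can be sketched as follows: evaluate finite Fourier transforms of the state derivative and of the regressors at selected frequencies, then solve a linear least-squares problem for the derivatives. The toy example below recovers known coefficients; it is a generic sketch, not NASA's real-time implementation.

    ```python
    import numpy as np

    def ftr_fit(t, y_dot, regressors, freqs_hz):
        """Equation-error fit y_dot = X @ theta carried out in the frequency
        domain: finite Fourier transforms at selected frequencies, then a
        complex least-squares solve (a sketch of the FTR idea only)."""
        dt = t[1] - t[0]
        E = np.exp(-2j * np.pi * np.asarray(freqs_hz)[:, None] * t[None, :]) * dt
        Y = E @ y_dot
        X = np.column_stack([E @ r for r in regressors])
        theta, *_ = np.linalg.lstsq(np.vstack([X.real, X.imag]),
                                    np.concatenate([Y.real, Y.imag]), rcond=None)
        return theta

    # Toy pitch-axis example with known derivatives: q_dot = -2*alpha - 1.5*q + 3*delta
    t = np.linspace(0.0, 20.0, 2001)
    alpha = 0.10 * np.sin(0.8 * t)
    q = 0.05 * np.sin(1.7 * t + 0.3)
    delta = 0.08 * np.sin(2.5 * t)
    q_dot = -2.0 * alpha - 1.5 * q + 3.0 * delta
    print(ftr_fit(t, q_dot, [alpha, q, delta], freqs_hz=np.arange(0.1, 1.5, 0.05)))
    ```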

  18. Investigating the Magnetorotational Instability with Dedalus, an Open-Source Hydrodynamics Code

    SciTech Connect

    Burns, Keaton J; /UC, Berkeley, aff SLAC

    2012-08-31

    The magnetorotational instability is a fluid instability that causes the onset of turbulence in discs with poloidal magnetic fields. It is believed to be an important mechanism in the physics of accretion discs, namely in its ability to transport angular momentum outward. A similar instability arising in systems with a helical magnetic field may be easier to produce in laboratory experiments using liquid sodium, but the applicability of this phenomenon to astrophysical discs is unclear. To explore and compare the properties of these standard and helical magnetorotational instabilities (MRI and HMRI, respectively), magnetohydrodynamic (MHD) capabilities were added to Dedalus, an open-source hydrodynamics simulator. Dedalus is a Python-based pseudospectral code that uses external libraries and parallelization with the goal of achieving speeds competitive with codes implemented in lower-level languages. This paper will outline the MHD equations as implemented in Dedalus, the steps taken to improve the performance of the code, and the status of MRI investigations using Dedalus.

  19. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional electromagnetic scattering codes based on the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied version of the codes are two versions of our current FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.
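
    For orientation, the FDTD technique documented in these TEA/TMA manuals advances Maxwell's curl equations with a leapfrog update on a staggered Yee grid. A minimal one-dimensional free-space sketch in normalized units is shown below (not the Penn State codes themselves).

    ```python
    import numpy as np

    nx, nsteps = 200, 400
    ez = np.zeros(nx)              # electric field on the integer grid
    hy = np.zeros(nx - 1)          # magnetic field on the staggered (Yee) half-grid
    c = 0.5                        # Courant number c0*dt/dx, <= 1 for stability

    for n in range(nsteps):
        hy += c * (ez[1:] - ez[:-1])                 # update H from the curl of E
        ez[1:-1] += c * (hy[1:] - hy[:-1])           # update E from the curl of H
        ez[nx // 4] += np.exp(-((n - 60) / 20.0) ** 2)   # soft Gaussian source

    print("peak |Ez| after propagation:", float(np.abs(ez).max()))
    ```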

  20. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional numerical electromagnetic scattering codes based upon the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied version of the codes are two versions of our current two dimensional FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem set section, a new problem checklist, references and figure titles.

  1. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Astrophysics Data System (ADS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three-dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain (FDTD) technique. The supplied version of the code is one version of our current three-dimensional FDTD code set. The manual given here provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing radar cross section computations, a section discussing some scattering results, a new problem checklist, references, and figure titles.

  2. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Astrophysics Data System (ADS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain (FDTD) Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain technique. The supplied version of the code is one version of our current three dimensional FDTD code set. The manual provides a description of the code and the corresponding results for the default scattering problem. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version A code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONA.FOR), a section briefly discussing radar cross section (RCS) computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  3. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Astrophysics Data System (ADS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-07-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  4. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Astrophysics Data System (ADS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-11-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional electromagnetic scattering codes based on the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied version of the codes are two versions of our current FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  5. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Astrophysics Data System (ADS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-07-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional numerical electromagnetic scattering codes based upon the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied version of the codes are two versions of our current two dimensional FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem set section, a new problem checklist, references and figure titles.

  6. Coding for stable transmission of W-band radio-over-fiber system using direct-beating of two independent lasers.

    PubMed

    Yang, L G; Sung, J Y; Chow, C W; Yeh, C H; Cheng, K T; Shi, J W; Pan, C L

    2014-10-20

    We demonstrate experimentally Manchester (MC) coding based W-band (75 - 110 GHz) radio-over-fiber (ROF) system to reduce the low-frequency-components (LFCs) signal distortion generated by two independent low-cost lasers using spectral shaping. Hence, a low-cost and higher performance W-band ROF system is achieved. In this system, direct-beating of two independent low-cost CW lasers without frequency tracking circuit (FTC) is used to generate the millimeter-wave. Approaches, such as delayed self-heterodyne interferometer and heterodyne beating are performed to characterize the optical-beating-interference sub-terahertz signal (OBIS). Furthermore, W-band ROF systems using MC coding and NRZ-OOK are compared and discussed. PMID:25401641
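
    Manchester coding suppresses the low-frequency content of the baseband signal by mapping every bit onto a mid-bit transition, which is what mitigates the low-frequency-component distortion described here. A minimal encoding sketch follows (IEEE 802.3 polarity assumed; the optical link itself is not modeled).

    ```python
    def manchester_encode(bits):
        """Map each bit onto a mid-bit transition (IEEE 802.3 polarity):
        0 -> (1, 0), 1 -> (0, 1). The resulting chip stream is DC-balanced."""
        mapping = {0: (1, 0), 1: (0, 1)}
        return [chip for b in bits for chip in mapping[b]]

    chips = manchester_encode([1, 0, 1, 1, 0])
    print(chips)                      # [0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
    print(sum(chips) / len(chips))    # 0.5 -> balanced, little low-frequency energy
    ```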

  7. An investigation of design optimization using a 2-D viscous flow code with multigrid

    NASA Technical Reports Server (NTRS)

    Doria, Michael L.

    1990-01-01

    Computational fluid dynamics (CFD) codes have advanced to the point where they are effective analytical tools for solving flow fields around complex geometries. There is also a need for their use as a design tool to find optimum aerodynamic shapes. In the area of design, however, a difficulty arises due to the large amount of computer resources required by these codes. It is desired to streamline the design process so that a large number of design options and constraints can be investigated without overloading the system. There are several techniques which have been proposed to help streamline the design process. The feasibility of one of these techniques is investigated. The technique under consideration is the interaction of the geometry change with the flow calculation. The problem of finding the value of camber which maximizes the ratio of lift over drag for a particular airfoil is considered. In order to test out this technique, a particular optimization problem was tried. A NACA 0012 airfoil was considered at free stream Mach number of 0.5 with a zero angle of attack. Camber was added to the mean line of the airfoil. The goal was to find the value of camber for which the ratio of lift over drag is a maximum. The flow code used was FLOMGE which is a two dimensional viscous flow solver which uses multigrid to speed up convergence. A hyperbolic grid generation program was used to construct the grid for each value of camber.
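
    For a single design variable, the geometry/flow-solver interaction reduces to a one-dimensional search for the camber that maximizes L/D. Below is a minimal sketch using a golden-section search and a made-up stand-in for the CFD loop (illustrative only, not FLOMGE or the actual airfoil data).

    ```python
    def golden_max(f, lo, hi, tol=1e-4):
        """Golden-section search for the argument that maximizes f on [lo, hi]."""
        g = (5 ** 0.5 - 1) / 2
        a, b = lo, hi
        c, d = b - g * (b - a), a + g * (b - a)
        while abs(b - a) > tol:
            if f(c) > f(d):
                b, d = d, c
                c = b - g * (b - a)
            else:
                a, c = c, d
                d = a + g * (b - a)
        return (a + b) / 2

    # Made-up stand-in for the flow solver: L/D as a smooth function of camber (% chord).
    lift_over_drag = lambda camber: 60.0 - 4.0 * (camber - 2.1) ** 2
    print(golden_max(lift_over_drag, 0.0, 5.0))     # converges near 2.1
    ```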

  8. An investigation on the capabilities of the PENELOPE MC code in nanodosimetry.

    PubMed

    Bernal, M A; Liendo, J A

    2009-02-01

    The Monte Carlo (MC) method has been widely implemented in studies of radiation effects on human genetic material. Most of these works have used specific-purpose MC codes to simulate radiation transport in condensed media. PENELOPE is one of the general-purpose MC codes that has been used in many applications related to radiation dosimetry. Based on the fact that PENELOPE can carry out event-by-event coupled electron-photon transport simulations following these particles down to energies of the order of few tens of eV, we have decided to investigate the capacities of this code in the field of nanodosimetry. Single and double strand break probabilities due to the direct impact of gamma rays originated from Co60 and Cs137 isotopes and characteristic x-rays, from Al and C K-shells, have been determined by use of PENELOPE. Indirect damage has not been accounted for in this study. A human genetic material geometrical model has been developed, taking into account five organizational levels. In an article by Friedland et al. [Radiat. Environ. Biophys. 38, 39-47 (1999)], a specific-purpose MC code and a very sophisticated DNA geometrical model were used. We have chosen that work as a reference to compare our results. Single and double strand-break probabilities obtained here underestimate those reported by Friedland and co-workers by 20%-76% and 50%-60%, respectively. However, we obtain RBE values for Cs137, AlK and CK radiations in agreement with those reported in previous works [Radiat. Environ. Biophys. 38, 39-47 (1999)] and [Phys. Med. Biol. 53, 233-244 (2008)]. Some enhancements can be incorporated into the PENELOPE code to improve its results in the nanodosimetry field. PMID:19292002

  9. Investigation of Cool and Hot Executive Function in ODD/CD Independently of ADHD

    ERIC Educational Resources Information Center

    Hobson, Christopher W.; Scott, Stephen; Rubia, Katya

    2011-01-01

    Background: Children with oppositional defiant disorder/conduct disorder (ODD/CD) have shown deficits in "cool" abstract-cognitive, and "hot" reward-related executive function (EF) tasks. However, it is currently unclear to what extent ODD/CD is associated with neuropsychological deficits, independently of attention deficit hyperactivity disorder…

  10. Culture-Dependent and Independent Methods To Investigate the Microbial Ecology of Italian Fermented Sausages

    Microsoft Academic Search

    Kalliopi Rantsiou; Rosalinda Urso; Lucilla Iacumin; Carlo Cantoni; Patrizia Cattaneo; Giuseppe Comi; Luca Cocolin

    2005-01-01

    In this study, the microbial ecology of three naturally fermented sausages produced in northeast Italy was studied by culture-dependent and -independent methods. By plating analysis, the predominance of lactic acid bacteria populations was pointed out, as well as the importance of coagulase-negative cocci. Also in the case of one fermentation, the fecal enterococci reached significant counts, highlighting their contribution to…

  11. Two mitochondrial genomes from the families Bethylidae and Mutillidae: independent rearrangement of protein-coding genes and higher-level phylogeny of the Hymenoptera.

    PubMed

    Wei, Shu-Jun; Li, Qian; van Achterberg, Kees; Chen, Xue-Xin

    2014-08-01

    In animal mitochondrial genomes, gene arrangements are usually conserved across major lineages but might be rearranged within derived groups, and might provide valuable phylogenetic characters. Here, we sequenced the mitochondrial genomes of Cephalonomia gallicola (Chrysidoidea: Bethylidae) and Wallacidia oculata (Vespoidea: Mutillidae). In Cephalonomia at least 11 tRNA and 2 protein-coding genes were rearranged, which is the first report of protein-coding gene rearrangements in the Aculeata. In the Hymenoptera, three types of protein-coding gene rearrangement events occur, i.e. reversal, transposition and reverse transposition. Venturia (Ichneumonidae) had the greatest number of common intervals with the ancestral gene arrangement pattern, whereas Philotrypesis (Agaonidae) had the fewest. The most similar rearrangement patterns are shared between Nasonia (Pteromalidae) and Philotrypesis, whereas the most differentiated rearrangements occur between Cotesia (Braconidae) and Philotrypesis. It is clear that protein-coding gene rearrangements in the Hymenoptera are evolutionarily independent across the major lineages but are conserved within groups such as the Chalcidoidea. Phylogenetic analyses supported the sister-group relationship of Orrussoidea and Apocrita, Ichneumonoidea and Aculeata, Vespidae and Apoidea, and the paraphyly of Vespoidea. The Evaniomorpha and phylogenetic relationships within Aculeata remain controversial, with discrepancy between analyses using protein-coding and RNA genes. PMID:24704304

  12. What's in the mix? Combining coding and conversation analysis to investigate chat-based problem-solving

    Microsoft Academic Search

    Alan Zemel; Fatos Xhafa; Murat Cakir

    The coding of interactional data for statistical analysis presents theoretical and practical challenges. Coding schemes that rely on categories that are decided by their relevance to the analytical problem under investigation carry with them the presumption that the analyst's perspective and concerns are privileged over the demonstrable and publicly displayed perspectives and concerns of the participants in the interaction. Furthermore,

  13. Investigation of in-band transmission of both spectral amplitude coding/optical code division multiple-access and wavelength division multiplexing signals

    NASA Astrophysics Data System (ADS)

    Ashour, Isaac A. M.; Shaari, Sahbudin; Shalaby, Hossam M. H.; Menon, P. Susthitha

    2011-06-01

    The transmission of both optical code division multiple-access (OCDMA) and wavelength division multiplexing (WDM) users on the same band is investigated. Code pulses of spectral amplitude coding (SAC)/optical code division multiple-access (CDMA) are overlaid onto a multichannel WDM system. Notch filters are utilized in order to suppress the WDM interference signals for detection of optical broadband CDMA signals. Modified quadratic congruence (MQC) codes are used as the signature codes for the SAC/OCDMA system. The proposed system is simulated and its performance in terms of both the bit-error rate and Q-factor is determined. In addition, the eavesdropper's probability of error-free code detection is evaluated. Our results are compared to traditional nonhybrid systems. It is concluded that the proposed hybrid scheme still achieves acceptable performance. In addition, it provides enhanced data confidentiality as compared to the scheme with SAC/OCDMA only. It is also shown that the performance of the proposed system is limited by the interference of the WDM signals. Furthermore, the simulation illustrates the tradeoff between performance and confidentiality for authorized users.
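
    Under the usual Gaussian approximation, the reported Q-factor and bit-error rate are linked by BER = 0.5 erfc(Q/sqrt(2)); a minimal sketch of that relation, with placeholder Q values rather than results from this system, is:

```python
# Gaussian-approximation link between Q-factor and bit-error rate, as
# commonly used for SAC-OCDMA receiver performance estimates.
# The Q values below are placeholders, not results from the paper.
import math

def ber_from_q(q: float) -> float:
    """BER = 0.5 * erfc(Q / sqrt(2)) under the Gaussian approximation."""
    return 0.5 * math.erfc(q / math.sqrt(2.0))

for q in (3.0, 6.0, 7.0):
    print(f"Q = {q:4.1f}  ->  BER ~ {ber_from_q(q):.2e}")
```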

  14. Nye County Nuclear Waste Repository Project Office independent scientific investigations program annual report, May 1997--April 1998

    SciTech Connect

    NONE

    1998-07-01

    This annual summary report, prepared by the Nye County Nuclear Waste Repository Project Office (NWRPO), summarizes the activities that were performed during the period from May 1, 1997 to April 30, 1998. These activities were conducted in support of the Independent Scientific Investigation Program (ISIP) of Nye County at the Yucca Mountain Site (YMS). The Nye County NWRPO is responsible for protecting the health and safety of the Nye County residents. NWRPO's on-site representative is responsible for designing and implementing the Independent Scientific Investigation Program (ISIP). Major objectives of the ISIP include: investigating key issues related to conceptual design and performance of the repository that can have major impact on human health, safety, and the environment; identifying areas not being addressed adequately by the Department of Energy (DOE). Nye County has identified several key scientific issues of concern that may affect repository design and performance which were not being adequately addressed by DOE. Nye County has been conducting its own independent study to evaluate the significance of these issues. This report summarizes the results of monitoring from two boreholes and the Exploratory Studies Facility (ESF) tunnel that have been instrumented by Nye County since March and April of 1995. The preliminary data and interpretations presented in this report do not constitute and should not be considered as the official position of Nye County. The ISIP presently includes borehole and tunnel instrumentation, monitoring, data analysis, and numerical modeling activities to address the concerns of Nye County.

  15. Culture-Dependent and -Independent Investigations of Microbial Diversity on Urinary Catheters

    PubMed Central

    Xu, Yijuan; Moser, Claus; Al-Soud, Waleed Abu; Sørensen, Søren; Høiby, Niels; Nielsen, Per Halkjær

    2012-01-01

    Catheter-associated urinary tract infection is caused by bacteria, which ascend the catheter along its external or internal surface to the bladder and subsequently develop into biofilms on the catheter and uroepithelium. Antibiotic-treated bacteria and bacteria residing in biofilm can be difficult to culture. In this study we used culture-based and 16S rRNA gene-based culture-independent methods (fingerprinting, cloning, and pyrosequencing) to determine the microbial diversity of biofilms on 24 urinary catheters. Most of the patients were catheterized for <30 days and had undergone recent antibiotic treatment. In addition, the corresponding urine samples for 16 patients were cultured. We found that gene analyses of the catheters were consistent with cultures of the corresponding urine samples for the presence of bacteria but sometimes discordant for the identity of the species. Cultures of catheter tips detected bacteria more frequently than urine cultures and gene analyses; coagulase-negative staphylococci were, in particular, cultured much more often from catheter tips, indicating potential contamination of the catheter tips during sampling. The external and internal surfaces of 19 catheters were separately analyzed by molecular methods, and discordant results were found in six catheters, suggesting that bacterial colonization intra- and extraluminally may be different. Molecular analyses showed that most of the species identified in this study were known uropathogens, and infected catheters were generally colonized by one to two species, probably due to antibiotic usage and short-term catheterization. In conclusion, our data showed that culture-independent molecular methods did not detect bacteria from urinary catheters more frequently than culture-based methods. PMID:23015674

  16. Investigation on series of length of coding and non-coding DNA sequences of bacteria using multifractal detrended cross-correlation analysis.

    PubMed

    Stan, Cristina; Cristescu, Monica Teodora; Luiza, Buimaga Iarinca; Cristescu, C P

    2013-03-21

    In the framework of multifractal detrended cross-correlation analysis, we investigate characteristics of series of length of coding and non-coding DNA sequences of some bacteria and archaea. We propose the use of a multifractal cross-correlation series that can be defined for any pair of equal lengths data sequences (or time series) and that can be characterized by the full set of parameters that are attributed to any time series. Comparison between characteristics of series of length of coding and non-coding DNA sequences and of their associated multifractal cross-correlation series for selected groups is used for the identification of class affiliation of certain bacteria and archaea. The analysis is carried out using the dependence of the generalized Hurst exponent on the size of fluctuations, the shape of the singularity spectra, the shape and relative disposition of the curves of the singular measures scaling exponent and the values of the associated parameters. Empirically, we demonstrate that the series of lengths of coding and non-coding sequences as well as the associated multifractal cross-correlation series can be approximated as universal multifractals. PMID:23313335
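
    As a simplified illustration of the fluctuation-function machinery behind such an analysis, the sketch below implements plain single-series MF-DFA and estimates the generalized Hurst exponent h(q); the paper itself uses the cross-correlation variant (MF-DXA), and the test series here is synthetic white noise rather than gene-length data.

```python
# Simplified multifractal detrended fluctuation analysis (MF-DFA) of one
# series; the paper uses the cross-correlation variant (MF-DXA), but the
# q-dependent fluctuation-function machinery is analogous.
import numpy as np

def mfdfa(x, scales, q_values, order=1):
    """Return F_q(s) for each q and scale s; the log-log slope gives h(q)."""
    profile = np.cumsum(x - np.mean(x))
    out = np.zeros((len(q_values), len(scales)))
    for j, s in enumerate(scales):
        n_seg = len(profile) // s
        f2 = []
        for v in range(n_seg):
            seg = profile[v * s:(v + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, seg, order), t)   # local detrending
            f2.append(np.mean((seg - trend) ** 2))
        f2 = np.asarray(f2)
        for i, q in enumerate(q_values):
            if abs(q) < 1e-12:      # q -> 0 limit uses a logarithmic average
                out[i, j] = np.exp(0.5 * np.mean(np.log(f2)))
            else:
                out[i, j] = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
    return out

# Example: white noise should give h(q) close to 0.5 for all q.
rng = np.random.default_rng(1)
x = rng.normal(size=4096)
scales = np.array([16, 32, 64, 128, 256])
q_values = np.array([-4.0, -2.0, 0.0, 2.0, 4.0])
fq = mfdfa(x, scales, q_values)
for i, q in enumerate(q_values):
    h, _ = np.polyfit(np.log(scales), np.log(fq[i]), 1)
    print(f"q = {q:+.0f}  h(q) ~ {h:.2f}")
```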

  17. Culture-Dependent and -Independent Methods To Investigate the Microbial Ecology of Italian Fermented Sausages

    PubMed Central

    Rantsiou, Kalliopi; Urso, Rosalinda; Iacumin, Lucilla; Cantoni, Carlo; Cattaneo, Patrizia; Comi, Giuseppe; Cocolin, Luca

    2005-01-01

    In this study, the microbial ecology of three naturally fermented sausages produced in northeast Italy was studied by culture-dependent and -independent methods. By plating analysis, the predominance of lactic acid bacteria populations was pointed out, as well as the importance of coagulase-negative cocci. Also in the case of one fermentation, the fecal enterococci reached significant counts, highlighting their contribution to the particular transformation process. Yeast counts were higher than the detection limit (>100 CFU/g) in only one fermented sausage. Analysis of the denaturing gradient gel electrophoresis (DGGE) patterns and sequencing of the bands allowed profiling of the microbial populations present in the sausages during fermentation. The bacterial ecology was mainly characterized by the stable presence of Lactobacillus curvatus and Lactobacillus sakei, but Lactobacillus paracasei was also repeatedly detected. An important piece of evidence was the presence of Lactococcus garvieae, which clearly contributed in two fermentations. Several species of Staphylococcus were also detected. Regarding other bacterial groups, Bacillus sp., Ruminococcus sp., and Macrococcus caseolyticus were also identified at the beginning of the transformations. In addition, yeast species belonging to Debaryomyces hansenii, several Candida species, and Willopsis saturnus were observed in the DGGE gels. Finally, cluster analysis of the bacterial and yeast DGGE profiles highlighted the uniqueness of the fermentation processes studied. PMID:15812029

  18. Culture-Independent Investigation of the Microbiome Associated with the Nematode Acrobeloides maximus

    PubMed Central

    Baquiran, Jean-Paul; Thater, Brian; Sedky, Sammy; De Ley, Paul; Crowley, David; Orwin, Paul M.

    2013-01-01

    Background Symbioses between metazoans and microbes are widespread and vital to many ecosystems. Recent work with several nematode species has suggested that strong associations with microbial symbionts may also be common among members of this phylum. In this work we explore possible symbiosis between bacteria and the free-living soil bacterivorous nematode Acrobeloides maximus. Methodology We used a soil microcosm approach to expose A. maximus populations grown monoxenically on RFP-labeled Escherichia coli in a soil slurry. Worms were recovered by density gradient separation and examined using both culture-independent and isolation methods. A 16S rRNA gene survey of the worm-associated bacteria was compared to the soil and to a similar analysis using Caenorhabditis elegans N2. Recovered A. maximus populations were maintained on cholesterol agar and sampled to examine the population dynamics of the microbiome. Results A consistent core microbiome was extracted from A. maximus that differed from those in the bulk soil or the C. elegans associated set. Three genera, Ochrobactrum, Pedobacter, and Chitinophaga, were identified at high levels only in the A. maximus populations, which were less diverse than the assemblage associated with C. elegans. Putative symbiont populations were maintained for at least 4 months post inoculation, although the levels decreased as the culture aged. Fluorescence in situ hybridization (FISH) using probes specific for Ochrobactrum and Pedobacter stained bacterial cells in formaldehyde fixed nematode guts. Conclusions Three microorganisms were repeatedly observed in association with Acrobeloides maximus when recovered from soil microcosms. We isolated several Ochrobactrum sp. and Pedobacter sp., and demonstrated that they inhabit the nematode gut by FISH. Although their role in A. maximus is not resolved, we propose possible mutualistic roles for these bacteria in protection of the host against pathogens and facilitating enzymatic digestion of other ingested bacteria. PMID:23894287

  19. The City Coding Project : an investigation into some presumed maxims for residential design in Hong Kong

    E-print Network

    Wong, Chit Kin Dickson

    2008-01-01

    Formal expressions of architecture in a city are largely dictated by how the city is 'coded' ... re-coding - is capable of making fundamental changes in building forms that would proliferate across the entire city. Therefore, ...

  20. Investigation of low temperature solid oxide fuel cells for air-independent UUV applications

    NASA Astrophysics Data System (ADS)

    Moton, Jennie Mariko

    Unmanned underwater vehicles (UUVs) will benefit greatly from high energy density (> 500 Wh/L) power systems utilizing high-energy-density fuels and air-independent oxidizers. Current battery-based systems have limited energy densities (< 400 Wh/L), which motivate development of alternative power systems such as solid oxide fuel cells (SOFCs). SOFC-based power systems have the potential to achieve the required UUV energy densities, and the current study explores how SOFCs based on gadolinia-doped ceria (GDC) electrolytes with operating temperatures of 650°C and lower may operate in the unique environments of a promising UUV power plant. The plant would contain a H2O2 decomposition reactor to supply humidified O2 to the SOFC cathode and an exothermic aluminum/H2O combustor to provide heated humidified H2 fuel to the anode. To characterize low-temperature SOFC performance with these unique O2 and H2 sources, SOFC button cells based on nickel/GDC (Gd0.1Ce0.9O1.95) anodes, GDC electrolytes, and lanthanum strontium cobalt ferrite (La0.6Sr0.4Co0.2Fe0.8O3-δ or LSCF)/GDC cathodes were fabricated and tested for performance and stability with humidity on both the anode and the cathode. Cells were also tested with various reactant concentrations of H2 and O2 to simulate gas depletion down the channel of an SOFC stack. Results showed that anode performance depended primarily on fuel concentration and less on the concentration of the associated increase in product H2O. O2 depletion with humidified cathode flows also caused significant loss in cell current density at a given voltage. With the humidified flows in either the anode or cathode, stability tests of the button cells at 650°C showed that a stable voltage is maintained at low operating current (0.17 A/cm2) at up to 50% by mole H2O, but at higher current densities (0.34 A/cm2), irreversible voltage degradation occurred at rates of 0.8-3.7 mV/hour depending on exposure time. From these button cell results, average current densities over the length of a low-temperature SOFC stack were estimated and used to size a UUV power system based on Al/H2O oxidation for fuel and H2O2 decomposition for O2. The resulting system design suggested that energy densities above 300 Wh/L may be achieved at neutral buoyancy with seawater if the cell is operated at high reactant utilizations in the SOFC stack for missions longer than 20 hours.

  1. Further Investigation of Acoustic Propagation Codes for Three-Dimensional Geometries

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Watson, Willie R.; Jones, Michael G.

    2006-01-01

    The ability to predict fan noise within complex three-dimensional aircraft engine nacelle geometries is a valuable tool in designing and assessing low-noise concepts. This work begins a systematic study to identify the areas of the design space in which propagation codes of varying fidelity may be used effectively to provide efficient design and assessment. An efficient lower-fidelity code is used in conjunction with two higher-fidelity, more computationally intensive methods to solve benchmark problems of increasing complexity. The codes represent a small sampling of the current propagation codes available or under development. Results of this initial study indicate that the lower-fidelity code provides satisfactory results for cases involving low to moderate attenuation rates, whereas the two higher-fidelity codes perform well across the range of problems.

  2. Investigation of independence in inter-animal tumor-type occurrences within the NTP rodent-bioassay database

    SciTech Connect

    Bogen, K.T. [Lawrence Livermore National Lab., CA (United States); Seilkop, S. [Analytical Sciences, Inc., Durham, NC (United States)

    1993-05-01

    Statistically significant elevation in tumor incidence at multiple histologically distinct sites is occasionally observed among rodent bioassays of chemically induced carcinogenesis. If such data are to be relied on (as they have, e.g., by the US EPA) for quantitative cancer potency assessment, their proper analysis requires a knowledge of the extent to which multiple tumor-type occurrences are independent or uncorrelated within individual bioassay animals. Although difficult to assess in a statistically rigorous fashion, a few significant associations among tumor-type occurrences in rodent bioassays have been reported. However, no comprehensive studies of animal-specific tumor-type occurrences at death or sacrifice have been conducted using the extensive set of available NTP rodent-bioassay data, on which most cancer-potency assessment for environmental chemicals is currently based. This report presents the results of such an analysis conducted on behalf of the National Research Council's Committee on Risk Assessment for Hazardous Air Pollutants. Tumor-type associations among individual animals were examined for approximately 2500 to 3000 control and approximately 200 to 600 treated animals using pathology data from 62 B6C3F1 mouse studies and 61 F/344N rat studies obtained from a readily available subset of the NTP carcinogenesis bioassay database. No evidence was found for any large correlation in either the onset probability or the prevalence-at-death or sacrifice of any tumor-type pair investigated in control and treated rats and mice, although a few of the small correlations present were statistically significant. Tumor-type occurrences were in most cases nearly independent, and departures from independence, where they did occur, were small. This finding is qualified in that tumor-type onset correlations were measured only indirectly, given the limited nature of the data analyzed.
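
    A pairwise independence check of the kind described can be illustrated with a 2x2 occurrence table and Fisher's exact test; the sketch below uses invented animal counts, not NTP data.

```python
# Pairwise independence check for two tumour types across animals:
# build a 2x2 occurrence table and apply Fisher's exact test.
# The simulated counts below are invented for illustration, not NTP data.
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(0)
n_animals = 500
tumor_a = rng.random(n_animals) < 0.10        # animal has tumour type A
tumor_b = rng.random(n_animals) < 0.05        # animal has tumour type B

table = np.array([
    [np.sum(tumor_a & tumor_b),  np.sum(tumor_a & ~tumor_b)],
    [np.sum(~tumor_a & tumor_b), np.sum(~tumor_a & ~tumor_b)],
])
odds_ratio, p_value = fisher_exact(table)
print(f"2x2 table:\n{table}")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```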

  3. Investigating the use of quick response codes in the gross anatomy laboratory.

    PubMed

    Traser, Courtney J; Hoffman, Leslie A; Seifert, Mark F; Wilson, Adam B

    2014-10-01

    The use of quick response (QR) codes within undergraduate university courses is on the rise, yet literature concerning their use in medical education is scant. This study examined student perceptions on the usefulness of QR codes as learning aids in a medical gross anatomy course, statistically analyzed whether this learning aid impacted student performance, and evaluated whether performance could be explained by the frequency of QR code usage. Question prompts and QR codes tagged on cadaveric specimens and models were available for four weeks as learning aids to medical (n = 155) and doctor of physical therapy (n = 39) students. Each QR code provided answers to posed questions in the form of embedded text or hyperlinked web pages. Students' perceptions were gathered using a formative questionnaire and practical examination scores were used to assess potential gains in student achievement. Overall, students responded positively to the use of QR codes in the gross anatomy laboratory as 89% (57/64) agreed the codes augmented their learning of anatomy. The users' most noticeable objection to using QR codes was the reluctance to bring their smartphones into the gross anatomy laboratory. A comparison between the performance of QR code users and non-users was found to be nonsignificant (P = 0.113), and no significant gains in performance (P = 0.302) were observed after the intervention. Learners welcomed the implementation of QR code technology in the gross anatomy laboratory, yet this intervention had no apparent effect on practical examination performance. Anat Sci Educ. © 2014 American Association of Anatomists. PMID:25288343
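
    The user/non-user performance comparison described here is a standard two-group test; a minimal sketch (Welch's t-test on simulated scores; the study's own test and group sizes may have differed) is:

```python
# Two-sample comparison of practical-examination scores between QR-code
# users and non-users. The scores are simulated placeholders, not study
# data (the study itself reported P = 0.113 for this comparison).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
users = rng.normal(loc=78.0, scale=9.0, size=120)       # hypothetical scores
non_users = rng.normal(loc=76.5, scale=9.0, size=74)

t_stat, p_value = stats.ttest_ind(users, non_users, equal_var=False)  # Welch's t-test
print(f"t = {t_stat:.2f}, P = {p_value:.3f}")
```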

  4. Role Asymmetry and Code Transmission in Signaling Games: An Experimental and Computational Investigation.

    PubMed

    Moreno, Maggie; Baggio, Giosuè

    2014-10-29

    In signaling games, a sender has private access to a state of affairs and uses a signal to inform a receiver about that state. If no common association of signals and states is initially available, sender and receiver must coordinate to develop one. How do players divide coordination labor? We show experimentally that, if players switch roles at each communication round, coordination labor is shared. However, in games with fixed roles, coordination labor is divided: Receivers adjust their mappings more frequently, whereas senders maintain the initial code, which is transmitted to receivers and becomes the common code. In a series of computer simulations, player and role asymmetry as observed experimentally were accounted for by a model in which the receiver in the first signaling round has a higher chance of adjusting its code than its partner. From this basic division of labor among players, certain properties of role asymmetry, in particular correlations with game complexity, are seen to follow. PMID:25352016
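
    A toy simulation of the fixed-role condition, in which the receiver remaps a failed signal more often than the sender, conveys the division of coordination labor; the update probabilities and rule below are illustrative assumptions, not the authors' model.

```python
# Toy fixed-role signaling game: after a failed round the receiver remaps
# the signal with probability P_RECEIVER, the sender with P_SENDER.
# The probabilities and update rule are illustrative assumptions only.
import random

N_STATES = 4
P_RECEIVER, P_SENDER = 0.8, 0.2
random.seed(0)

sender = {s: random.randrange(N_STATES) for s in range(N_STATES)}    # state -> signal
receiver = {g: random.randrange(N_STATES) for g in range(N_STATES)}  # signal -> state

successes = 0
for _ in range(2000):
    state = random.randrange(N_STATES)
    signal = sender[state]
    guess = receiver[signal]
    if guess == state:
        successes += 1
    else:
        # divide coordination labour: the receiver adjusts more often,
        # so the sender's initial code tends to become the common code
        if random.random() < P_RECEIVER:
            receiver[signal] = state
        if random.random() < P_SENDER:
            sender[state] = random.randrange(N_STATES)

print(f"success rate over 2000 rounds: {successes / 2000:.2f}")
```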

  5. Analytical Investigation on Papr Reduction in OFDM Systems Using Golay Codes

    NASA Astrophysics Data System (ADS)

    Uppal, Sabhyata; Sharma, Sanjay; Singh, Hardeep

    2014-09-01

    Orthogonal frequency division multiplexing (OFDM) is a common technique in multi-carrier communications. One of the major issues in developing OFDM is the high peak-to-average power ratio (PAPR). Golay sequences have been introduced to construct 16-QAM and 256-QAM (quadrature amplitude modulation) codes for orthogonal frequency division multiplexing (OFDM), reducing the peak-to-average power ratio. In this paper we have considered the use of coding to reduce the peak-to-average power ratio (PAPR) for orthogonal frequency division multiplexing (OFDM) systems. By using QPSK Golay sequences, 16- and 256-QAM sequences with low PAPR are generated.
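
    The quantity being reduced, PAPR, is the ratio of peak to mean power of the time-domain OFDM symbol obtained by an IFFT of the subcarrier symbols; the sketch below computes it for random QPSK subcarriers (the paper instead uses Golay sequences, which bound this ratio).

```python
# PAPR of an OFDM symbol: peak-to-average power of the IFFT output.
# Random QPSK subcarriers are used here for illustration; the paper uses
# Golay sequences, which keep the resulting PAPR low by construction.
import numpy as np

rng = np.random.default_rng(0)
N_SUBCARRIERS = 64
OVERSAMPLE = 4                      # zero-padding for a finer time grid

qpsk = (rng.choice([-1, 1], N_SUBCARRIERS) +
        1j * rng.choice([-1, 1], N_SUBCARRIERS)) / np.sqrt(2)
padded = np.concatenate([qpsk, np.zeros((OVERSAMPLE - 1) * N_SUBCARRIERS)])
x = np.fft.ifft(padded)             # oversampled time-domain OFDM symbol

power = np.abs(x) ** 2
papr_db = 10 * np.log10(power.max() / power.mean())
print(f"PAPR = {papr_db:.2f} dB")
```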

  6. Digital sound: An investigation of delta-modulation\\/pulse-code-modulation analogue-to-digital conversion

    Microsoft Academic Search

    R. A. Belcher; W. I. Manson; T. A. Moore

    1980-01-01

    An apparatus was built to allow the potential of a sound signal delta modulation to pulse code modulation converter to be assessed practically. An 8 MHz 8 bit digital delta modulation integrator was tested. Addition of triangular dither to the studio input signal was found to improve performance markedly. Results show that a system for which 10 1/2 bit pulse

  7. Independent Learning: A Common Essential Learning: A Study Completed for the Saskatchewan Department of Education Core Curriculum Investigation Project.

    ERIC Educational Resources Information Center

    Kesten, Cyril

    The concept of independent learning is discussed. Independent learning is defined as learning in which the learner, in conjunction with relevant others, can make the decisions necessary to meet his or her own learning needs. This must be regarded as a direction or goal to be pursued, not as an absolute standard and not as a set of identifiable…

  8. Investigation of accident cases for high pressure, high temperature experimental helium loop using RELAP5-3D code

    Microsoft Academic Search

    Xue Zhou Jin; Bradut-Eugen Ghidersa

    2011-01-01

    Accident cases are investigated for the Helium Loop Karlsruhe (HELOKA) facility, a high pressure and high temperature experimental helium loop having the European Helium Cooled Pebble Beds (HCPB) Test Blanket Module (TBM) as test module. Two typical operation modes for the loop operation have been numerically modeled using RELAP5-3D code: a pulsed operation (ITER-like situation) and a steady state operation

  9. Investigating the impact of the cielo cray XE6 architecture on scientific application codes.

    SciTech Connect

    Rajan, Mahesh; Barrett, Richard; Pedretti, Kevin Thomas Tauke; Doerfler, Douglas W.; Vaughan, Courtenay Thomas

    2010-12-01

    Cielo, a Cray XE6, is the Department of Energy NNSA Advanced Simulation and Computing (ASC) campaign's newest capability machine. Rated at 1.37 PFLOPS, it consists of 8,944 dual-socket oct-core AMD Magny-Cours compute nodes, linked using Cray's Gemini interconnect. Its primary mission objective is to enable a suite of the ASC applications implemented using MPI to scale to tens of thousands of cores. Cielo is an evolutionary improvement to a successful architecture previously available to many of our codes, thus enabling a basis for understanding the capabilities of this new architecture. Using three codes strategically important to the ASC campaign, and supplemented with some micro-benchmarks that expose the fundamental capabilities of the XE6, we report on the performance characteristics and capabilities of Cielo.

  10. An Investigation of Two Acoustic Propagation Codes for Three-Dimensional Geometries

    NASA Technical Reports Server (NTRS)

    Nark, D. M.; Watson, W. R.; Jones, M. G.

    2005-01-01

    The ability to predict fan noise within complex three-dimensional aircraft engine nacelle geometries is a valuable tool in studying low-noise designs. Recent years have seen the development of aeroacoustic propagation codes using various levels of approximation to obtain such a capability. In light of this, it is beneficial to pursue a design paradigm that incorporates the strengths of the various tools. The development of a quasi-3D methodology (Q3D-FEM) at NASA Langley has brought these ideas to mind in relation to the framework of the CDUCT-LaRC acoustic propagation and radiation tool. As more extensive three dimensional codes become available, it would seem appropriate to incorporate these tools into a framework similar to CDUCT-LaRC and use them in a complementary manner. This work focuses on such an approach in beginning the steps toward a systematic assessment of the errors, and hence the trade-offs, involved in the use of these codes. To illustrate this point, CDUCT-LaRC was used to study benchmark hardwall duct problems to quantify errors caused by wave propagation in directions far removed from that defined by the parabolic approximation. Configurations incorporating acoustic treatment were also studied with CDUCT-LaRC and Q3D-FEM. The cases presented show that acoustic treatment diminishes the effects of CDUCT-LaRC phase error as the solutions are attenuated. The results of the Q3D-FEM were very promising and matched the analytic solution very well. Overall, these tests were meant to serve as a step toward the systematic study of errors inherent in the propagation module of CDUCT-LaRC, as well as an initial test of the higher fidelity Q3D-FEM code.

  11. Field-programmable gate-array-based investigation of the error floor of low-density parity check codes for magnetic recording channels

    Microsoft Academic Search

    Lingyan Sun; Hongwei Song; B. V. K. V. Kumar; Zak Keirn

    2005-01-01

    Good performance of iterative detection and decoding using low-density parity check (LDPC) codes has stimulated great interest in the data storage industry. One major concern in using LDPC codes in the read channel is their error floor, which is still an open question. To investigate the performance of LDPC codes at low bit-error rates (BER < 10^-10), we developed a high-throughput fully

  12. National evaluation of the benefits and risks of greater structuring and coding of the electronic health record: exploratory qualitative investigation

    PubMed Central

    Morrison, Zoe; Fernando, Bernard; Kalra, Dipak; Cresswell, Kathrin; Sheikh, Aziz

    2014-01-01

    Objective We aimed to explore stakeholder views, attitudes, needs, and expectations regarding likely benefits and risks resulting from increased structuring and coding of clinical information within electronic health records (EHRs). Materials and methods Qualitative investigation in primary and secondary care and research settings throughout the UK. Data were derived from interviews, expert discussion groups, observations, and relevant documents. Participants (n=70) included patients, healthcare professionals, health service commissioners, policy makers, managers, administrators, systems developers, researchers, and academics. Results Four main themes arose from our data: variations in documentation practice; patient care benefits; secondary uses of information; and informing and involving patients. We observed a lack of guidelines, co-ordination, and dissemination of best practice relating to the design and use of information structures. While we identified immediate benefits for direct care and secondary analysis, many healthcare professionals did not see the relevance of structured and/or coded data to clinical practice. The potential for structured information to increase patient understanding of their diagnosis and treatment contrasted with concerns regarding the appropriateness of coded information for patients. Conclusions The design and development of EHRs requires the capture of narrative information to reflect patient/clinician communication and computable data for administration and research purposes. Increased structuring and/or coding of EHRs therefore offers both benefits and risks. Documentation standards within clinical guidelines are likely to encourage comprehensive, accurate processing of data. As data structures may impact upon clinician/patient interactions, new models of documentation may be necessary if EHRs are to be read and authored by patients. PMID:24186957

  13. Investigation into the flow field around a maneuvering submarine using a Reynolds-averaged Navier-Stokes code

    NASA Astrophysics Data System (ADS)

    Rhee, Bong

    The accurate and efficient prediction of hydrodynamic forces and moments on a maneuvering submarine has been achieved by investigating the flow physics involving the interaction of the vortical flow shed from the sail and the cross-flow boundary layer of the hull. In this investigation, a Reynolds-Averaged Navier-Stokes (RANS) computer code is used to simulate the most important physical effects related to maneuvering. It is applied to a generic axisymmetric body with the relatively simple case of the flow around an unappended hull at an angle of attack. After the code is validated for this simple case, it is validated for the case of a submarine with various appendages attached to the hull moving at an angle of drift. All six components of predicted forces and moments for various drift angles are compared with experimental data. Calculated pressure coefficients along the azimuthal angle are compared with experimental data and discussed to show the effect of the sail and the stern appendages. To understand the main flow features for a submarine in a straight flight, the RANS code is applied to simulate SUBOFF axisymmetric body at zero angle of attack in a straight-line basin. Pressure coefficient, skin friction coefficient, mean velocity components and the Reynolds shear stresses are compared with experimental data and discussed. The physical aspects of the interaction between the vortical flow shed by the sail and the cross-flow boundary layer on the hull are explained in greater detail. The understanding of this interaction is very important to characterize accurately the hydrodynamic behavior of a maneuvering submarine.

  14. Investigation of wellbore cooling by circulation and fluid penetration into the formation using a wellbore thermal simulator computer code

    SciTech Connect

    Duda, L.E.

    1985-01-01

    The high temperatures of geothermal wells present severe problems for drilling, logging, and developing these reservoirs. Cooling the wellbore is perhaps the most common method to solve these problems. However, it is usually not clear what may be the most effective wellbore cooling mechanism for a given well. In this paper, wellbore cooling by the use of circulation or by fluid injection into the surrounding rock is investigated using a wellbore thermal simulator computer code. Short circulation times offer no prolonged cooling of fluid in the wellbore, but long circulation times (greater than ten or twenty days) greatly reduce the warming rate after shut-in. The dependence of the warming rate on the penetration distance of cooler temperatures into the rock formation (as by fluid injection) is investigated. Penetration distances of greater than 0.6 m appear to offer a substantial reduction in the warming rate. Several plots are shown which demonstrate these effects. 16 refs., 6 figs.

  15. Investigations on the sensitivity of the computer code TURBO-2D

    NASA Astrophysics Data System (ADS)

    Amon, B.

    1994-12-01

    The two-dimensional computer model TURBO-2D for the calculation of two-phase flow was used to calculate the cold injection of fuel into a model chamber. The influence of the input parameters on the sensitivity of the obtained results was investigated. In addition, calculations using experimental injection pressure data and the corresponding averaged injection parameters were performed and compared.

  16. Polyphasic Study of the Spatial Distribution of Microorganisms in Mexican Pozol, a Fermented Maize Dough, Demonstrates the Need for Cultivation-Independent Methods To Investigate Traditional Fermentations

    Microsoft Academic Search

    FREDERIC AMPE; NABIL BEN OMAR; CLAIRE MOIZAN; CARMEN WACHER

    1999-01-01

    The distribution of microorganisms in pozol balls, a fermented maize dough, was investigated by a polyphasic approach in which we used both culture-dependent and culture-independent methods, including microbial enumeration, fermentation product analysis, quantification of microbial taxa with 16S rRNA-targeted oligonucleotide probes, determination of microbial fingerprints by denaturing gradient gel electrophoresis (DGGE), and 16S ribosomal DNA gene sequencing. Our

  17. Investigation of Nuclear Data Libraries with TRIPOLI-4 Monte Carlo Code for Sodium-cooled Fast Reactors

    NASA Astrophysics Data System (ADS)

    Lee, Y.-K.; Brun, E.

    2014-04-01

    The Sodium-cooled fast neutron reactor ASTRID is currently under design and development in France. The traditional ECCO/ERANOS fast reactor code system used for ASTRID core design calculations relies on the multi-group JEFF-3.1.1 data library. To gauge the use of the ENDF/B-VII.0 and JEFF-3.1.1 nuclear data libraries in fast reactor applications, two recent OECD/NEA computational benchmarks specified by Argonne National Laboratory were calculated. Using the continuous-energy TRIPOLI-4 Monte Carlo transport code, both the ABR-1000 MWth MOX core and the metallic (U-Pu) core were investigated. Under two different fast neutron spectra and two data libraries, ENDF/B-VII.0 and JEFF-3.1.1, reactivity impact studies were performed. Using the JEFF-3.1.1 library under the BOEC (Beginning of equilibrium cycle) condition, high reactivity effects of 808 ± 17 pcm and 1208 ± 17 pcm were observed for the ABR-1000 MOX core and the metallic core, respectively. To analyze the causes of these differences in reactivity, several TRIPOLI-4 runs using the mixed data libraries feature allowed us to identify the nuclides and the nuclear data accounting for the major part of the observed reactivity discrepancies.
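
    The reported library-to-library effects are reactivity differences expressed in pcm; a small sketch of how such a difference and its statistical uncertainty follow from two Monte Carlo k-eff estimates (illustrative k-eff values, not the benchmark results) is:

```python
# Reactivity difference between two nuclear-data libraries in pcm,
# from two Monte Carlo k-eff estimates with 1-sigma uncertainties.
# The k-eff values below are illustrative, not the benchmark results.
import math

def delta_rho_pcm(k1, sig1, k2, sig2):
    """Delta-rho = (1/k1 - 1/k2) * 1e5 pcm, with propagated 1-sigma uncertainty."""
    drho = (1.0 / k1 - 1.0 / k2) * 1e5
    sigma = math.hypot(sig1 / k1**2, sig2 / k2**2) * 1e5
    return drho, sigma

drho, sigma = delta_rho_pcm(1.00000, 0.00012, 1.00800, 0.00012)
print(f"reactivity difference = {drho:.0f} +/- {sigma:.0f} pcm")
```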

  18. Combining independent component analysis and Granger causality to investigate brain network dynamics with fNIRS measurements

    PubMed Central

    Yuan, Zhen

    2013-01-01

    In this study a new strategy that combines Granger causality mapping (GCM) and independent component analysis (ICA) is proposed to reveal complex neural network dynamics underlying cognitive processes using functional near infrared spectroscopy (fNIRS) measurements. The GCM-ICA algorithm implements the following two procedures: (i) extraction of the regions of interest (ROIs) of cortical activations by ICA, and (ii) estimation of the direct causal influences in local brain networks using Granger causality among voxels of ROIs. Our results show that the use of GCM in conjunction with ICA is able to effectively identify directional brain network dynamics in the time-frequency domain based on fNIRS recordings. PMID:24298421
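
    The second step, Granger causality between ROI time courses, asks whether the past of one signal improves prediction of another beyond its own past; a compact sketch using an F-test on restricted versus full autoregressive fits (synthetic signals, not fNIRS data) follows.

```python
# Pairwise Granger causality test: does x help predict y beyond y's own past?
# Synthetic signals stand in for ICA-derived ROI time courses.
import numpy as np
from scipy import stats

def granger_f_test(x, y, lags=2):
    """F-test comparing AR(y | y-past) against AR(y | y-past, x-past)."""
    n = len(y)
    rows = range(lags, n)
    Y = y[lags:]
    X_r = np.array([y[t - lags:t][::-1] for t in rows])                   # y lags only
    X_f = np.array([np.concatenate((y[t - lags:t][::-1],
                                    x[t - lags:t][::-1])) for t in rows])  # + x lags
    X_r = np.column_stack([np.ones(len(Y)), X_r])
    X_f = np.column_stack([np.ones(len(Y)), X_f])
    rss_r = np.sum((Y - X_r @ np.linalg.lstsq(X_r, Y, rcond=None)[0]) ** 2)
    rss_f = np.sum((Y - X_f @ np.linalg.lstsq(X_f, Y, rcond=None)[0]) ** 2)
    df1, df2 = lags, len(Y) - X_f.shape[1]
    f = ((rss_r - rss_f) / df1) / (rss_f / df2)
    return f, stats.f.sf(f, df1, df2)

rng = np.random.default_rng(0)
x = rng.normal(size=600)
y = np.zeros(600)
for t in range(2, 600):                 # y is driven by past x, so x -> y
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal(scale=0.5)

print("x -> y: F = %.1f, p = %.2e" % granger_f_test(x, y))
print("y -> x: F = %.1f, p = %.2e" % granger_f_test(y, x))
```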

  19. Utilization of a Photon Transport Code to Investigate Radiation Therapy Treatment Planning Quantities and Techniques.

    NASA Astrophysics Data System (ADS)

    Palta, Jatinder Raj

    A versatile computer program, MORSE, based on neutron and photon transport theory, has been utilized to investigate radiation therapy treatment planning quantities and techniques. A multi-energy group representation of the transport equation provides a concise approach to utilizing Monte Carlo numerical techniques for multiple radiation therapy treatment planning problems. A general three-dimensional geometry is used to simulate radiation therapy treatment planning problems in configurations of an actual clinical setting. Central axis total and scattered dose distributions for homogeneous and inhomogeneous water phantoms are calculated, and the correction factors for lung and bone inhomogeneities are also evaluated. Results show that Monte Carlo calculations based on multi-energy group transport theory predict depth dose distributions that are in good agreement with available experimental data. Improved correction factors based on the concepts of lung-air-ratio and bone-air-ratio are proposed in lieu of the presently used correction factors that are based on the tissue-air-ratio power law method for inhomogeneity corrections. Central axis depth dose distributions for a bremsstrahlung spectrum from a linear accelerator are also calculated to exhibit the versatility of the computer program in handling multiple radiation therapy problems. A novel approach is undertaken to study the dosimetric properties of brachytherapy sources. Dose rate constants for various radionuclides are calculated from the numerically generated dose rate versus source energy curves. Dose rates can also be generated for any point brachytherapy source with any arbitrary energy spectrum at various radial distances from this family of curves.

  20. Manipulation of independent synthesis and degradation of polyphosphate in Escherichia coli for investigation of phosphate secretion from the cell.

    PubMed Central

    Van Dien, S J; Keyhani, S; Yang, C; Keasling, J D

    1997-01-01

    The genes involved in polyphosphate metabolism in Escherichia coli were cloned behind different inducible promoters on separate plasmids. The gene coding for polyphosphate kinase (PPK), the enzyme responsible for polyphosphate synthesis, was placed behind the Ptac promoter. Polyphosphatase, a polyphosphate depolymerase, was similarly expressed by using the arabinose-inducible PBAD promoter. The ability of cells containing these constructs to produce active enzymes only when induced was confirmed by polyphosphate extraction, enzyme assays, and RNA analysis. The inducer concentrations giving optimal expression of each enzyme were determined. Experiments were performed in which ppk was induced early in growth, overproducing PPK and allowing large amounts of polyphosphate to accumulate (80 μmol in phosphate monomer units per g of dry cell weight). The ppx gene was subsequently induced, and polyphosphate was degraded to inorganic phosphate. Approximately half of this polyphosphate was depleted in 210 min. The phosphate released from polyphosphate allowed the growth of phosphate-starved cells and was secreted into the medium, leading to a down-regulation of the phosphate-starvation response. In addition, the steady-state polyphosphate level was precisely controlled by manipulating the degree of ppx induction. The polyphosphate content varied from 98 to 12 μmol in phosphate monomer units per g of dry cell weight as the arabinose concentration was increased from 0 to 0.02% by weight. PMID:9143103

  1. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  2. Performance Investigation of Soft-Decodable Runlength-Limited Codes With Different Minimum Runlength Constraints in High-Density Optical Recording

    Microsoft Academic Search

    Haibin Zhang; Andries P. Hekstra; Wim M. J. Coene; Bin Yin

    2007-01-01

    This paper investigates the performance of runlength-limited (RLL) codes with different d constraints in high-density optical recording, for both the low-density parity-check (LDPC)-coded Bliss scheme and the standard concatenation scheme. For the d = 5 constraint, we propose a low-complexity compression/decompression scenario that maps each 6-bit channel codeword to a 3-bit

  3. Why comply with a code of ethics?

    PubMed

    Spielthenner, Georg

    2015-05-01

    A growing number of professional associations and occupational groups are creating codes of ethics with the goal of guiding their members, protecting service users, and safeguarding the reputation of the profession. There is a great deal of literature dealing with the question to what extent ethical codes can achieve their desired objectives. The present paper does not contribute to this debate. Its aim is rather to investigate how rational it is to comply with codes of conduct. It is natural and virtually inevitable for a reflective person to ask why one should pay any attention to ethical codes, in particular if following a code is not in one's own interest. In order to achieve the aim of this paper, I shall (in the "Quasi-reasons for complying with an ethical code" section) discuss reasons that only appear to be reasons for complying with a code. In the "Code-independent reasons" section, I shall present genuine practical reasons that, however, turn out to be reasons of the wrong kind. The "Code-dependent reasons" section finally presents the most important reasons for complying with ethical codes. The paper argues that while ethical codes do not necessarily yield reasons for action, professionals can have genuine reasons for complying with a code, which may, however, be rather weak and easily overridden by reasons for deviating from the code. PMID:25185873

  4. Independent Technical Investigation of the Puna Geothermal Venture Unplanned Steam Release, June 12 and 13, 1991, Puna, Hawaii

    SciTech Connect

    Thomas, Richard; Whiting, Dick; Moore, James; Milner, Duey

    1991-07-01

    On June 24, 1991, a third-party investigation team consisting of Richard P. Thomas, Duey E. Milner, James L. Moore, and Dick Whiting began an investigation into the blowout of well KS-8, which occurred at the Puna Geothermal Venture (PGV) site on June 12, 1991, and caused the unabated release of steam for a period of 31 hours before PGV succeeded in closing in the well. The scope of the investigation was to: (a) determine the cause(s) of the incident; (b) evaluate the adequacy of PGV's drilling and blowout prevention equipment and procedures; and (c) make recommendations for any appropriate changes in equipment and/or procedures. This report finds that the blowout occurred because of inadequacies in PGV's drilling plan and procedures and not as a result of unusual or unmanageable subsurface geologic or hydrologic conditions. While the geothermal resource in the area being drilled is relatively hot, the temperatures are not excessive for modern technology and methods to control. Fluid pressures encountered are also manageable if proper procedures are followed and the appropriate equipment is utilized. A previous blowout of short duration occurred on February 21, 1991, at the KS-7 injection well being drilled by PGV at a depth of approximately 1600'. This unexpected incident alerted PGV to the possibility of encountering a high temperature, fractured zone at a relatively shallow depth. The experience at KS-7 prompted PGV to refine its hydrological model; however, the drilling plan utilized for KS-8 was not changed. Not only did PGV fail to modify its drilling program following the KS-7 blowout, but they also failed to heed numerous "red flags" (warning signals) in the five days preceding the KS-8 blowout, which included a continuous 1-inch flow of drilling mud out of the wellbore, gains in mud volume while pulling stands, and gas entries while circulating mud bottoms up, in addition to lost circulation that had occurred earlier below the shoe of the 13-3/8-inch casing.

  5. Independent assessment of TRAC-PF1 (Version 7. 0), RELAP5/MOD1 (Cycle 14), and TRAC-BD1 (Version 12. 0) codes using separate-effects experiments

    SciTech Connect

    Saha, P; Jo, J H; Neymotin, L; Rohatgi, U S; Slovik, G C; Yuelys-Miksis, C

    1985-08-01

    This report presents the results of independent code assessment conducted at BNL. The TRAC-PF1 (Version 7.0) and RELAP5/MOD1 (Cycle 14) codes were assessed using the critical flow tests, level swell test, countercurrent flow limitation (CCFL) tests, post-CHF test, steam generator thermal performance tests, and natural circulation tests. TRAC-BD1 (Version 12.0) was applied only to the CCFL and post-CHF tests. The TRAC-PWR series of codes, i.e., TRAC-P1A, TRAC-PD2, and TRAC-PF1, has been gradually improved. However, TRAC-PF1 appears to need improvement in almost all categories of tests/phenomena attempted at BNL. Of the two codes, TRAC-PF1 and RELAP5/MOD1, the latter needs more improvement, particularly in the areas of CCFL, level swell, CHF correlation and post-CHF heat transfer, and numerical stability. For the CCFL and post-CHF tests, TRAC-BD1 provides the best overall results. However, the TRAC-BD1 interfacial shear package for the countercurrent annular flow regime needs further improvement for better prediction of the CCFL phenomenon. 47 refs., 87 figs., 15 tabs.

  6. WHPA Code available

    NASA Astrophysics Data System (ADS)

    The Wellhead Protection Area code is now available for distribution by the International Ground Water Modeling Center in Indianapolis, Ind. The WHPA code is a modular, semianalytical, groundwater flow model developed for the U.S. Environmental Protection Agency, Office of Ground Water Protection, designed to assist state and local technical staff with the task of Wellhead Protection Area (WHPA) delineation. A complete news item appeared in Eos, May 1, 1990, p. 690. The model consists of four independent, semianalytical modules that may be used to identify the areal extent of groundwater contribution to one or multiple pumping wells. One module is a general particle tracking program that may be used as a post-processor for two-dimensional, numerical models of groundwater flow. One module incorporates a Monte Carlo approach to investigate the effects of uncertain input parameters on capture zones. Multiple pumping and injection wells may be present and barrier or stream boundary conditions may be investigated.
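
    The particle-tracking module delineates capture zones by integrating pathlines through a superposed velocity field (regional flow plus wells); a minimal backward-tracking sketch for a single well in uniform regional flow is shown below (made-up parameters; this is a conceptual illustration, not the WHPA code itself).

```python
# Minimal 2-D particle-tracking sketch: one pumping well in a uniform
# regional flow, tracked backward in time from the well to outline a
# time-related capture zone.  All parameter values are invented; this is
# a conceptual illustration, not the WHPA code itself.
import numpy as np

Q = 500.0        # pumping rate, m^3/day
B = 20.0         # aquifer thickness, m
POROSITY = 0.25  # effective porosity
V0 = 0.5         # regional seepage velocity in +x, m/day
DT = 1.0         # time step, days
T_END = 365.0    # tracking time, days

def seepage_velocity(pos):
    """Uniform regional flow plus radial flow toward the well at the origin."""
    r2 = max(float(pos[0] ** 2 + pos[1] ** 2), 1e-6)
    v_well = -Q / (2.0 * np.pi * B * POROSITY) * pos / r2
    return np.array([V0, 0.0]) + v_well

endpoints = []
release_angles = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
for ang in release_angles:
    pos = np.array([np.cos(ang), np.sin(ang)])       # start 1 m from the well
    for _ in range(int(T_END / DT)):
        pos = pos - DT * seepage_velocity(pos)        # minus sign: backward in time
    endpoints.append(pos)

for ang, pos in zip(np.degrees(release_angles), endpoints):
    print(f"released at {ang:5.1f} deg -> 1-year point ({pos[0]:8.1f}, {pos[1]:7.1f}) m")
```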

  7. WHPA Code available

    NASA Astrophysics Data System (ADS)

    The Wellhead Protection Area (WHPA) code is now available for distribution by the International Ground Water Modeling Center in Indianapolis, Ind. The WHPA code is a modular, semi-analytical, groundwater flow model developed for the U.S. Environmental Protection Agency, Office of Ground Water Protection. It is designed to assist state and local technical staff with the task of WHPA delineation.The model consists of four independent, semi-analytical modules that may be used to identify the areal extent of groundwater contribution to one or multiple pumping wells. One module is a general particle tracking program that may be used as a post-processor for two-dimensional, numerical models of groundwater flow. One module incorporates a Monte Carlo approach to investigate the effects of uncertain input parameters on capture zones. Multiple pumping and injection wells may be present and barrier or stream boundary conditions may be investigated.

  8. Code-Switching in Japanese Language Classrooms: An Exploratory Investigation of Native vs. Non-Native Speaker Teacher Practice

    ERIC Educational Resources Information Center

    Hobbs, Valerie; Matsuo, Ayumi; Payne, Mark

    2010-01-01

    Research on language classroom code-switching ranges from describing both teachers' and learners' first language and target language use to making connections between code-switching and student learning. However, few studies compare differences in practice between native and non-native speaker teachers and even fewer consider culture of learning…

  9. Evaluation of the rodent Hershberger bioassay: testing of coded chemicals and supplementary molecular-biological and biochemical investigations.

    PubMed

    Freyberger, A; Ellinger-Ziegelbauer, H; Krötlinger, F

    2007-09-24

    Under the auspices of the Organization for Economic Cooperation and Development (OECD) the Hershberger assay is being validated as an in vivo screen for compounds with (anti)androgenic potential. We participated in the final activity, the testing of coded chemicals. Test compounds included trenbolone (TREN; 1.5, 40 mg/kg), testosterone propionate (TP; 0.4 mg/kg), flutamide (FLUT; 3mg/kg), linuron (LIN; 10, 100mg/kg), 1,1-bis-(4-chlorophenyl)-2,2-dichloroethylene (p,p'-DDE; 16, 160 mg/kg), and two negative reference substances, i.e., compounds not considered to affect androgen-sensitive tissue weights (ASTWs) in the Hershberger assay, namely 4-nonylphenol (NP; 160 mg/kg) and 2,4-dinitrophenol (DNP; 10mg/kg); TREN, LIN, p,p'-DDE, NP, and DNP being used under code. Compounds were administered for 10 days by oral intubation or subcutaneous injection (TP). Additional investigations not mandatorily requested by OECD included organ gravimetry of the liver, gene expression analysis in prostate using quantitative RT PCR for prostate specific binding protein polypeptide C3 (PBPC3) and ornithine decarboxylase 1 (ODC1) and determination of testosterone metabolizing and phase II conjugating enzymes in the liver. After submission of all study reports to OECD by participants uncoding revealed the following results: (A) When assessing androgenic potential in castrated rats, administration of TREN increased the weights of ventral prostate (VP), seminal vesicles (SV), glans penis, levator ani and bulbocavernosus muscles, and Cowper's glands at the high dose. A similar or stronger (VP, SV) increase of ASTWs was observed for TP; NP and DNP were ineffective. TREN dose-dependently increased gene expression of ODC1 and PBPC3, TP induced expression of these genes even more strongly (almost) to the level of untreated intact animals, whereas NP and DNP were inactive. Liver enzyme activities depending on physiological androgen levels were lower in castrated than in intact rats and could not be restored by androgen treatment. (B) When assessing antiandrogenic potential in TP-supplemented castrated rats, administration of LIN and p,p'-DDE decreased ASTWs only at the high dose. FLUT even more effectively decreased ASTWs, NP and DNP were again without effect. Decreases in androgen-responsive gene expression in the prostate corresponding to the organ weight changes were only observed for p,p'-DDE (high dose) and flutamide (PBPC3 only). p,p'-DDE dose-dependently induced liver weights and most liver enzyme activities including androgen-dependent ones. Our study accurately reproduced ASTW changes obtained in previous studies also under code suggesting that the Hershberger assay is a robust tool to screen for an (anti)androgenic potential. Assessment of ODC1 and PBPC3 gene expression in prostate, however, may only represent a sensitive tool for the detection of an androgenic potential. Finally, p,p'-DDE may affect ASTWs by several mechanisms including enhanced testosterone metabolism. PMID:17688994

  10. Edge Detection Using Sparse Coding Method

    Microsoft Academic Search

    Yang Yan; Kang Gewen; Li Hong

    2009-01-01

    Sparse coding is a method for finding a neural network representation of multidimensional data in which each of the components of the representation is only rarely significantly active at the same time. The representation is closely related to Independent Component Analysis (ICA). In this paper, we introduce the basic principle of ICA and investigate the capabilities of ICA in the

  11. Polyphasic Study of the Spatial Distribution of Microorganisms in Mexican Pozol, a Fermented Maize Dough, Demonstrates the Need for Cultivation-Independent Methods To Investigate Traditional Fermentations

    PubMed Central

    Ampe, Frédéric; ben Omar, Nabil; Moizan, Claire; Wacher, Carmen; Guyot, Jean-Pierre

    1999-01-01

    The distribution of microorganisms in pozol balls, a fermented maize dough, was investigated by a polyphasic approach in which we used both culture-dependent and culture-independent methods, including microbial enumeration, fermentation product analysis, quantification of microbial taxa with 16S rRNA-targeted oligonucleotide probes, determination of microbial fingerprints by denaturing gradient gel electrophoresis (DGGE), and 16S ribosomal DNA gene sequencing. Our results demonstrate that DGGE fingerprinting and rRNA quantification should allow workers to precisely and rapidly characterize the microbial assemblage in a spontaneous lactic acid fermented food. Lactic acid bacteria (LAB) accounted for 90 to 97% of the total active microflora; no streptococci were isolated, although members of the genus Streptococcus accounted for 25 to 50% of the microflora. Lactobacillus plantarum and Lactobacillus fermentum, together with members of the genera Leuconostoc and Weissella, were the other dominant organisms. The overall activity was more important at the periphery of a ball, where eucaryotes, enterobacteria, and bacterial exopolysaccharide producers developed. Our results also showed that the metabolism of heterofermentative LAB was influenced in situ by the distribution of the LAB in the pozol ball, whereas homolactic fermentation was controlled primarily by sugar limitation. We propose that starch is first degraded by amylases from LAB and that the resulting sugars, together with the lactate produced, allow a secondary flora to develop in the presence of oxygen. Our results strongly suggest that cultivation-independent methods should be used to study traditional fermented foods. PMID:10584005

  12. Environmental health and safety independent investigation of the in situ vitrification melt expulsion at the Oak Ridge National Laboratory, Oak Ridge, Tennessee

    SciTech Connect

    NONE

    1996-07-01

    At about 6:12 pm, EDT on April 21, 1996, steam and molten material were expelled from the Pit 1 in situ vitrification (ISV) project at the Oak Ridge National Laboratory (ORNL). At the request of the director of the Environmental Restoration (ER) Division, Department of Energy Oak Ridge Operations (DOE ORO), an independent investigation team was established on April 26, 1996. This team was tasked to determine the facts related to the ORNL Pit 1 melt expulsion event (MEE) in the areas of environmental safety and health, such as the adequacy of the ISV safety systems; operational control restrictions; emergency response planning/execution; and readiness review, and report the investigation team findings within 45 days from the date of incident. These requirements were stated in the letter of appointment presented in Appendix A of this report. This investigation did not address the physical causes of the MEE. A separate investigation was conducted by ISV project personnel to determine the causes of the melt expulsion and the extent of the effects of this phenomenon. In response to this event, occurrence report ORO-LMES-X10ENVRES-1996-0006 (Appendix B) was filed. The investigation team did not address the occurrence reporting or event notification process. The project personnel (project team) examined the physical evidence at the Pit 1 ISV site (e.g., the ejected melt material and the ISV hood), reviewed documents such as the site-specific health and safety plan (HASP), and interviewed personnel involved in the event and/or the project. A listing of the personnel interviewed and evidence reviewed is provided in Appendix C.

  13. Further results on product codes

    Microsoft Academic Search

    W. Gore

    1970-01-01

    A new class of codes, called product generator codes, which are similar to Elias's iterated codes, is investigated. An important subclass of these codes is the generalized Reed-Muller codes. If the original codes that are iterated to produce a product code are L_1- and L_2-step orthogonalizable, then the product code is (L_1 + L_2 - 1)-step orthogonalizable. Further, if a νth-order product generator code

  14. Quantum convolutional stabilizer codes

    E-print Network

    Chinthamani, Neelima

    2004-09-30

    of the state of the other blocks. The code is decoded by first measuring the error syndromes of the encoded blocks, and then applying a necessary unitary transformation to the corresponding erroneous qubits. The unitary transformations are independent... of the ones applied to the other blocks. Since there is only a finite number of error syndromes and hence only a finite number of recovery operations per block, decoding a quantum block code requires only a finite number of operations. Besides block codes...

  15. Genetic Investigation of MHC-Independent Missing-Self Recognition by Mouse NK Cells Using an In Vivo Bone Marrow Transplantation Model.

    PubMed

    Chen, Peter; Aguilar, Oscar A; Rahim, Mir Munir A; Allan, David S J; Fine, Jason H; Kirkham, Christina L; Ma, Jaehun; Tanaka, Miho; Tu, Megan M; Wight, Andrew; Kartsogiannis, Vicky; Gillespie, Matthew T; Makrigiannis, Andrew P; Carlyle, James R

    2015-03-15

    MHC-I-specific receptors play a vital role in NK cell-mediated "missing-self" recognition, which contributes to NK cell activation. In contrast, MHC-independent NK recognition mechanisms are less well characterized. In this study, we investigated the role of NKR-P1B:Clr-b (Klrb1:Clec2d) interactions in determining the outcome of murine hematopoietic cell transplantation in vivo. Using a competitive transplant assay, we show that Clr-b(-/-) bone marrow (BM) cells were selectively rejected by wild-type B6 recipients, to a similar extent as H-2D(b-/-) MHC-I-deficient BM cells. Selective rejection of Clr-b(-/-) BM cells was mitigated by NK depletion of recipient mice. Competitive rejection of Clr-b(-/-) BM cells also occurred in allogeneic transplant recipients, where it was reversed by selective depletion of NKR-P1B(hi) NK cells, leaving the remaining NKR-P1B(lo) NK subset and MHC-I-dependent missing-self recognition intact. Moreover, competitive rejection of Clr-b(-/-) hematopoietic cells was abrogated in Nkrp1b-deficient recipients, which lack the receptor for Clr-b. Of interest, similar to MHC-I-deficient NK cells, Clr-b(-/-) NK cells were hyporesponsive to both NK1.1 (NKR-P1C)-stimulated and IL-12/18 cytokine-primed IFN-γ production. These findings support a unique and nonredundant role for NKR-P1B:Clr-b interactions in missing-self recognition of normal hematopoietic cells and suggest that optimal BM transplant success relies on MHC-independent tolerance mechanisms. These findings provide a model for human NKR-P1A:LLT1 (KLRB1:CLEC2D) interactions in human hematopoietic cell transplants. PMID:25681346

  16. Independent Crossings and Independent Sets

    Microsoft Academic Search

    Paul Wenger

    A cluster in a drawing of a graph in the plane is the set of four endpoints of the two edges involved in a crossing. An independent drawing is a drawing in which the clusters are pairwise disjoint. Albertson (1) asked for the maximum k such that every independent drawing with k crossings has an independent set consisting of one

  17. Validation and application of the WABE code: Investigations of constitutive laws and 2D effects on debris coolability

    Microsoft Academic Search

    Manfred Bürger; Michael Buck; Werner Schmidt; Walter Widmann

    2006-01-01

    The WABE-2D model aims at the problem of coolability of degraded core material during a severe accident in a light water reactor (LWR) and describes the transient boil-off and quenching behavior of debris beds. It is being developed in the framework of the KESS code system, which is considered to describe the processes of core heatup, melting, degradation and relocation

  18. Independence of Internal Auditors.

    ERIC Educational Resources Information Center

    Montondon, Lucille; Meixner, Wilda F.

    1993-01-01

    A survey of 288 college and university auditors investigated patterns in their appointment, reporting, and supervisory practices as indicators of independence and objectivity. Results indicate a weakness in the positioning of internal auditing within institutions, possibly compromising auditor independence. Because the auditing function is…

  19. Evaluation of the rodent Hershberger bioassay on intact juvenile males—Testing of coded chemicals and supplementary biochemical investigations

    Microsoft Academic Search

    A. Freyberger; L. Schladt

    2009-01-01

    Under the auspices of the Organization for Economic Cooperation and Development (OECD) the Hershberger assay on juvenile intact male rats is being validated as a screen for compounds with anti-androgenic potential. We participated in the testing of coded chemicals. Compounds included the positive control flutamide (FLUT, 3 mg/kg), linuron (LIN, 10, 100 mg/kg), p,p'-DDE (16, 160 mg/kg), and two negative substances, 4-nonylphenol (NP,

  20. Investigating the influence of magnetic fields upon structure formation with AMIGA - a C code for cosmological magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Doumler, Timur; Knebe, Alexander

    2010-03-01

    Despite greatly improved observational methods, the presence of magnetic fields at cosmological scales and their role in the process of large-scale structure formation still remain unclear. In this paper, we address the question of how the presence of a hypothetical primordial magnetic field on large scales influences cosmic structure formation in numerical simulations. As a tool for carrying out such simulations, we present our new numerical code AMIGA. It combines an N-body code with an Eulerian grid-based solver for the full set of magnetohydrodynamics (MHD) equations in order to conduct simulations of dark matter, baryons and magnetic fields in a self-consistent way in a fully cosmological setting. Our numerical scheme includes effective methods to ensure proper capturing of shocks and highly supersonic flows and a divergence-free magnetic field. The high accuracy of the code is demonstrated by a number of numerical tests. We then present a series of cosmological MHD simulations and confirm that, in order to have a significant effect on the distribution of matter on large scales, the primordial magnetic field strength would have to be significantly higher than the current observational and theoretical constraints.
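
    As a hedged illustration of one requirement mentioned above (AMIGA itself is a C code; this is a generic numpy sketch, not code from the paper): a grid-based MHD scheme must keep the discrete divergence of the magnetic field at the level of rounding error. The snippet checks that property for a toy divergence-free field on a periodic grid.

        import numpy as np

        def divergence(Bx, By, Bz, dx):
            """Central-difference divergence of a cell-centred field on a periodic grid."""
            div  = (np.roll(Bx, -1, axis=0) - np.roll(Bx, 1, axis=0)) / (2 * dx)
            div += (np.roll(By, -1, axis=1) - np.roll(By, 1, axis=1)) / (2 * dx)
            div += (np.roll(Bz, -1, axis=2) - np.roll(Bz, 1, axis=2)) / (2 * dx)
            return div

        # Toy analytically divergence-free field B = (sin y, sin z, sin x) on a periodic box.
        n, L = 32, 2 * np.pi
        x = np.linspace(0, L, n, endpoint=False)
        X, Y, Z = np.meshgrid(x, x, x, indexing="ij")
        Bx, By, Bz = np.sin(Y), np.sin(Z), np.sin(X)

        div_B = divergence(Bx, By, Bz, dx=L / n)
        print("max |div B| =", np.abs(div_B).max())   # should vanish to rounding error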

  1. Individualized Independent

    E-print Network

    Dennehy, John

    African American Literature City ENGL 310 Independent Study (Apollo Curriculum Project) City ENGL 312 AFST 205 African American Politics of Social Change Her Words: My time at CUNY BA was invaluable. Being Individualized Studies for Independent Minds His Faculty Mentor: Prof. Sherrie Baver, Political

  2. Evaluation of the rodent Hershberger bioassay: Testing of coded chemicals and supplementary molecular-biological and biochemical investigations

    Microsoft Academic Search

    A. Freyberger; H. Ellinger-Ziegelbauer; F. Krötlinger

    2007-01-01

    Under the auspices of the Organization for Economic Cooperation and Development (OECD) the Hershberger assay is being validated as an in vivo screen for compounds with (anti)androgenic potential. We participated in the final activity, the testing of coded chemicals. Test compounds included trenbolone (TREN; 1.5, 40 mg/kg), testosterone propionate (TP; 0.4 mg/kg), flutamide (FLUT; 3 mg/kg), linuron (LIN; 10, 100 mg/kg), 1,1-bis-(4-chlorophenyl)-2,2-dichloroethylene (p,p'-DDE; 16,

  3. DNA codes

    SciTech Connect

    Torney, D. C. (David C.)

    2001-01-01

    We have begun to characterize a variety of codes, motivated by potential implementation as (quaternary) DNA n-sequences, with letters denoted A, C, G, and T. The first codes we studied are the most reminiscent of conventional group codes. For these codes, Hamming similarity was generalized so that the score for matched letters takes more than one value, depending upon which letters are matched [2]. These codes consist of n-sequences satisfying an upper bound on the similarities, summed over the letter positions, of distinct codewords. We chose similarity 2 for matches of letters A and T and 3 for matches of the letters C and G, providing a rough approximation to double-strand bond energies in DNA. An inherent novelty of DNA codes is 'reverse complementation'. The latter may be defined, as follows, not only for alphabets of size four, but, more generally, for any even-size alphabet. All that is required is a matching of the letters of the alphabet: a partition into pairs. Then, the reverse complement of a codeword is obtained by reversing the order of its letters and replacing each letter by its match. For DNA, the matching is AT/CG because these are the Watson-Crick bonding pairs. Reversal arises because two DNA sequences form a double strand with opposite relative orientations. Thus, as will be described in detail, because in vitro decoding involves the formation of double-stranded DNA from two codewords, it is reasonable to assume - for universal applicability - that the reverse complement of any codeword is also a codeword. In particular, self-reverse complementary codewords are expressly forbidden in reverse-complement codes. Thus, an appropriate distance between all pairs of codewords must, when large, effectively prohibit binding between the respective codewords: to form a double strand. Only reverse-complement pairs of codewords should be able to bind. For most applications, a DNA code is to be bi-partitioned, such that the reverse-complementary pairs are separated across the two blocks. For the foregoing reasons, these two blocks of codewords suffice as the hooks and loops of a digital Velcro. We began our investigations of such codes by constructing quaternary BCH reverse-complement codes, using cyclic codes and conventional Hamming distance [4]. We also obtained upper and lower bounds on the rate of reverse-complement codes with a metric function based on the foregoing similarities [3]. For most applications involving DNA, however, the reverse-complementary analogue of codes based on the insertion-deletion distance is more advantageous. This distance equals the codeword length minus the longest length of a common (not necessarily contiguous) subsequence. (The 'aligned' codes described above may be used under special experimental conditions.) The advantage arises because, under the assumption that DNA is very flexible, the sharing of sufficiently long subsequences between codewords would be tantamount to the ability of one of their reverse complements to form a double strand with the other codeword. Thus far, using the random coding method, we have derived an asymptotic lower bound on the rate of reverse-complement insertion-deletion codes, as a function of the insertion-deletion distance fraction and of the alphabet size [1]. For the quaternary DNA alphabet of primary importance, this lower bound yields an asymptotically positive rate if the insertion-deletion-distance fraction does not exceed the threshold of approximately 0.19.
Extensions of the Varshamov-Tenengol'ts construction of insertion-deletion codes [5] for reverse-complement insertion-deletion codes will be described. Experiments have been performed involving some of our DNA codes.
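
    The two ingredients the abstract singles out, reverse complementation and the insertion-deletion distance (codeword length minus the length of the longest common subsequence), are easy to state in code. A generic sketch, not taken from the report (Python; equal-length codewords assumed):

        from functools import lru_cache

        COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}   # Watson-Crick pairs

        def reverse_complement(word: str) -> str:
            """Reverse the word and replace each letter by its Watson-Crick match."""
            return "".join(COMPLEMENT[c] for c in reversed(word))

        def lcs_length(u: str, v: str) -> int:
            """Length of the longest (not necessarily contiguous) common subsequence."""
            @lru_cache(maxsize=None)
            def rec(i: int, j: int) -> int:
                if i == len(u) or j == len(v):
                    return 0
                if u[i] == v[j]:
                    return 1 + rec(i + 1, j + 1)
                return max(rec(i + 1, j), rec(i, j + 1))
            return rec(0, 0)

        def insertion_deletion_distance(u: str, v: str) -> int:
            """Codeword length minus the longest common subsequence."""
            return len(u) - lcs_length(u, v)

        w = "ACCGTT"
        print(reverse_complement(w))                      # AACGGT
        print(insertion_deletion_distance(w, "ACGGTA"))   # 2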

  4. Parallelization of the SIR code for the investigation of small-scale features in the solar photosphere

    E-print Network

    Thonhofer, Stefan; Utz, Dominik; Hanslmeier, Arnold; Jurčák, Jan

    2015-01-01

    Magnetic fields are one of the most important drivers of the highly dynamic processes that occur in the lower solar atmosphere. They span a broad range of sizes, from large- and intermediate-scale structures such as sunspots, pores and magnetic knots, down to the smallest magnetic elements observable with current telescopes. On small scales, magnetic flux tubes are often visible as Magnetic Bright Points (MBPs). Apart from simple $V/I$ magnetograms, the most common method to deduce their magnetic properties is the inversion of spectropolarimetric data. Here we employ the SIR code for that purpose. SIR is a well-established tool that can derive not only the magnetic field vector and other atmospheric parameters (e.g., temperature, line-of-sight velocity), but also their stratifications with height, effectively producing 3-dimensional models of the lower solar atmosphere. In order to enhance the runtime performance and the usability of SIR we parallelized the existing code and standardized the input and output ...

  5. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings and, since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage of the speech signal getting corrupted by noise, cross-talk and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. On the other hand, digital transmission is relatively immune to noise, cross-talk and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service provision point of view as well. Modern requirements have introduced the need for robust, flexible and secure services that can carry a multitude of signal types (such as voice, data and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding usually refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is applicable to these techniques, and that is often used interchangeably with speech coding, is voice coding. This term is more generic in the sense that the coding techniques are equally applicable to any voice signal, whether or not it carries any intelligible information, as the term speech implies. Other terms that are commonly used are speech compression and voice compression, since the fundamental idea behind speech coding is to reduce (compress) the transmission rate (or, equivalently, the bandwidth) and/or reduce storage requirements. In this document the terms speech and voice shall be used interchangeably.
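
    One of the simplest concrete instances of the waveform coding described above is logarithmic companding, as used in 8-bit PCM telephony. A hedged sketch of mu-law compression, 8-bit quantization, and expansion (textbook formulas, not material from this record; Python):

        import numpy as np

        MU = 255.0  # mu-law parameter of 8-bit PCM telephony

        def mu_law_compress(x):
            """Map samples in [-1, 1] onto a logarithmically companded scale."""
            return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

        def mu_law_expand(y):
            """Inverse of mu_law_compress."""
            return np.sign(y) * np.expm1(np.abs(y) * np.log1p(MU)) / MU

        # Quantize the companded signal to 8 bits and measure the reconstruction error.
        rng = np.random.default_rng(0)
        x = np.clip(rng.laplace(scale=0.1, size=10_000), -1.0, 1.0)  # speech-like amplitudes
        levels = 256
        codes = np.round((mu_law_compress(x) + 1) / 2 * (levels - 1))  # transmitted 8-bit words
        x_hat = mu_law_expand(codes / (levels - 1) * 2 - 1)            # decoder side
        snr_db = 10 * np.log10(np.mean(x ** 2) / np.mean((x - x_hat) ** 2))
        print(f"reconstruction SNR: {snr_db:.1f} dB")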

  6. Unfolding the color code

    E-print Network

    Aleksander Kubica; Beni Yoshida; Fernando Pastawski

    2015-03-06

    The topological color code and the toric code are two leading candidates for realizing fault-tolerant quantum computation. Here we show that the color code on a $d$-dimensional closed manifold is equivalent to multiple decoupled copies of the $d$-dimensional toric code up to local unitary transformations and adding or removing ancilla qubits. Our result not only generalizes the proven equivalence for $d=2$, but also provides an explicit recipe of how to decouple independent components of the color code, highlighting the importance of colorability in the construction of the code. Moreover, for the $d$-dimensional color code with $d+1$ boundaries of $d+1$ distinct colors, we find that the code is equivalent to multiple copies of the $d$-dimensional toric code which are attached along a $(d-1)$-dimensional boundary. In particular, for $d=2$, we show that the (triangular) color code with boundaries is equivalent to the (folded) toric code with boundaries. We also find that the $d$-dimensional toric code admits logical non-Pauli gates from the $d$-th level of the Clifford hierarchy, and thus saturates the bound by Bravyi and König. In particular, we show that the $d$-qubit control-$Z$ logical gate can be fault-tolerantly implemented on the stack of $d$ copies of the toric code by a local unitary transformation.

  7. Investigations into Resting-state Connectivity using Independent Component Analysis FMRIB Technical Report TR05CB1 (A related paper has been accepted for publication in Philosophical Transactions of the Royal Society, Special Issue on 'Multimodal neuroimaging of brain connectivity')

    Microsoft Academic Search

    Christian F. Beckmann; Marilena DeLuca; Joseph T. Devlin; Stephen M. Smith

    Inferring resting-state connectivity patterns from functional magnetic resonance imaging (FMRI) data is a challenging task for any analytical technique. In this paper we review a probabilistic independent component analysis (PICA) approach, optimised for the analysis of FMRI data (Beckmann and Smith, 2004), and discuss the role which this exploratory technique can take in scientific investigations into the structure of these

  8. A gene–environment investigation on personality traits in two independent clinical sets of adult patients with personality disorder and attention deficit\\/hyperactive disorder

    Microsoft Academic Search

    Christian P. Jacob; Thuy Trang Nguyen; Astrid Dempfle; Monika Heine; Christine Windemuth-Kieselbach; Katarina Baumann; Florian Jacob; Julian Prechtl; Maike Wittlich; Martin J. Herrmann; Silke Gross-Lesch; Klaus-Peter Lesch; Andreas Reif

    2010-01-01

    While an interactive effect of genes with adverse life events is increasingly appreciated in current concepts of depression etiology, no data are presently available on interactions between genetic and environmental (G × E) factors with respect to personality and related disorders. The present study therefore aimed to detect main effects as well as interactions of serotonergic candidate genes (coding for the serotonin

  9. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  10. MORSE Monte Carlo code

    SciTech Connect

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  11. Application of a multi-block CFD code to investigate the impact of geometry modeling on centrifugal compressor flow field predictions

    SciTech Connect

    Hathaway, M.D. [Vehicle Technology Center, Cleveland, OH (United States); Wood, J.R. [NASA Lewis Research Center, Cleveland, OH (United States)

    1997-10-01

    CFD codes capable of utilizing multi-block grids provide the capability to analyze the complete geometry of centrifugal compressors. Attendant with this increased capability is potentially increased grid setup time and more computational overhead, with the resultant increase in wall clock time to obtain a solution. If the increase in difficulty of obtaining a solution significantly improves the solution from that obtained by modeling the features of the tip clearance flow or the typical bluntness of a centrifugal compressor's trailing edge, then the additional burden is worthwhile. However, if the additional information obtained is of marginal use, then modeling of certain features of the geometry may provide reasonable solutions for designers to make comparative choices when pursuing a new design. In this spirit, a sequence of grids was generated to study the relative importance of modeling versus detailed gridding of the tip gap and blunt trailing edge regions of the NASA large low-speed centrifugal compressor, for which there is considerable detailed internal laser anemometry data available for comparison. The results indicate: (1) There is no significant difference in predicted tip clearance mass flow rate whether the tip gap is gridded or modeled. (2) Gridding rather than modeling the trailing edge results in better predictions of some flow details downstream of the impeller, but otherwise appears to offer no great benefits. (3) The pitchwise variation of absolute flow angle decreases rapidly up to 8% impeller radius ratio and much more slowly thereafter. Although some improvements in prediction of flow field details are realized as a result of analyzing the actual geometry, there is no clear consensus that any of the grids investigated produced superior results in every case when compared to the measurements. However, if a multi-block code is available, it should be used, as it has the propensity for enabling better predictions than a single block code.

  12. In search for the relevant parameters for speaker independent speech recognition

    Microsoft Academic Search

    Johan Smolders; Dirk Van Compernolle

    1993-01-01

    One of the problems with speaker-independent speech recognition is the huge amount of training data required, which implies a high cost. The performance of a discrete density hidden-Markov-model speaker-independent speech recognition system when using a small set of examples for training is investigated. By using LPC (linear predictive coding)-based analysis, an approximately 12% error rate was obtained on a highly
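
    As a hedged illustration of the LPC front end mentioned above (a generic textbook recursion, not the authors' system): linear-prediction coefficients for a frame can be obtained from its autocorrelation sequence with the Levinson-Durbin algorithm.

        import numpy as np

        def lpc(frame, order):
            """LPC coefficients via the autocorrelation method (Levinson-Durbin).

            Returns (a, err) with a[0] = 1; the predictor is x[n] ~ -sum_k a[k] x[n-k].
            """
            r = np.correlate(frame, frame, mode="full")[len(frame) - 1:][:order + 1]
            a = np.zeros(order + 1)
            a[0], err = 1.0, r[0]
            for i in range(1, order + 1):
                acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
                k = -acc / err
                a[1:i] = a[1:i] + k * a[i - 1:0:-1]
                a[i] = k
                err *= 1.0 - k * k
            return a, err

        # One 20 ms frame of a synthetic vowel-like signal sampled at 8 kHz.
        t = np.arange(160) / 8000.0
        frame = np.sin(2 * np.pi * 700 * t) + 0.5 * np.sin(2 * np.pi * 1200 * t)
        coeffs, err = lpc(frame * np.hamming(len(frame)), order=10)
        print(np.round(coeffs, 3))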

  13. System and method for investigating sub-surface features of a rock formation with acoustic sources generating coded signals

    SciTech Connect

    Vu, Cung Khac; Nihei, Kurt; Johnson, Paul A; Guyer, Robert; Ten Cate, James A; Le Bas, Pierre-Yves; Larmat, Carene S

    2014-12-30

    A system and a method for investigating rock formations includes generating, by a first acoustic source, a first acoustic signal comprising a first plurality of pulses, each pulse including a first modulated signal at a central frequency; and generating, by a second acoustic source, a second acoustic signal comprising a second plurality of pulses. A receiver arranged within the borehole receives a detected signal including a signal generated by a non-linear mixing process from the first and second acoustic signals in a non-linear mixing zone within the intersection volume. The method also includes processing the received signal to extract the signal generated by the non-linear mixing process over noise or over signals generated by a linear interaction process, or both.

  14. An extended version of the SERPENT-2 code to investigate fuel burn-up and core material evolution of the Molten Salt Fast Reactor

    NASA Astrophysics Data System (ADS)

    Aufiero, M.; Cammi, A.; Fiorina, C.; Leppänen, J.; Luzzi, L.; Ricotti, M. E.

    2013-10-01

    In this work, the Monte Carlo burn-up code SERPENT-2 has been extended and employed to study the material isotopic evolution of the Molten Salt Fast Reactor (MSFR). This promising GEN-IV nuclear reactor concept features peculiar characteristics such as the on-line fuel reprocessing, which prevents the use of commonly available burn-up codes. Besides, the presence of circulating nuclear fuel and radioactive streams from the core to the reprocessing plant requires a precise knowledge of the fuel isotopic composition during the plant operation. The developed extension of SERPENT-2 directly takes into account the effects of on-line fuel reprocessing on burn-up calculations and features a reactivity control algorithm. It is here assessed against a dedicated version of the deterministic ERANOS-based EQL3D procedure (PSI-Switzerland) and adopted to analyze the MSFR fuel salt isotopic evolution. Particular attention is devoted to study the effects of reprocessing time constants and efficiencies on the conversion ratio and the molar concentration of elements relevant for solubility issues (e.g., trivalent actinides and lanthanides). Quantities of interest for fuel handling and safety issues are investigated, including decay heat and activities of hazardous isotopes (neutron and high energy gamma emitters) in the core and in the reprocessing stream. The radiotoxicity generation is also analyzed for the MSFR nominal conditions. The production of helium and the depletion in tungsten content due to nuclear reactions are calculated for the nickel-based alloy selected as reactor structural material of the MSFR. These preliminary evaluations can be helpful in studying the radiation damage of both the primary salt container and the axial reflectors.
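
    The modelling point at the heart of the extension, that on-line reprocessing adds a continuous removal term to the depletion equations, can be shown with a toy single-nuclide balance (purely illustrative numbers, not MSFR data and not SERPENT-2 code): dN/dt = P - lambda*N - N/T_rep, where T_rep is the reprocessing time constant.

        import numpy as np

        production   = 1.0e16        # atoms/s produced in the core (illustrative)
        lambda_decay = 1.0e-6        # decay constant, 1/s (illustrative)
        T_rep        = 30 * 86400.0  # reprocessing time constant: 30 days, in seconds

        dt, t_end = 3600.0, 365 * 86400.0            # 1-hour steps over one year
        times = np.arange(0.0, t_end, dt)
        N = np.zeros_like(times)
        for i in range(1, len(times)):
            removal = lambda_decay * N[i - 1] + N[i - 1] / T_rep   # decay + on-line removal
            N[i] = N[i - 1] + dt * (production - removal)

        print(f"inventory after one year: {N[-1]:.3e} atoms")
        print(f"analytic equilibrium:     {production / (lambda_decay + 1 / T_rep):.3e} atoms")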

  15. Investigation of the effective connectivity of resting state networks in Alzheimer's disease: a functional MRI study combining independent components analysis and multivariate Granger causality analysis.

    PubMed

    Liu, Zhenyu; Zhang, Yumei; Bai, Lijun; Yan, Hao; Dai, Ruwei; Zhong, Chongguang; Wang, Hu; Wei, Wenjuan; Xue, Ting; Feng, Yuanyuan; You, Youbo; Tian, Jie

    2012-12-01

    Recent neuroimaging studies have shown that the cognitive and memory decline in patients with Alzheimer's disease (AD) is coupled with abnormal functions of focal brain regions and disrupted functional connectivity between distinct brain regions, as well as losses in small-world attributes. However, the causal interactions among the spatially isolated, but functionally related, resting state networks (RSNs) are still largely unexplored. In this study, we first identified eight RSNs by independent components analysis from resting state functional MRI data of 18 patients with AD and 18 age-matched healthy subjects. We then performed a multivariate Granger causality analysis (mGCA) to evaluate the effective connectivity among the RSNs. We found that patients with AD exhibited decreased causal interactions among the RSNs in both intensity and quantity relative to normal controls. Results from mGCA indicated that the causal interactions involving the default mode network and auditory network were weaker in patients with AD, whereas stronger causal connectivity emerged in relation to the memory network and executive control network. Our findings suggest that the default mode network plays a less important role in patients with AD. Increased causal connectivity of the memory network and self-referential network may elucidate the dysfunctional and compensatory processes in the brain networks of patients with AD. These preliminary findings may provide a new pathway towards the determination of the neurophysiological mechanisms of AD. PMID:22505275
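
    A hedged sketch of the final analysis step (generic statsmodels calls; the three component time courses below are random stand-ins for the ICA outputs used in the study):

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.api import VAR

        # Surrogate time courses for three resting-state components; "MEM" is
        # constructed to lag "DMN" so the causality test has something to find.
        rng = np.random.default_rng(1)
        n_tr = 200
        dmn = rng.standard_normal(n_tr)
        mem = 0.6 * np.roll(dmn, 2) + 0.4 * rng.standard_normal(n_tr)
        exe = rng.standard_normal(n_tr)
        data = pd.DataFrame({"DMN": dmn, "MEM": mem, "EXE": exe})

        results = VAR(data).fit(maxlags=3, ic="aic")     # vector autoregressive model

        # Granger causality: does the past of DMN help predict MEM beyond MEM's own past?
        test = results.test_causality(caused="MEM", causing=["DMN"], kind="f")
        print(test.summary())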

  16. Material-dependent and material-independent selection processes in the frontal and parietal lobes: an event-related fMRI investigation of response competition

    NASA Technical Reports Server (NTRS)

    Hazeltine, Eliot; Bunge, Silvia A.; Scanlon, Michael D.; Gabrieli, John D E.

    2003-01-01

    The present study used the flanker task [Percept. Psychophys. 16 (1974) 143] to identify neural structures that support response selection processes, and to determine which of these structures respond differently depending on the type of stimulus material associated with the response. Participants performed two versions of the flanker task while undergoing event-related functional magnetic resonance imaging (fMRI). Both versions of the task required participants to respond to a central stimulus regardless of the responses associated with simultaneously presented flanking stimuli, but one used colored circle stimuli and the other used letter stimuli. Competition-related activation was identified by comparing Incongruent trials, in which the flanker stimuli indicated a different response than the central stimulus, to Neutral stimuli, in which the flanker stimuli indicated no response. A region within the right inferior frontal gyrus exhibited significantly more competition-related activation for the color stimuli, whereas regions within the middle frontal gyri of both hemispheres exhibited more competition-related activation for the letter stimuli. The border of the right middle frontal and inferior frontal gyri and the anterior cingulate cortex (ACC) were significantly activated by competition for both types of stimulus materials. Posterior foci demonstrated a similar pattern: left inferior parietal cortex showed greater competition-related activation for the letters, whereas right parietal cortex was significantly activated by competition for both materials. These findings indicate that the resolution of response competition invokes both material-dependent and material-independent processes.

  17. Is ADHD a Risk Factor Independent of Conduct Disorder for Illicit Substance Use? A Meta-Analysis and Meta-Regression Investigation

    ERIC Educational Resources Information Center

    Serra-Pinheiro, Maria Antonia; Coutinho, Evandro S. F.; Souza, Isabella S.; Pinna, Camilla; Fortes, Didia; Araujo, Catia; Szobot, Claudia M.; Rohde, Luis A.; Mattos, Paulo

    2013-01-01

    Objective: To investigate meta-analytically if the association between ADHD and illicit substance use (ISU) is maintained when controlling for conduct disorder/oppositional-defiant disorder (CD/ODD). Method: A systematic literature review was conducted through Medline from 1980 to 2008. Data extracted and selections made by one author were…

  18. Residential area deprivation predicts fruit and vegetable consumption independently of individual educational level and occupational social class: a cross sectional population study in the Norfolk cohort of the European Prospective Investigation into Cancer (EPIC-Norfolk)

    PubMed Central

    Shohaimi, S.; Welch, A.; Bingham, S.; Luben, R.; Day, N.; Wareham, N.; Khaw, K.

    2004-01-01

    Study objective: To investigate the independent association between individual and area based socioeconomic measures and fruit and vegetable consumption. Design: Cross sectional population based study. Setting and participants: 22 562 men and women aged 39–79 years living in the general community in Norfolk, United Kingdom, recruited using general practice age-sex registers. Outcome measures: Fruit and vegetable intake assessed using a food frequency questionnaire. Main results: Being in a manual occupational social class, having no educational qualifications, and living in a deprived area all independently predicted significantly lower consumption of fruit and vegetables. The effect of residential area deprivation was predominantly in those in manual occupational social class and no educational qualifications. Conclusions: Understanding some of the community level barriers to changing health related behaviours may lead to more effective interventions to improving health in the whole community, particularly those who are most vulnerable. PMID:15252072

  19. Arterial compliance and endothelium-dependent vasodilation are independently related to coronary risk in the elderly: the Prospective Investigation of the Vasculature in Uppsala Seniors (PIVUS) study

    Microsoft Academic Search

    Lars Lind

    2008-01-01

    Summary Background: Measurements of both arterial compliance and endothelium-dependent vasodilation have previously been related to coronary risk factors, but not in the same study. In the Prospective Investigation of the Vasculature in Uppsala Seniors (PIVUS) study, we studied the interplay between arterial compliance and endothelium-dependent vasodilation on coronary risk. Methods: In the population-based PIVUS study (1016 subjects aged 70 years),

  20. An investigative study of multispectral data compression for remotely-sensed images using vector quantization and difference-mapped shift-coding

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    A study is conducted to investigate the effects and advantages of data compression techniques on multispectral imagery data acquired by NASA's airborne scanners at the Stennis Space Center. The first technique used was vector quantization. The vector is defined in the multispectral imagery context as an array of pixels from the same location from each channel. The error obtained in substituting the reconstructed images for the original set is compared for different compression ratios. Also, the eigenvalues of the covariance matrix obtained from the reconstructed data set are compared with the eigenvalues of the original set. The effects of varying the size of the vector codebook on the quality of the compression and on subsequent classification are also presented. The output data from the Vector Quantization algorithm was further compressed by a lossless technique called Difference-mapped Shift-extended Huffman coding. The overall compression for 7 channels of data acquired by the Calibrated Airborne Multispectral Scanner (CAMS), with an RMS error of 15.8 pixels was 195:1 (0.41 bpp) and with an RMS error of 3.6 pixels was 18:1 (.447 bpp). The algorithms were implemented in software and interfaced with the help of dedicated image processing boards to an 80386 PC compatible computer. Modules were developed for the task of image compression and image analysis. Also, supporting software to perform image processing for visual display and interpretation of the compressed/classified images was developed.
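
    A hedged sketch of the vector-quantization step described above, where each "vector" is the stack of co-located pixels from all channels (generic scipy calls; the synthetic array below merely stands in for CAMS imagery):

        import numpy as np
        from scipy.cluster.vq import kmeans, vq

        rng = np.random.default_rng(0)
        image = rng.normal(size=(64, 64, 7))        # placeholder 7-channel image

        # Each vector is the array of pixels from the same location in every channel.
        vectors = image.reshape(-1, 7)

        # Train a 256-entry codebook and replace every pixel vector by a codebook index.
        codebook, _ = kmeans(vectors, 256)
        indices, _ = vq(vectors, codebook)          # the compressed representation

        # Decoder side: look the vectors back up and measure the RMS reconstruction error.
        reconstructed = codebook[indices].reshape(image.shape)
        rms = np.sqrt(np.mean((image - reconstructed) ** 2))
        print(f"codebook entries: {len(codebook)}, RMS error: {rms:.3f}")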

  1. Permutation codes for sources.

    NASA Technical Reports Server (NTRS)

    Berger, T.; Jelinek, F.; Wolf, J. K.

    1972-01-01

    Source encoding techniques based on permutation codes are investigated. For a broad class of distortion measures it is shown that optimum encoding of a source permutation code is easy to instrument even for very long block lengths. Also, the nonparametric nature of permutation encoding is well suited to situations involving unknown source statistics. For the squared-error distortion measure a procedure for generating good permutation codes of a given rate and block length is described. The performance of such codes for a memoryless Gaussian source is compared both with the rate-distortion function bound and with the performance of various quantization schemes. The comparison reveals that permutation codes are asymptotically ideal for small rates and perform as well as the best entropy-coded quantizers presently known for intermediate rates. They can be made to compare favorably at high rates, too, provided the coding delay associated with extremely long block lengths is tolerable.
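
    A hedged illustration of the idea (generic, not the authors' construction): a permutation source code for the squared-error distortion keeps a single fixed multiset of representative levels and transmits only the rank order of the block's samples, so the rate is log2(n!) bits per block of n samples.

        import numpy as np

        def permutation_encode(block):
            """Transmit only the rank order of the samples (a permutation of 0..n-1)."""
            return np.argsort(np.argsort(block))

        def permutation_decode(ranks, levels):
            """Place the k-th smallest representative level wherever rank k occurred."""
            return np.asarray(levels)[ranks]

        n = 8
        rng = np.random.default_rng(0)
        # Fixed representative levels, e.g. rough quantiles of the assumed source density.
        levels = np.quantile(rng.standard_normal(100_000), (np.arange(n) + 0.5) / n)

        block = rng.standard_normal(n)
        ranks = permutation_encode(block)             # log2(8!) ~ 15.3 bits for this block
        reconstruction = permutation_decode(ranks, levels)
        print("block:         ", np.round(block, 2))
        print("reconstruction:", np.round(reconstruction, 2))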

  2. Proceedings of the SMBE Tri-National Young Investigators' Workshop 2005. Relaxation of functional constraint on light-independent protochlorophyllide oxidoreductase in Thuja.

    PubMed

    Kusumi, Junko; Sato, Aya; Tachida, Hidenori

    2006-05-01

    The light-independent protochlorophyllide oxidoreductase (DPOR) plays a key role in the ability of nonflowering plants and algae to synthesize chlorophyll in darkness. This enzyme consists of three subunits encoded by the chlB, chlL, and chlN genes in the plastid genome. Previously, we found a high nonsynonymous substitution rate (dN) of the chlL gene in the lineage of Thuja standishii, a conifer belonging to the Cupressaceae. Here we revealed that the acceleration of dN in the chlL occurred as well in other species of Thuja, Thuja occidentalis and Thuja plicata. In addition, dark-grown seedlings of T. occidentalis were found to exhibit a pale yellowish color, and their chlorophyll concentration was much lower than that of other species of Cupressaceae. The results suggested that the species of Thuja have lost the ability to synthesize chlorophyll in darkness, and the functional constraint on the DPOR would thus be expected to be relaxed in this genus. Therefore, we expected to find that the evolutionary rates of all subunits of DPOR would in this case be accelerated. Sequence analyses of the chlN and chlB (encoding the other subunits of DPOR) in 18 species of Cupressaceae revealed that the dN of the chlN gene was accelerated in Thuja as was the dN of the chlL gene, but the dN of the chlB gene did not appear to differ significantly among the species of Cupressaceae. Sequencing of reverse transcription-polymerase chain reaction (RT-PCR) products of these genes showed that RNA editing was rare and unlikely to have contributed to the acceleration. Moreover, the RT-PCR analysis indicated that all chl genes were still transcriptionally active in T. occidentalis. Based on these results, it appears that species of Thuja still bear the DPOR protein, although the enzyme has lost its activity because of nonsynonymous mutations of some of the chl genes. The lack of acceleration of the dN of the chlB gene might be accounted for by various unknown functions of its gene product. PMID:16428257

  3. Comet assay in reconstructed 3D human epidermal skin models—investigation of intra- and inter-laboratory reproducibility with coded chemicals

    PubMed Central

    Pfuhler, Stefan

    2013-01-01

    Reconstructed 3D human epidermal skin models are being used increasingly for safety testing of chemicals. Based on EpiDerm™ tissues, an assay was developed in which the tissues were topically exposed to test chemicals for 3h followed by cell isolation and assessment of DNA damage using the comet assay. Inter-laboratory reproducibility of the 3D skin comet assay was initially demonstrated using two model genotoxic carcinogens, methyl methane sulfonate (MMS) and 4-nitroquinoline-n-oxide, and the results showed good concordance among three different laboratories and with in vivo data. In Phase 2 of the project, intra- and inter-laboratory reproducibility was investigated with five coded compounds with different genotoxicity liability tested at three different laboratories. For the genotoxic carcinogens MMS and N-ethyl-N-nitrosourea, all laboratories reported a dose-related and statistically significant increase (P < 0.05) in DNA damage in every experiment. For the genotoxic carcinogen, 2,4-diaminotoluene, the overall result from all laboratories showed a smaller, but significant genotoxic response (P < 0.05). For cyclohexanone (CHN) (non-genotoxic in vitro and in vivo, and non-carcinogenic), an increase compared to the solvent control acetone was observed only in one laboratory. However, the response was not dose related and CHN was judged negative overall, as was p-nitrophenol (p-NP) (genotoxic in vitro but not in vivo and non-carcinogenic), which was the only compound showing clear cytotoxic effects. For p-NP, significant DNA damage generally occurred only at doses that were substantially cytotoxic (>30% cell loss), and the overall response was comparable in all laboratories despite some differences in doses tested. The results of the collaborative study for the coded compounds were generally reproducible among the laboratories involved and intra-laboratory reproducibility was also good. These data indicate that the comet assay in EpiDerm™ skin models is a promising model for the safety assessment of compounds with a dermal route of exposure. PMID:24150594

  4. Toward Independence or Unification?

    Microsoft Academic Search

    Wen-Chun Chang

    2008-01-01

    This study investigates the relationships between subjective well-being and partisanship for people in Taiwan where voters' political ideologies are largely influenced by their positions toward their country's relations with China. It is found that voters preferring a declaration of independence for Taiwan are more likely to be supporters of Pan-Green political parties (i.e. the Democratic Progressive Party (DPP) and the

  5. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
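
    A toy illustration (not AUTOBAYES output) of the kind of annotation involved: a loop that carries an explicit invariant, which a verification condition generator would turn into proof obligations. Here the invariant is merely checked at run time with assert statements.

        def sum_of_squares(xs):
            """Sum of squares with the loop invariant stated explicitly."""
            total = 0.0
            for i, x in enumerate(xs, start=1):
                total += x * x
                # Loop invariant (a certification tool would prove this, not test it):
                # after i iterations, total equals the sum of squares of the first i
                # elements and is therefore non-negative.
                assert total >= 0.0
                assert abs(total - sum(v * v for v in xs[:i])) < 1e-9
            return total

        print(sum_of_squares([1.0, 2.0, 3.0]))   # 14.0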

  6. Regularized Robust Coding for Face Recognition

    E-print Network

    Meng, Yang; Jian, Yang; Zhang, David

    2012-01-01

    Recently the sparse representation based classification (SRC) has been proposed for robust face recognition (FR). In SRC, the testing image is coded as a sparse linear combination of the training samples, and the representation fidelity is measured by the l2-norm or l1-norm of the coding residual. Such a sparse coding model assumes that the coding residual follows Gaussian or Laplacian distribution, which may not be effective enough to describe the coding residual in practical FR systems. Meanwhile, the sparsity constraint on the coding coefficients makes SRC's computational cost very high. In this paper, we propose a new face coding model, namely regularized robust coding (RRC), which could robustly regress a given signal with regularized regression coefficients. By assuming that the coding residual and the coding coefficient are respectively independent and identically distributed, the RRC seeks for a maximum a posterior solution of the coding problem. An iteratively reweighted regularized robust coding (IR...
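
    A hedged sketch of the general mechanism, iteratively reweighted regularized coding, with a generic robust weight function rather than the authors' RRC weights: large coding residuals are progressively down-weighted while an l2 regularizer keeps the coefficients stable.

        import numpy as np

        def reweighted_regularized_coding(D, y, lam=0.1, n_iter=20, delta=1e-2):
            """Code y over dictionary D (columns = training samples) by IRLS.

            Each iteration solves a weighted ridge regression, then recomputes the
            weights so that samples with large residuals count less.
            """
            n, k = D.shape
            w = np.ones(n)
            a = np.zeros(k)
            for _ in range(n_iter):
                W = np.diag(w)
                a = np.linalg.solve(D.T @ W @ D + lam * np.eye(k), D.T @ W @ y)
                residual = y - D @ a
                w = 1.0 / (residual ** 2 + delta)     # generic robust reweighting
            return a

        # Toy test: y is a combination of two dictionary atoms plus one gross outlier.
        rng = np.random.default_rng(0)
        D = rng.standard_normal((50, 10))
        a_true = np.zeros(10); a_true[[2, 7]] = [1.0, -0.5]
        y = D @ a_true + 0.01 * rng.standard_normal(50)
        y[3] += 5.0                                   # simulated occlusion / corruption
        print(np.round(reweighted_regularized_coding(D, y), 2))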

  7. Extension of the supercritical carbon dioxide brayton cycle to low reactor power operation: investigations using the coupled anl plant dynamics code-SAS4A/SASSYS-1 liquid metal reactor code system.

    SciTech Connect

    Moisseytsev, A.; Sienicki, J. J. (Nuclear Engineering Division)

    2012-05-10

    Significant progress has been made on the development of a control strategy for the supercritical carbon dioxide (S-CO2) Brayton cycle enabling removal of power from an autonomous load following Sodium-Cooled Fast Reactor (SFR) down to decay heat levels such that the S-CO2 cycle can be used to cool the reactor until decay heat can be removed by the normal shutdown heat removal system or a passive decay heat removal system such as Direct Reactor Auxiliary Cooling System (DRACS) loops with DRACS in-vessel heat exchangers. This capability of the new control strategy eliminates the need for use of a separate shutdown heat removal system which might also use supercritical CO2. It has been found that this capability can be achieved by introducing a new control mechanism involving shaft speed control for the common shaft joining the turbine and two compressors following reduction of the load demand from the electrical grid to zero. Following disconnection of the generator from the electrical grid, heat is removed from the intermediate sodium circuit through the sodium-to-CO2 heat exchanger, the turbine solely drives the two compressors, and heat is rejected from the cycle through the CO2-to-water cooler. To investigate the effectiveness of shaft speed control, calculations are carried out using the coupled Plant Dynamics Code-SAS4A/SASSYS-1 code for a linear load reduction transient for a 1000 MWt metallic-fueled SFR with autonomous load following. No deliberate motion of control rods or adjustment of sodium pump speeds is assumed to take place. It is assumed that the S-CO2 turbomachinery shaft speed linearly decreases from 100 to 20% nominal following reduction of grid load to zero. The reactor power is calculated to autonomously decrease down to 3% nominal providing a lengthy window in time for the switchover to the normal shutdown heat removal system or for a passive decay heat removal system to become effective. However, the calculations reveal that the compressor conditions are calculated to approach surge such that the need for a surge control system for each compressor is identified. Thus, it is demonstrated that the S-CO2 cycle can operate in the initial decay heat removal mode even with autonomous reactor control. Because external power is not needed to drive the compressors, the results show that the S-CO2 cycle can be used for initial decay heat removal for a lengthy interval in time in the absence of any off-site electrical power. The turbine provides sufficient power to drive the compressors. Combined with autonomous reactor control, this represents a significant safety advantage of the S-CO2 cycle by maintaining removal of the reactor power until the core decay heat falls to levels well below those for which the passive decay heat removal system is designed. The new control strategy is an alternative to a split-shaft layout involving separate power and compressor turbines which had previously been identified as a promising approach enabling heat removal from a SFR at low power levels. The current results indicate that the split-shaft configuration does not provide any significant benefits for the S-CO2 cycle over the current single-shaft layout with shaft speed control. It has been demonstrated that when connected to the grid the single-shaft cycle can effectively follow the load over the entire range. No compressor speed variation is needed while power is delivered to the grid.
When the system is disconnected from the grid, the shaft speed can be changed as effectively as it would be with the split-shaft arrangement. In the split-shaft configuration, zero generator power means disconnection of the power turbine, such that the resulting system will be almost identical to the single-shaft arrangement. Without this advantage of the split-shaft configuration, the economic benefits of the single-shaft arrangement, provided by just one turbine and lower losses at the design point, are more important to the overall cycle performance. Therefore, the single-shaft

  8. The Performances of Convolutional Codes used in Turbo Codes

    Microsoft Academic Search

    Maria Kovaci

    2004-01-01

    In this paper the BER performances obtained by simulation of a transmission system that uses forward error correction through code concatenation and iterative decoding (turbo coding) are presented and compared. All systematic convolutional codes with constraint length K less than or equal to 6 have been investigated, under three different concatenated forms: parallel PCCC (pure turbo

  9. Analysis of distortion in pulse-code modulation systems

    Microsoft Academic Search

    J. P. Schouten; H. W. F. van 't Groenewout

    1952-01-01

    After a short introduction about the time division principle (horizontal quantization), an investigation is made of the distortion caused by quantization of the amplitude of the input signal (vertical quantization) in a pulse-code modulation system. A general analysis is given, independent of the way in which a quantization level is chosen and for arbitrary values of sampling frequency versus signal
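
    A hedged numerical companion to the vertical-quantization analysis (generic uniform quantizer, not the paper's derivation): for a full-scale sine input, each additional bit raises the signal-to-quantization-noise ratio by roughly 6 dB.

        import numpy as np

        def uniform_quantize(x, n_bits, x_max=1.0):
            """Mid-rise uniform quantizer with 2**n_bits levels on [-x_max, x_max]."""
            step = 2 * x_max / 2 ** n_bits
            levels = np.floor(x / step) * step + step / 2
            return np.clip(levels, -x_max + step / 2, x_max - step / 2)

        t = np.linspace(0, 1, 100_000, endpoint=False)
        x = np.sin(2 * np.pi * 7 * t)                 # full-scale sine test signal

        for n_bits in (4, 8, 12):
            e = x - uniform_quantize(x, n_bits)
            sqnr_db = 10 * np.log10(np.mean(x ** 2) / np.mean(e ** 2))
            print(f"{n_bits:2d} bits: SQNR = {sqnr_db:5.1f} dB")  # ~ 6.02*n_bits + 1.76 dB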

  10. The Complexity of the Covering Radius Problem on Lattices and Codes

    Microsoft Academic Search

    Venkatesan Guruswami; Daniele Micciancio

    We initiate the study of the computational complexity of the covering radius problem for point lat- tices, and approximation versions of the problem for both lattices and linear codes. We also investigate the computational complexity of the shortest linearly independent vectors problem, and its relation to the covering radius problem for lattices. For the covering radius on n-dimensional lattices, we

  11. How do we code the letters of a word when we have to write it? Investigating double letter representation in French.

    PubMed

    Kandel, Sonia; Peereman, Ronald; Ghimenton, Anna

    2014-05-01

    How do we code the letters of a word when we have to write it? We examined whether the orthographic representations that the writing system activates have a specific coding for letters when these are doubled in a word. French participants wrote words on a digitizer. The word pairs shared the initial letters and differed on the presence of a double letter (e.g., LISSER/LISTER). The results on latencies, letter and inter-letter interval durations revealed that L and I are slower to write when followed by a doublet (SS) than when not (ST). Doublet processing constitutes a supplementary cognitive load that delays word production. This suggests that word representations code letter identity and quantity separately. The data also revealed that the central processes that are involved in spelling representation cascade into the peripheral processes that regulate movement execution. PMID:24486807

  12. Getting Students to be Successful, Independent Investigators

    NSDL National Science Digital Library

    Jeffrey D. Thomas

    2010-02-01

    Middle school students often struggle when writing testable problems, planning valid and reliable procedures, and drawing meaningful evidence-based conclusions. To address this issue, the author created a student-centered lab handout to facilitate the inquiry process for students.

  13. Getting Students to be Successful, Independent Investigators

    ERIC Educational Resources Information Center

    Thomas, Jeffrey D.

    2010-01-01

    Middle school students often struggle when writing testable problems, planning valid and reliable procedures, and drawing meaningful evidence-based conclusions. To address this issue, the author created a student-centered lab handout to facilitate the inquiry process for students. This handout has reduced students' frustration and helped them…

  14. BYU Independent Study Petition Use Black or Blue Ink Only BYU Independent Study

    E-print Network

    Olsen Jr., Dan R.


  15. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Pollara, Fabrizio; Hamkins, Jon; Dolinar, Sam; Andrews, Ken; Divsalar, Dariush

    2006-01-01

    This viewgraph presentation reviews uplink coding. The purpose and goals of the briefing are to: (1) Show a plan for using uplink coding and describe benefits; (2) Define possible solutions and their applicability to different types of uplink, including emergency uplink; (3) Concur with our conclusions so we can embark on a plan to use the proposed uplink system; (4) Identify the need for the development of appropriate technology and infusion in the DSN; (5) Gain advocacy to implement uplink coding in flight projects. Action Item EMB04-1-14 -- Show a plan for using uplink coding, including showing where it is useful or not (include discussion of emergency uplink coding).

  16. Phonological coding during reading.

    PubMed

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. PMID:25150679

  17. Character coding

    NSDL National Science Digital Library

    Jeff Wilson

    Character coding has been called the bête noire of phylogenetic analysis. As you may have seen from class, the definition of "character" is squishy and varies between authors. Although there isn't agreement on exactly what a character is, it is possible to predict how certain character definitions and coding strategies affect phylogenetic analysis. This activity focuses on character coding, specifically on how different coding strategies can affect analysis. In this exercise we will try to look at different coding strategies by considering the simple shapes below. (1) What is a character, and what qualities do characters have? (2) Given the 'morphology' depicted above, what features vary? (3) Given the variation you identified, come up with as many character codings as you can; i.e., different ways that this variation can be coded into characters. (4) For each of the coding strategies you come up with in question 3, identify its assumptions, limitations, and strengths. (5) Identify your preferred coding strategy and defend your choice. Students are asked to define what a character is and to discuss what characters 'require', and then to come up with an exhaustive list of coding strategies for the sample morphology. They are then asked to list the assumptions and limitations of each strategy.

  18. Two-dimensional crosstalk avoidance codes

    Microsoft Academic Search

    Xuebin Wu; Zhiyuan Yan; Yuan Xie

    2008-01-01

    Global buses in deep submicron system-on-chip designs suffer from increasing crosstalk delay as the feature size shrinks. As a technology-independent solution, crosstalk avoidance coding alleviates the problem while requiring less area and power than shielding. Most previously considered crosstalk avoidance codes are one-dimensional, and have limited code rates. In this paper, we propose two-dimensional crosstalk avoidance codes (TDCAC), which achieve

  19. Profile Guided Code Positioning

    Microsoft Academic Search

    Karl Pettis; Robert C. Hansen

    1990-01-01

    This paper presents the results of our investigation of code positioning techniques using execution profile data as input into the compilation process. The primary objective of the positioning is to reduce the overhead of the instruction memory hierarchy. After initial investigation in the literature, we decided to implement two prototypes for the Hewlett-Packard Precision Architecture (PA-RISC). The first, built on

  20. The investigation of tethered satellite system dynamics

    NASA Technical Reports Server (NTRS)

    Lorenzini, E. C.

    1986-01-01

    The analysis of the rotational dynamics of the satellite was focused on the rotational amplitude increase of the satellite, with respect to the tether, during retrieval. The dependence of the rotational amplitude upon the tether tension variation to the power 1/4 was thoroughly investigated. The damping of rotational oscillations achievable by reel control was also quantified, while an alternative solution that makes use of a lever arm attached with a universal joint to the satellite was proposed. Comparison simulations between the Smithsonian Astrophysical Observatory and the Martin Marietta (MMA) computer codes of retrieval maneuvers were also carried out. The agreement between the two, completely independent, codes was extremely close, demonstrating the reliability of the models. The slack tether dynamics during reel jams was analytically investigated in order to identify the limits of applicability of the SLACK3 computer code to this particular case. Test runs with SLACK3 were also carried out.

  1. On the Performance of Short Forward Error-Correcting Codes

    Microsoft Academic Search

    Sheng Tong; Dengsheng Lin; Aleksandar Kavcic; Li Ping; Baoming Bai

    2007-01-01

    This letter investigates the performance of short forward error-correcting (FEC) codes. Reed-Solomon (RS) codes and concatenated zigzag codes are chosen as representatives of classical algebraic codes and modern simple iteratively decodable codes, respectively. Additionally, random binary linear codes are used as a baseline reference. Our main results (demonstrated by simulations and ensemble distance spectrum analysis) are as follows: 1) Short

  2. Performance of array codes on Power Line Communications channel

    Microsoft Academic Search

    Nikoleta Andreadou; Fotini-Niovi Pavlidou

    2008-01-01

    In this paper we investigate the performance of array codes and in particular generalised array codes (GAC) and row and column array codes (RAC) as coding schemes in the power line communications (PLC) environment. We apply different code rates and modulation techniques and we examine how these codes perform in the hostile channel of power lines in terms of BER

  3. Concatenated quantum codes in biological systems

    NASA Astrophysics Data System (ADS)

    Lloyd, Seth

    2011-03-01

    This talk investigates how biological systems such as photosynthetic bacteria use quantum coding techniques such as decoherence-free subspaces, noiseless subsystems, and concatenated quantum codes to engineer long excitonic lifetimes and rapid energy transport. The existence of hierarchical structures in photosynthetic complexes is associated with concatenated quantum codes. A concatenated code is one that combines two or more codes to construct a hierarchical code that possesses features of all its constituent codes. In photosynthetic complexes, structures at the smallest level use quantum coding techniques to enhance exciton lifetimes, and structures at higher scales possess symmetries that enhance exciton hopping rates. The result is a concatenated quantum code that simultaneously protects excitons and enhances their transport rate. All known quantum codes can be described within the framework of group representation theory. This talk reviews the relationship between symmetry and quantum codes, and shows how photosynthetic bacteria and plants put quantum coding techniques to use to improve the efficiency of photosynthetic transport.

  4. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  5. Sharing code

    PubMed Central

    Kubilius, Jonas

    2014-01-01

    Sharing code is becoming increasingly important in the wake of Open Science. In this review I describe and compare two popular code-sharing utilities, GitHub and Open Science Framework (OSF). GitHub is a mature, industry-standard tool but lacks focus towards researchers. In comparison, OSF offers a one-stop solution for researchers but a lot of functionality is still under development. I conclude by listing alternative lesser-known tools for code and materials sharing. PMID:25165519

  6. Adaptive format conversion information as enhancement data for scalable video coding

    E-print Network

    Wan, Wade K. (Wade Keith), 1973-

    2002-01-01

    Scalable coding techniques can be used to efficiently provide multicast video service and involve transmitting a single independently coded base layer and one or more dependently coded enhancement layers. Clients can decode ...

  7. The Genomic Code for Nucleosome Positioning

    Microsoft Academic Search

    Jonathan Widom; William Deering

    2010-01-01

    Eukaryotic genomes encode an additional layer of genetic information, superimposed on top of the regulatory and coding information that controls the organization of the genomic DNA into arrays of nucleosomes. We have developed an ability to read this nucleosome positioning code and predict the in vivo locations of nucleosomes, using two independent approaches. One approach is based on a statistical

  8. Profile guided code positioning

    Microsoft Academic Search

    Karl Pettis; Robert C. Hansen; Jack W. Davidson

    2004-01-01

    This paper presents the results of our investigation of code positioning techniques using execution profile data as input into the compilation process. The primary objective of the positioning is to reduce the overhead of the instruction memory hierarchy. After initial investigation in the literature, we decided to implement two prototypes for the Hewlett-Packard Precision Architecture (PA-RISC). The first, built on top

  9. Free Code

    NSDL National Science Digital Library

    Free Code, a service of Andover.Net, is a large index of Internet-related software tool source code. The tools are written in C/C++, Perl, Java, or Visual Basic, and are free for personal and commercial use. They range from handy Perl CGI scripts to Java-based graphics packages. Each tool in the index is briefly described, characterized by language and operating system, and linked to both the home page for the tool and the source code. The total lack of documentation for the search engine makes useful queries hard to create, but the tools are still easy to find. This is a very useful index for anyone building Internet or Web-based applications.

  10. Nature's Code

    NASA Astrophysics Data System (ADS)

    Hill, Vanessa J.; Rowlands, Peter

    2008-10-01

    We propose that the mathematical structures related to the `universal rewrite system' define a universal process applicable to Nature, which we may describe as `Nature's code'. We draw attention here to such concepts as 4 basic units, 64- and 20-unit structures, symmetry-breaking and 5-fold symmetry, chirality, double 3-dimensionality, the double helix, the Van der Waals force and the harmonic oscillator mechanism, and our explanation of how they necessarily lead to self-aggregation, complexity and emergence in higher-order systems. Biological concepts, such as translation, transcription, replication, the genetic code and the grouping of amino acids appear to be driven by fundamental processes of this kind, and it would seem that the Platonic solids, pentagonal symmetry and Fibonacci numbers have significant roles in organizing `Nature's code'.

  11. CONTAIN independent peer review

    SciTech Connect

    Boyack, B.E. [Los Alamos National Lab., NM (United States); Corradini, M.L. [Univ. of Wisconsin, Madison, WI (United States). Nuclear Engineering Dept.; Denning, R.S. [Battelle Memorial Inst., Columbus, OH (United States); Khatib-Rahbar, M. [Energy Research Inc., Rockville, MD (United States); Loyalka, S.K. [Univ. of Missouri, Columbia, MO (United States); Smith, P.N. [AEA Technology, Dorchester (United Kingdom). Winfrith Technology Center

    1995-01-01

    The CONTAIN code was developed by Sandia National Laboratories under the sponsorship of the US Nuclear Regulatory Commission (NRC) to provide integrated analyses of containment phenomena. It is used to predict nuclear reactor containment loads, radiological source terms, and associated physical phenomena for a range of accident conditions encompassing both design-basis and severe accidents. The code's targeted applications include support for containment-related experimental programs, light water and advanced light water reactor plant analysis, and analytical support for resolution of specific technical issues such as direct containment heating. The NRC decided that a broad technical review of the code should be performed by technical experts to determine its overall technical adequacy. For this purpose, a six-member CONTAIN Peer Review Committee was organized and a peer review was conducted. While the review was in progress, the NRC issued a draft "Revised Severe Accident Code Strategy" that incorporated revised design objectives and targeted applications for the CONTAIN code. The Committee continued its effort to develop findings relative to the original NRC statement of design objectives and targeted applications. However, the revised CONTAIN design objectives and targeted applications were considered by the Committee in assigning priorities to the Committee's recommendations. The Committee determined that some improvements are warranted and provided recommendations in five code-related areas: (1) documentation, (2) user guidance, (3) modeling capability, (4) code assessment, and (5) technical assessment.

  12. Code Cracker

    NSDL National Science Digital Library

    2012-01-27

    Whether it's the genetic code, an ancient language, or patterns of light in a distant galaxy, scientists often have to play the role of decoder. In this activity, learners create a code to send secret messages for other learners to decode. When learners set up a free account at Kinetic City, they can answer bonus questions at the end of the activity as a quick assessment. As a larger assessment, learners can complete the Bug Blaster game after they've completed several activities.

  13. INDEPENDENT STATUS APPEAL Academic Year 2013-2014 DIRECTIONS--If you do not meet the definition of an Independent student,

    E-print Network

    Amin, S. Massoud

    /12 To ensure privacy online, open in Adobe Reader (free at Adobe.com). Add the required signature(s) in blue, ZIP code) Phone (include area code) SECTION B. Independent status definition The federally-mandated formula used to determine your financial need is based on the premise that your family has the primary

  14. INDEPENDENT STATUS APPEAL Academic Year 2014-2015 DIRECTIONS--If you do not meet the definition of an Independent student,

    E-print Network

    Amin, S. Massoud

    /14 To ensure privacy online, open in Adobe Reader (free at Adobe.com). Add the required signature(s) in blue, ZIP code) Phone (include area code) SECTION B. Independent status definition The federally-mandated formula used to determine your financial need is based on the premise that your family has the primary

  15. Polarization-independent active metamaterial for high-frequency

    E-print Network

    Polarization-independent active metamaterial for high-frequency terahertz modulation. Oliver Paul et al. present a polarization-independent metamaterial design for the construction of electrically tunable terahertz (THz) devices. OCIS codes: (160.3918) Metamaterials; (230.4110) Modulators; (300.6495) THz

  16. Polarization independent microphotonic circuits

    E-print Network

    Watts, Michael Robert, 1974-

    2005-01-01

    Microphotonic circuits have been proposed for applications ranging from optical switching and routing to optical logic circuits. However many applications require microphotonic circuits to be polarization independent, a ...

  17. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge preserving image coding scheme which can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.

  18. Parallelization of the SIR code

    NASA Astrophysics Data System (ADS)

    Thonhofer, S.; Bellot Rubio, L. R.; Utz, D.; Jurčák, J.; Hanslmeier, A.; Piantschitsch, I.; Pauritsch, J.; Lemmerer, B.; Guttenbrunner, S.

    A high-resolution 3-dimensional model of the photospheric magnetic field is essential for the investigation of small-scale solar magnetic phenomena. The SIR code is an advanced Stokes-inversion code that deduces physical quantities, e.g. magnetic field vector, temperature, and LOS velocity, from spectropolarimetric data. We extended this code with the capability to operate directly on large data sets and to invert the pixels in parallel. Due to this parallelization it is now feasible to apply the code directly to extensive data sets. In addition, we included the option to use different initial model atmospheres for the inversion, which enhances the quality of the results.
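
    The per-pixel parallelization described above maps naturally onto a worker-pool pattern. The sketch below only illustrates that pattern in Python (the SIR code itself is a separate inversion program); `invert_pixel` is a hypothetical stand-in for a single Stokes inversion.

```python
# Minimal sketch of per-pixel parallel inversion. Each pixel's inversion is
# independent of the others, so a process pool can handle them concurrently.
from multiprocessing import Pool

def invert_pixel(task):
    """Hypothetical stand-in for inverting one pixel's Stokes profiles."""
    x, y, profile = task
    # A real inversion would fit a model atmosphere to `profile`; here we
    # return a dummy value so the sketch runs end to end.
    return (x, y, sum(profile) / len(profile))

def invert_map(stokes_map, processes=4):
    tasks = [(x, y, pix) for x, row in enumerate(stokes_map)
             for y, pix in enumerate(row)]
    with Pool(processes) as pool:
        return pool.map(invert_pixel, tasks)

if __name__ == "__main__":
    toy_map = [[[1.0, 2.0, 3.0] for _ in range(4)] for _ in range(4)]
    print(invert_map(toy_map)[:3])
```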

  19. MCNP code

    SciTech Connect

    Cramer, S.N.

    1984-01-01

    The MCNP code is the major Monte Carlo coupled neutron-photon transport research tool at the Los Alamos National Laboratory, and it represents the most extensive Monte Carlo development program in the United States which is available in the public domain. The present code is the direct descendant of the original Monte Carlo work of Fermi, von Neumann, and Ulam at Los Alamos in the 1940s. Development has continued uninterrupted since that time, and the current version of MCNP (or its predecessors) has always included state-of-the-art methods in the Monte Carlo simulation of radiation transport, basic cross section data, geometry capability, variance reduction, and estimation procedures. The authors of the present code have oriented its development toward general user application. The documentation, though extensive, is presented in a clear and simple manner with many examples, illustrations, and sample problems. In addition to providing the desired results, the output listings give a wealth of detailed information (some optional) concerning each state of the calculation. The code system is continually updated to take advantage of advances in computer hardware and software, including interactive modes of operation, diagnostic interrupts and restarts, and a variety of graphical and video aids.

  20. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  1. Network coding

    E-print Network

    Lehman, April Rasala, 1977-

    2005-01-01

    In the network coding problem, there are k commodities each with an associated message Mi, a set of sources that know Mi and a set of sinks that request Mi. Each edge in the graph may transmit any function of the messages. ...

  2. Code Crackers

    NSDL National Science Digital Library

    2010-12-03

    This math unit from Illuminations introduces students to the concepts of cryptology and coding. It includes two lessons, which cover the Caesar Cipher and the Vigenère Cipher. Students will learn to encode and decode messages using these ciphers. This unit is intended for grades 9-12; each lesson should take one class period to complete.
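
    As a taste of the unit's first lesson, here is a minimal Caesar-shift encoder/decoder; the shift of 3 is arbitrary and the snippet is an illustration, not part of the Illuminations materials.

```python
# Minimal Caesar cipher: shift each letter by a fixed amount, wrapping at 'Z'.
def caesar(text, shift):
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)          # leave spaces and punctuation unchanged
    return ''.join(out)

message = "ATTACK AT DAWN"
secret = caesar(message, 3)         # encode -> "DWWDFN DW GDZQ"
plain = caesar(secret, -3)          # decode by shifting back
print(secret, "|", plain)
```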

  3. Use of redundant bits for magnetic recording: single-Parity codes and Reed-Solomon error-correcting code

    Microsoft Academic Search

    Z. A. Keirn; Victor Y. Krachkovsky; Erich F. Haratsch; Harley Burger

    2004-01-01

    The performance of single-parity codes used in conjunction with the Reed-Solomon error-correcting code (ECC) is investigated. Specifically, the tradeoff between simply increasing ECC power and using a parity code is explored.

  4. Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code

    SciTech Connect

    Maniaci, D. C.; Li, Y.

    2011-10-01

    This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. This paper summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30 degrees of yaw.

  5. Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code: Preprint

    SciTech Connect

    Maniaci, D. C.; Li, Y.

    2012-04-01

    This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. It summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30 degrees of yaw.

  6. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and results compared. Current computational and experimental emphasis includes multiple connected cavity flows with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean sheet approach to engine design is advocated with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  7. Central bank Financial Independence

    Microsoft Academic Search

    J. Ramon Martinez-Resano

    2004-01-01

    Central bank independence is a multifaceted institutional design. The financial component has seldom been analysed. This paper intends to set a comprehensive conceptual background for central bank financial independence. Quite often central banks are modelled as robot-like maximizers of some goal. This perspective neglects the fact that central bank functions are inevitably deployed on its balance sheet and have

  8. American Independence. Fifth Grade.

    ERIC Educational Resources Information Center

    Crosby, Annette

    This fifth grade teaching unit covers early conflicts between the American colonies and Britain, battles of the American Revolutionary War, and the Declaration of Independence. Knowledge goals address the pre-revolutionary acts enforced by the British, the concepts of conflict and independence, and the major events and significant people from the…

  9. Accounting for Independent Schools.

    ERIC Educational Resources Information Center

    Sonenstein, Burton

    The diversity of independent schools in size, function, and mode of operation has resulted in a considerable variety of accounting principles and practices. This lack of uniformity has tended to make understanding, evaluation, and comparison of independent schools' financial statements a difficult and sometimes impossible task. This manual has…

  10. Maximum likelihood decoding analysis of LT codes over AWGN channels

    Microsoft Academic Search

    Xiao Ma; Chunxiao Li; Baoming Bai

    2010-01-01

    Luby-Transform (LT) codes are a class of Fountain codes which can approach the capacity of the erasure channels. In this paper, we investigate the performance of the LT codes over AWGN channels with BPSK modulation. First, the ensemble weight distribution of an LT code is derived. Secondly, we use a refined union bound to analyze the performance of LT codes
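
    For readers unfamiliar with Fountain codes, the sketch below shows the basic LT encoding step: each output symbol is the XOR of a randomly chosen subset of input symbols. The uniform degree choice is a toy assumption; the paper's analysis presumes a properly designed degree distribution (e.g., robust soliton), and the BPSK/AWGN decoding side is not shown.

```python
# Toy LT (fountain) encoder: every output packet XORs a random subset of
# source bits. Purely illustrative -- a real LT code draws the degree from a
# robust soliton distribution rather than uniformly.
import random

def lt_encode(source_bits, n_output, rng=random.Random(0)):
    k = len(source_bits)
    encoded = []
    for _ in range(n_output):
        degree = rng.randint(1, k)                  # toy degree distribution
        neighbours = rng.sample(range(k), degree)   # which inputs get XORed
        value = 0
        for i in neighbours:
            value ^= source_bits[i]
        encoded.append((neighbours, value))         # decoder needs both parts
    return encoded

packets = lt_encode([1, 0, 1, 1, 0], n_output=8)
print(packets[0])
```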

  11. On automatic differentiation of codes with COMPLEX arithmetic with respect to real variables

    SciTech Connect

    Pusch, G.D.; Bischof, C. [Argonne National Lab., IL (United States); Carle, A. [Rice Univ., St. Houston, TX (United States)

    1995-06-01

    We explore what it means to apply automatic differentiation with respect to a set of real variables to codes containing complex arithmetic. That is, both dependent and independent variables with respect to differentiation are real variables, but in order to exploit features of complex mathematics, part of the code is expressed by employing complex arithmetic. We investigate how one can apply automatic differentiation to complex variables if one exploits the homomorphism of the complex numbers C onto R^2. It turns out that, by and large, the usual rules of differentiation apply, but subtle differences in special cases arise for sqrt (), abs (), and the power operator.
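
    A minimal sketch of the idea, assuming forward-mode (dual-number) propagation: values and their derivatives with respect to a single real variable are both carried as complex numbers, the usual product and exponential rules apply, and abs() needs a special-case rule because |z| is real and not holomorphic. This only illustrates the C onto R^2 viewpoint; it is not the authors' tool.

```python
# Forward-mode AD where intermediates are complex but the differentiation
# variable is real: carry (value, d/dt) pairs, both complex.
# d|z|/dt = Re(conj(z) * dz/dt) / |z|  (undefined at z = 0).
import cmath

class Dual:
    def __init__(self, val, dot):
        self.val, self.dot = complex(val), complex(dot)
    def __add__(self, o):
        return Dual(self.val + o.val, self.dot + o.dot)
    def __mul__(self, o):
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    def exp(self):
        e = cmath.exp(self.val)
        return Dual(e, e * self.dot)
    def abs(self):
        m = abs(self.val)
        return Dual(m, (self.val.conjugate() * self.dot).real / m)

t = Dual(0.3, 1.0)                                   # real variable, dt/dt = 1
z = (t * Dual(1j, 0.0)).exp() * Dual(2 + 1j, 0.0)    # exp(i t) * (2 + i)
# |exp(i t)*(2+i)| = sqrt(5) for every t, so its derivative should be ~0.
print(z.abs().val.real, z.abs().dot.real)
```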

  12. Pulsed Inductive Thruster (PIT): Modeling and Validation Using the MACH2 Code

    NASA Technical Reports Server (NTRS)

    Schneider, Steven (Technical Monitor); Mikellides, Pavlos G.

    2003-01-01

    Numerical modeling of the Pulsed Inductive Thruster using the magnetohydrodynamics code MACH2 aims to provide bilateral validation of the thruster's measured performance and the code's capability of capturing the pertinent physical processes. Computed impulse values for helium and argon propellants demonstrate excellent correlation to the experimental data for a range of energy levels and propellant-mass values. The effects of the vacuum tank wall and mass-injection scheme were investigated and showed only trivial changes in the overall performance. An idealized model for these energy levels and propellants deduces that the energy expended to the internal energy modes and plasma dissipation processes is independent of the propellant type, mass, and energy level.

  13. Fuel management optimization using genetic algorithms and code independence

    SciTech Connect

    DeChaine, M.D.; Feltus, M.A.

    1994-12-31

    Fuel management optimization is a hard problem for traditional optimization techniques. Loading pattern optimization is a large combinatorial problem without analytical derivative information. Therefore, methods designed for continuous functions, such as linear programming, do not always work well. Genetic algorithms (GAs) address these problems and, therefore, appear ideal for fuel management optimization. They do not require derivative information and work well with combinatorial functions. GAs are a stochastic method based on concepts from biological genetics. They take a group of candidate solutions, called the population, and use selection, crossover, and mutation operators to create the next generation of better solutions. The selection operator is a "survival-of-the-fittest" operation and chooses the solutions for the next generation. The crossover operator is analogous to biological mating, where children inherit a mixture of traits from their parents, and the mutation operator makes small random changes to the solutions.
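
    A toy GA showing the three operators just described (selection, crossover, mutation) on a stand-in permutation problem; the fitness function and parameters are invented for illustration and have nothing to do with the authors' fuel-management code or with reactor physics.

```python
# Toy genetic algorithm: selection, crossover, and mutation over permutations.
import random

rng = random.Random(1)

def fitness(pattern):
    # Stand-in objective: prefer patterns close to sorted order.
    return -sum(abs(p - i) for i, p in enumerate(pattern))

def crossover(a, b):
    cut = rng.randrange(1, len(a))
    # Copy a prefix of one parent, fill the rest from the other parent's order,
    # keeping the child a valid permutation.
    return a[:cut] + [g for g in b if g not in a[:cut]]

def mutate(pattern, rate=0.2):
    if rng.random() < rate:
        i, j = rng.sample(range(len(pattern)), 2)
        pattern[i], pattern[j] = pattern[j], pattern[i]
    return pattern

population = [rng.sample(range(8), 8) for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]                       # "survival of the fittest"
    children = [mutate(crossover(*rng.sample(parents, 2))) for _ in range(10)]
    population = parents + children

best = max(population, key=fitness)
print(best, fitness(best))
```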

  14. Coding Long Contour Shapes of Binary Objects

    NASA Astrophysics Data System (ADS)

    Sánchez-Cruz, Hermilo; Rodríguez-Díaz, Mario A.

    This is an extension of the paper that appeared in [15]. This time, we compare four methods: Arithmetic coding applied to 3OT chain code (Arith-3OT), Arithmetic coding applied to DFCCE (Arith-DFCCE), Huffman coding applied to DFCCE chain code (Huff-DFCCE), and, to measure the efficiency of the chain codes, we propose to compare the methods with JBIG, which constitutes an international standard. In the search for a better representation of contour shapes, our tests suggest that a sound method to represent contour shapes is 3OT, because Arithmetic coding applied to it gives the best results relative to JBIG, independently of the perimeter of the contour shapes.

  15. Random Binning and Turbo Source Coding for Lossless Compression of Memoryless Sources

    Microsoft Academic Search

    Javad Haghighat; Walaa Hamouda; M. Reza Soleymani

    2006-01-01

    We propose a tree structured variable length random binning scheme that enables an error correcting code to act as a source code. The existing source coding schemes based on turbo codes, low density parity check codes, and repeat accumulate codes can be regarded as practical implementations of this random binning scheme. We investigate the performance of lossless turbo source coding

  16. An Efficient Variable Length Coding Scheme for an IID Source

    NASA Technical Reports Server (NTRS)

    Cheung, K. -M.

    1995-01-01

    A scheme is examined for using two alternating Huffman codes to encode a discrete independent and identically distributed source with a dominant symbol. This combined strategy, or alternating runlength Huffman (ARH) coding, was found to be more efficient than ordinary coding in certain circumstances.

  17. A Mobile Functional Object Code

    E-print Network

    Chakravarty, Manuel

    The proposed monadic object code, called Foc, supports these features. It allows an architecture independent on the functional origin of the code. Foc is not a classical byte code, but a strongly-typed functional language simplify optimisations during specialisation for the architecture it is executed on. Furthermore, Foc

  18. Media independent interface

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The work done on the Media Independent Interface (MII) Interface Control Document (ICD) program is described and recommendations based on it are made. Explanations and rationale for the content of the ICD itself are presented.

  19. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) LAW ENFORCEMENT AND CRIMINAL INVESTIGATIONS LAW ENFORCEMENT REPORTING Offense Reporting § 635.19 Offense codes. (a) The...

  20. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) LAW ENFORCEMENT AND CRIMINAL INVESTIGATIONS LAW ENFORCEMENT REPORTING Offense Reporting § 635.19 Offense codes. (a) The...

  1. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) LAW ENFORCEMENT AND CRIMINAL INVESTIGATIONS LAW ENFORCEMENT REPORTING Offense Reporting § 635.19 Offense codes. (a) The...

  2. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) LAW ENFORCEMENT AND CRIMINAL INVESTIGATIONS LAW ENFORCEMENT REPORTING Offense Reporting § 635.19 Offense codes. (a) The...

  3. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY (CONTINUED) LAW ENFORCEMENT AND CRIMINAL INVESTIGATIONS LAW ENFORCEMENT REPORTING Offense Reporting § 635.19 Offense codes. (a) The...

  4. Quantum mutual independence

    E-print Network

    Michal Horodecki; Jonathan Oppenheim; Andreas Winter

    2009-11-05

    We introduce the concept of mutual independence -- correlations shared between distant parties which are independent of the environment. This notion is more general than the standard idea of a secret key -- it is a fully quantum and more general form of privacy. The states which possess mutual independence also generalize the so-called private states -- those that possess a private key. We then show that the problem of distributed compression of quantum information at distant sources can be solved in terms of mutual independence, if free entanglement between the senders and the receiver is available. Namely, we obtain a formula for the sum of rates of qubits needed to transmit a distributed state between Alice and Bob to a decoder Charlie. We also show that mutual independence is bounded from above by the relative entropy modulo a conjecture, saying that if after removal of a single qubit the state becomes product, its initial entanglement is bounded by 1. We suspect that mutual independence is a highly singular quantity, i.e. that it is positive only on a set of measure zero; furthermore, we believe that its presence is seen on the single copy level. This appears to be borne out in the classical case.

  5. Multilevel LDPC-Coded High-Speed Optical Systems: Efficient Hard Decoding and Code Optimization

    Microsoft Academic Search

    Chen Gong; Xiaodong Wang

    2010-01-01

    We consider a multilevel coding scheme employing low-density parity-check (LDPC) codes and high-order modulations for high-speed optical transmissions, where the coherent receiver performs either parallel independent decoding (PID) or multistage decoding (MSD). To meet the severe complexity constraint imposed by the ultrahigh data rate of the emerging optical transmission systems, we focus on hard-decision decoding of LDPC codes. A new

  6. Lymphoma Coding Guidelines

    Cancer.gov

    Coding Guidelines LYMPHOMA M9590/3-M9738/3 See the Hematopoietic and Lymphoid Neoplasm Case Reportability and Coding Manual and the Hematopoietic Database (DB) for more information and coding instructions. First Course of Therapy Do not code

  7. INVESTIGATION Targeted Capture of Homoeologous Coding

    E-print Network

    Wendel, Jonathan F.

    research inquiries involving polyploid plant genomes. KEYWORDS: Gossypium; allopolyploidy; homoeologs; sequence. Efficient and relatively inexpensive sequencing of hundreds to thousands of genes or genomic regions from many more individuals than is practical using whole-genome sequencing approaches. Here, we demonstrate

  8. Multifold Euclidean geometry codes

    Microsoft Academic Search

    Shu Lin

    1973-01-01

    This paper presents a class of majority-logic decodable codes whose structure is based on the structural properties of Euclidean geometries (EG) and codes that are invariant under the affine group of permutations. This new class of codes contains the ordinary EG codes and some generalized EG codes as subclasses. One subclass of new codes is particularly interesting: they are the

  9. Appendix A - County Codes

    Cancer.gov

    APPENDIX A: COUNTY CODES (SEER Program Code Manual, 3rd Edition, January 1998). The following are the valid county codes for coding county of residence at diagnosis: Reference:

  10. The Gray Code

    Microsoft Academic Search

    Robert W. Doran

    2007-01-01

    Here we summarise the properties and algorithms of the Gray code. Descriptions are given of the Gray code definition, algorithms and circuits for generating the code and for conversion between binary and Gray code, for incrementing, counting, and adding Gray code words. Some interesting applications of the code are also treated. Java implementations of the algorithms in this
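
    The binary-to-Gray and Gray-to-binary conversions the paper describes reduce to a handful of XOR and shift operations; a minimal version (in Python here rather than the paper's Java) is:

```python
# Binary <-> reflected Gray code conversion on non-negative integers.
def binary_to_gray(n):
    return n ^ (n >> 1)

def gray_to_binary(g):
    n = 0
    while g:          # accumulate g ^ (g >> 1) ^ (g >> 2) ^ ...
        n ^= g
        g >>= 1
    return n

for i in range(8):
    g = binary_to_gray(i)
    assert gray_to_binary(g) == i
    print(f"{i:03b} -> {g:03b}")   # successive Gray words differ in one bit
```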

  11. Coding for Satellite Communication

    Microsoft Academic Search

    William W. Wu; David Haccoun; Robert Peile; Yasuo Hirata

    1987-01-01

    This paper discusses a number of coding techniques for future satellite communication; they include Reed-Solomon error decoding for message blocks, probabilistic decoding techniques for punctured convolutional codes, and planar Euclidean geometry difference set codes for random multiple access applications. The provision of code concatenation, helical interleaving, and simulation results of new punctured convolutional codes are included. A number of coded

  12. Groundwater flow code verification "benchmarking" activity (COVE-2A): Analysis of participants' work

    SciTech Connect

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project.

  13. Independent NOAA considered

    NASA Astrophysics Data System (ADS)

    Richman, Barbara T.

    A proposal to pull the National Oceanic and Atmospheric Administration (NOAA) out of the Department of Commerce and make it an independent agency was the subject of a recent congressional hearing. Supporters within the science community and in Congress said that an independent NOAA will benefit by being more visible and by not being tied to a cabinet-level department whose main concerns lie elsewhere. The proposal's critics, however, cautioned that making NOAA independent could make it even more vulnerable to the budget axe and would sever the agency's direct access to the President. The separation of NOAA from Commerce was contained in a June 1 proposal by President Ronald Reagan that also called for all federal trade functions under the Department of Commerce to be reorganized into a new Department of International Trade and Industry (DITI).

  14. Independent technical review, handbook

    SciTech Connect

    Not Available

    1994-02-01

    Purpose: Provide an independent engineering review of the major projects being funded by the Department of Energy, Office of Environmental Restoration and Waste Management. The independent engineering review will address questions of whether the engineering practice is sufficiently developed to a point where a major project can be executed without significant technical problems. The independent review will focus on questions related to: (1) Adequacy of development of the technical base of understanding; (2) Status of development and availability of technology among the various alternatives; (3) Status and availability of the industrial infrastructure to support project design, equipment fabrication, facility construction, and process and program/project operation; (4) Adequacy of the design effort to provide a sound foundation to support execution of the project; (5) Ability of the organization to fully integrate the system, and direct, manage, and control the execution of a complex major project.

  15. Distributed single source coding with side information

    NASA Astrophysics Data System (ADS)

    Vila-Forcen, Jose E.; Koval, Oleksiy; Voloshynovskiy, Sviatoslav V.

    2004-01-01

    In this paper we advocate an image compression technique within the distributed source coding framework. The novelty of the proposed approach is twofold: classical image compression is considered from the perspective of source coding with side information and, contrary to existing scenarios, where side information is given explicitly, side information is created based on a deterministic approximation of local image features. We consider an image in the transform domain as a realization of a source with a bounded codebook of symbols where each symbol represents a particular edge shape. The codebook is image independent and plays the role of an auxiliary source. Due to the partial availability of side information at both encoder and decoder we treat our problem as a modification of the Berger-Flynn-Gray problem and investigate a possible gain over the solutions when side information is either unavailable or available only at the decoder. Finally, we present a practical compression algorithm for passport photo images based on our concept that demonstrates superior performance in the very low bit rate regime.

  16. NERO- a post-maximum supernova radiation transport code

    NASA Astrophysics Data System (ADS)

    Maurer, I.; Jerkstrand, A.; Mazzali, P. A.; Taubenberger, S.; Hachinger, S.; Kromer, M.; Sim, S.; Hillebrandt, W.

    2011-12-01

    The interpretation of supernova (SN) spectra is essential for deriving SN ejecta properties such as density and composition, which in turn can tell us about their progenitors and the explosion mechanism. A very large number of atomic processes are important for spectrum formation. Several tools for calculating SN spectra exist, but they mainly focus on the very early or late epochs. The intermediate phase, which requires a non-local thermodynamic equilibrium (NLTE) treatment of radiation transport has rarely been studied. In this paper, we present a new SN radiation transport code, NERO, which can look at those epochs. All the atomic processes are treated in full NLTE, under a steady-state assumption. This is a valid approach between roughly 50 and 500 days after the explosion depending on SN type. This covers the post-maximum photospheric and the early and the intermediate nebular phase. As a test, we compare NERO to the radiation transport code of Jerkstrand, Fransson & Kozma and to the nebular code of Mazzali et al. All three codes have been developed independently and a comparison provides a valuable opportunity to investigate their reliability. Currently, NERO is one-dimensional and can be used for predicting spectra of synthetic explosion models or for deriving SN properties by spectral modelling. To demonstrate this, we study the spectra of the 'normal' Type Ia supernova (SN Ia) 2005cf between 50 and 350 days after the explosion and identify most of the common SN Ia line features at post-maximum epochs.

  17. Utilizing sequence intrinsic composition to classify protein-coding and long non-coding transcripts.

    PubMed

    Sun, Liang; Luo, Haitao; Bu, Dechao; Zhao, Guoguang; Yu, Kuntao; Zhang, Changhai; Liu, Yuanning; Chen, Runsheng; Zhao, Yi

    2013-09-01

    It is a challenge to classify protein-coding or non-coding transcripts, especially those re-constructed from high-throughput sequencing data of poorly annotated species. This study developed and evaluated a powerful signature tool, Coding-Non-Coding Index (CNCI), by profiling adjoining nucleotide triplets to effectively distinguish protein-coding and non-coding sequences independent of known annotations. CNCI is effective for classifying incomplete transcripts and sense-antisense pairs. The implementation of CNCI offered highly accurate classification of transcripts assembled from whole-transcriptome sequencing data in a cross-species manner, demonstrated gene evolutionary divergence between vertebrates and invertebrates, or between plants, and provided a long non-coding RNA catalog of orangutan. CNCI software is available at http://www.bioinfo.org/software/cnci. PMID:23892401
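
    As a rough illustration of what "profiling adjoining nucleotide triplets" means, the toy snippet below counts pairs of adjacent, non-overlapping triplets in a sequence; it shows only the kind of feature involved and is not the CNCI scoring or classification algorithm.

```python
# Count how often each pair of adjacent, non-overlapping triplets occurs.
from collections import Counter

def adjoining_triplet_profile(seq):
    triplets = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    return Counter(zip(triplets, triplets[1:]))

profile = adjoining_triplet_profile("ATGGCCATTGTAATGGGCCGC")
for (t1, t2), count in profile.most_common(3):
    print(t1, t2, count)
```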

  18. V(D)J recombination coding junction formation without DNA homology: processing of coding termini.

    PubMed Central

    Boubnov, N V; Wills, Z P; Weaver, D T

    1993-01-01

    Coding junction formation in V(D)J recombination generates diversity in the antigen recognition structures of immunoglobulin and T-cell receptor molecules by combining processes of deletion of terminal coding sequences and addition of nucleotides prior to joining. We have examined the role of coding end DNA composition in junction formation with plasmid substrates containing defined homopolymers flanking the recombination signal sequence elements. We found that coding junctions formed efficiently with or without terminal DNA homology. The extent of junctional deletion was conserved independent of coding ends with increased, partial, or no DNA homology. Interestingly, G/C homopolymer coding ends showed reduced deletion regardless of DNA homology. Therefore, DNA homology cannot be the primary determinant that stabilizes coding end structures for processing and joining. PMID:8413286

  19. Conservation IEAB Independent Economic

    E-print Network

    Potential role of environmental credit markets. Impacts from Council. IEAB Independent Economic Analysis Board: Environmental Credit Markets. Acres that have carbon sequestration benefits. Can the carbon credits from habitat projects offset carbon

  20. Independent power generator

    NASA Technical Reports Server (NTRS)

    Young, R. N. (inventor)

    1978-01-01

    A gas turbine powered aircraft auxiliary power system is described which is capable of efficiently supplying all aircraft auxiliary services both in flight and on the ground and is further capable of operating independently of the aircraft main engines. The system employs multiple gas turbine compressor stages, thereby accomplishing cabin pressurization, ventilation and heating.

  1. IEAB Independent Analysis Board

    E-print Network

    Effectiveness of Improved Irrigation Efficiency and Water Transactions for Instream Flow for Fish. Independent Economic Analysis Board. IEAB: Irrigation Efficiency and Water Transactions, December 2011. 3.0 Costs of Irrigation Efficiency and Water Transaction Programs

  2. Discriminant independent component analysis.

    PubMed

    Dhir, Chandra Shekhar; Lee, Soo-Young

    2011-06-01

    A conventional linear model based on Negentropy maximization extracts statistically independent latent variables which may not be optimal to give a discriminant model with good classification performance. In this paper, a single-stage linear semisupervised extraction of discriminative independent features is proposed. Discriminant independent component analysis (dICA) presents a framework of linearly projecting multivariate data to a lower dimension where the features are maximally discriminant with minimal redundancy. The optimization problem is formulated as the maximization of linear summation of Negentropy and weighted functional measure of classification. Motivated by independence among extracted features, Fisher linear discriminant is used as the functional measure of classification. Experimental results show improved classification performance when dICA features are used for recognition tasks in comparison to unsupervised (principal component analysis and ICA) and supervised feature extraction techniques like linear discriminant analysis (LDA), conditional ICA, and those based on information theoretic learning approaches. dICA features also give reduced data reconstruction error in comparison to LDA and ICA method based on Negentropy maximization. PMID:21521666
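
    The objective described in the abstract can be written schematically as follows; the symbols (demixing matrix W, features y_i, negentropy J, Fisher-type class-separation measure F, class labels c, and weight λ) are notation introduced here for illustration, not taken from the paper.

```latex
% Schematic dICA objective: maximize summed negentropy of the extracted
% features plus a weighted functional measure of classification.
\[
  \max_{W}\;\; \sum_{i=1}^{m} J\!\left(y_i\right) \;+\; \lambda\, F\!\left(y_1,\dots,y_m;\, c\right),
  \qquad y_i = \mathbf{w}_i^{\top}\mathbf{x}.
\]
```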

  3. Independence of Velocity

    NSDL National Science Digital Library

    Michael Horton

    2009-05-30

    This inquiry activity should be completed before discussing with students that a projectile's motion in the vertical direction is independent of its motion in the horizontal direction. As long as students use their apparatus carefully and don't flip coins

  4. The Proteomic Code: a molecular recognition code for proteins

    PubMed Central

    Biro, Jan C

    2007-01-01

    Background: The Proteomic Code is a set of rules by which information in genetic material is transferred into the physico-chemical properties of amino acids. It determines how individual amino acids interact with each other during folding and in specific protein-protein interactions. The Proteomic Code is part of the redundant Genetic Code. Review: The 25-year-old history of this concept is reviewed from the first independent suggestions by Biro and Mekler, through the works of Blalock, Root-Bernstein, Siemion, Miller and others, followed by the discovery of a Common Periodic Table of Codons and Nucleic Acids in 2003 and culminating in the recent conceptualization of partial complementary coding of interacting amino acids as well as the theory of the nucleic acid-assisted protein folding. Methods and conclusions: A novel cloning method for the design and production of specific, high-affinity-reacting proteins (SHARP) is presented. This method is based on the concept of proteomic codes and is suitable for large-scale, industrial production of specifically interacting peptides. PMID:17999762

  5. Homological stabilizer codes

    SciTech Connect

    Anderson, Jonas T., E-mail: jonastyleranderson@gmail.com

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. Highlights: We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. We find and classify all 2D homological stabilizer codes. We find optimal codes among the homological stabilizer codes.

  6. Error locating for plausible Wyner-Ziv video coding using turbo codes

    Microsoft Academic Search

    R. Hansel; E. Muller

    2009-01-01

    Distributed video coding (DVC, Wyner-Ziv coding) has gained a lot of interest over the past decade. The major application of DVC is low complexity video encoding, which is well investigated in the literature. Generally, a distributed video coding system uses a feedback channel for rate control; otherwise the data sent cannot be used for improved reconstruction quality in most cases. The

  7. A generalized interface module for the coupling of spatial kinetics and thermal-hydraulics codes

    Microsoft Academic Search

    D. A. Barber; R. M. Miller; H. G. Joo; T. J. Downar; W. Wang; V. A. Mousseau; D. D. Ebert

    1999-01-01

    A generalized interface module has been developed for the coupling of any thermal-hydraulics code to any spatial kinetics code. The coupling scheme was designed and implemented with emphasis placed on maximizing flexibility while minimizing modifications to the respective codes. In this design, the thermal-hydraulics, general interface, and spatial kinetics codes function independently and utilize the Parallel Virtual Machine software to

  8. Effect of separation efficiency on repository loading values in fuel cycle scenario analysis codes

    SciTech Connect

    Radel, T.E.; Wilson, P.P.H.; Grady, R.M. [U. Wisconsin-Madison, 1500 Engineering Dr, Madison, WI 53711 (United States); Bauer, T.H. [Argonne National Laboratory, 9700 S. Cass Ave, Argonne, IL, 60439 (United States)

    2007-07-01

    Fuel cycle scenario analysis codes are valuable tools for investigating the effects of various decisions on the performance of the nuclear fuel cycle as a whole. Until recently, repository metrics in such codes were based on mass and were independent of the isotopic composition of the waste. A methodology has been developed for determining peak repository loading for an arbitrary set of isotopics based on the heat load restrictions and current geometry specifications for the Yucca Mountain repository. This model was implemented in the VISION fuel cycle scenario analysis code and is used here to study the effects of separation efficiencies on repository loading for various AFCI fuel cycle scenarios. Improved separations efficiencies are shown to have continuing technical benefit in fuel cycles that recycle Am and Cm, but a substantial benefit can be achieved with modest separation efficiencies. (authors)

  9. Reviewing the Challenges and Opportunities Presented by Code Switching and Mixing in Bangla

    ERIC Educational Resources Information Center

    Hasan, Md. Kamrul; Akhand, Mohd. Moniruzzaman

    2014-01-01

    This paper investigates the issues related to code-switching/code-mixing in an ESL context. Some preliminary data on Bangla-English code-switching/code-mixing has been analyzed in order to determine which structural pattern of code-switching/code-mixing is predominant in different social strata. This study also explores the relationship of…

  10. Independent Lens: Interactive

    NSDL National Science Digital Library

    2005-01-01

    Over the past few years, Independent Lens has produced a number of well-received documentaries that have aired on PBS and other places. They have also created some very nice websites in an attempt to enhance the viewing experience of their programs. The Independent Lens: Interactive site offers some additional web-original projects for the interested public. Some of these features include Beyond the Fire, which introduces visitors to the stories of fifteen teenagers living in the US, who have survived war in seven different regions. One very compelling highlight of the site is the Off the Map feature. Here visitors can learn about the visionary art produced by a selection of persons working in various media, such as bottle caps, matchsticks, and chewing gum. For those looking for something with a unique perspective on the world and its inhabitants, this website will definitely bring a smile to their eyes.

  11. Speaker-independent phone recognition using hidden Markov models

    Microsoft Academic Search

    Kai-fu Lee; Hsiao-wuen Hon

    1989-01-01

    Hidden Markov modeling is extended to speaker-independent phone recognition. Using multiple codebooks of various linear-predictive-coding (LPC) parameters and discrete hidden Markov models (HMMs) the authors obtain a speaker-independent phone recognition accuracy of 58.8-73.8% on the TIMIT database, depending on the type of acoustic and language models used. In comparison, the performance of expert spectrogram readers is only 69% without use

  12. Agent independent task planning

    NASA Technical Reports Server (NTRS)

    Davis, William S.

    1990-01-01

    Agent-Independent Planning is a technique that allows the construction of activity plans without regard to the agent that will perform them. Once generated, a plan is then validated and translated into instructions for a particular agent, whether a robot, crewmember, or software-based control system. Because Space Station Freedom (SSF) is planned for orbital operations for approximately thirty years, it will almost certainly experience numerous enhancements and upgrades, including upgrades in robotic manipulators. Agent-Independent Planning provides the capability to construct plans for SSF operations, independent of specific robotic systems, by combining techniques of object oriented modeling, nonlinear planning and temporal logic. Since a plan is validated using the physical and functional models of a particular agent, new robotic systems can be developed and integrated with existing operations in a robust manner. This technique also provides the capability to generate plans for crewmembers with varying skill levels, and later apply these same plans to more sophisticated robotic manipulators made available by evolutions in technology.

  13. Fire investigation

    NASA Astrophysics Data System (ADS)

    Gomberg, A.

    Considerable progress was made on several fronts of fire investigation in the United States in recent years. Progress was made in increasing the quantity of fire investigation and reporting, through efforts to develop the National Fire Incident Reporting System. Improving overall quality of fire investigation is the objective of efforts such as the Fire Investigation Handbook, which was developed and published by the National Bureau of Standards, and the upgrading and expanding of the "dictionary" of fire investigation and reporting, the NFPA 901, Uniform Coding for Fire Protection, system. The science of fire investigation was also furthered by new approaches to post-fire interviews being developed at the University of Washington, and by in-depth research into factors involved in several large-loss fires, including the MGM Grand Hotel in Las Vegas. Finally, the use of special study fire investigations - in-depth investigations concentrating on specific fire problems - is producing new glimpses into the nature of the national fire problem. The status of efforts in each of these areas is briefly discussed.

  14. Characterizing History Independent Data Structures

    E-print Network

    Bustamante, Fabián E.

    Characterizing History Independent Data Structures. Jason D. Hartline, Edwin S. Hong, Alexander ... history independent data structures as proposed for study by Teague and Naor [2]. In a history independent ... is available from the abstract data structure. We show that for the most part, strong history independent data

  15. Characterizing History Independent Data Structures

    E-print Network

    Bustamante, Fabián E.

    We characterize history independent data structures as proposed for study by Teague and Naor [2]. In a history independent data structure, nothing can be learned beyond what is available from the abstract data structure. We show that for the most part, strong history independent data

  16. Myth or Truth: Independence Day.

    ERIC Educational Resources Information Center

    Gardner, Traci

    Most Americans think of the Fourth of July as Independence Day, but is it really the day the U.S. declared and celebrated independence? By exploring myths and truths surrounding Independence Day, this lesson asks students to think critically about commonly believed stories regarding the beginning of the Revolutionary War and the Independence Day…

  17. Phylogeny of genetic codes and punctuation codes within genetic codes.

    PubMed

    Seligmann, Hervé

    2015-03-01

    Punctuation codons (starts, stops) delimit genes, reflect translation apparatus properties. Most codon reassignments involve punctuation. Here two complementary approaches classify natural genetic codes: (A) properties of amino acids assigned to codons (classical phylogeny), coding stops as X (A1, antitermination/suppressor tRNAs insert unknown residues), or as gaps (A2, no translation, classical stop); and (B) considering only punctuation status (start, stop and other codons coded as -1, 0 and 1 (B1); 0, -1 and 1 (B2, reflects ribosomal translational dynamics); and 1, -1, and 0 (B3, starts/stops as opposites)). All methods separate most mitochondrial codes from most nuclear codes; Gracilibacteria consistently cluster with metazoan mitochondria; mitochondria co-hosted with chloroplasts cluster with nuclear codes. Method A1 clusters the euplotid nuclear code with metazoan mitochondria; A2 separates euplotids from mitochondria. Firmicute bacteria Mycoplasma/Spiroplasma and Protozoan (and lower metazoan) mitochondria share codon-amino acid assignments. A1 clusters them with mitochondria, they cluster with the standard genetic code under A2: constraints on amino acid ambiguity versus punctuation-signaling produced the mitochondrial versus bacterial versions of this genetic code. Punctuation analysis B2 converges best with classical phylogenetic analyses, stressing the need for a unified theory of genetic code punctuation accounting for ribosomal constraints. PMID:25600501
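
    The punctuation-only coding of method B amounts to replacing each codon's amino-acid assignment with a small integer. The sketch below is a minimal Python illustration under a B1-style scheme (start = -1, stop = 0, other = 1); the handful of codon assignments is only a fragment of the standard code, and the B2/B3 variants would simply swap the integer values.

        # Encode a genetic code by punctuation status only (B1-style: start=-1, stop=0, other=1).
        # The few codon assignments below are a fragment of the standard code, for illustration.
        standard_fragment = {
            "ATG": "start", "TAA": "stop", "TAG": "stop", "TGA": "stop",
            "TTT": "Phe", "GGC": "Gly", "AAA": "Lys",
        }

        B1 = {"start": -1, "stop": 0}          # every sense codon maps to 1

        def punctuation_vector(code, scheme=B1):
            """Return codons in a fixed order and their punctuation-status vector."""
            codons = sorted(code)
            return codons, [scheme.get(code[c], 1) for c in codons]

        codons, vec = punctuation_vector(standard_fragment)
        print(list(zip(codons, vec)))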

  18. Hybrid Distributed Video Coding Using SCA Codes

    Microsoft Academic Search

    Emin Martinian; Anthony Vetro; Jonathan S. Yedidia; Joao Ascenso; Ashish Khisti; Dmitry Malioutov

    2006-01-01

    We describe the architecture for our distributed video coding (DVC) system. Some key differences between our work and previous systems include a new method of enabling decoder motion compensation, and the use of serially concatenated accumulate syndrome codes for distributed source coding. To evaluate performance, we compare our system to the H.263+ and H.264\\/AVC video codecs. Experiments show that our

  19. On quantum advantage in dense coding

    E-print Network

    M. Horodecki; M. Piani

    2009-09-11

    The quantum advantage of dense coding is studied, considering general encoding quantum operations. Particular attention is devoted to the case of many senders, and it is shown that restrictions on the possible operations on the senders' side may make some quantum states useless for dense coding. It is shown, e.g., that some states are useful for dense coding if the senders can communicate classically (but not quantumly), yet they cannot be used for dense coding if classical communication is not allowed. These no-go results are actually independent of the particular quantification of the quantum advantage, being valid for any reasonable choice. It is further shown that the quantum advantage of dense coding satisfies a monogamy relation with the so-called entanglement of purification.

  20. The Influence of Board Independence, Competency and Ownership on Earnings Management in Malaysia

    Microsoft Academic Search

    HASHIMAH JOHARI; MOHD SALEH; SABRI HASSAN

    2008-01-01

    This paper examines the roles of independent members on the board, chief executive officer who also serves as a chairman of the company, board competency and management's share ownership on earnings management practices. Different from prior research, it also investigates whether independent board competency (an interaction of independence and competency) and independent board share ownership (an interaction of independence and

  1. Generalized Concatenated Quantum Codes

    E-print Network

    Markus Grassl; Peter Shor; Graeme Smith; John Smolin; Bei Zeng

    2009-01-09

    We introduce the concept of generalized concatenated quantum codes. This generalized concatenation method provides a systematical way for constructing good quantum codes, both stabilizer codes and nonadditive codes. Using this method, we construct families of new single-error-correcting nonadditive quantum codes, in both binary and nonbinary cases, which not only outperform any stabilizer codes for finite block length, but also asymptotically achieve the quantum Hamming bound for large block length.

  2. Codeword Stabilized Quantum Codes

    E-print Network

    Andrew Cross; Graeme Smith; John A. Smolin; Bei Zeng

    2007-09-27

    We present a unifying approach to quantum error correcting code design that encompasses additive (stabilizer) codes, as well as all known examples of nonadditive codes with good parameters. We use this framework to generate new codes with superior parameters to any previously known. In particular, we find ((10,18,3)) and ((10,20,3)) codes. We also show how to construct encoding circuits for all codes within our framework.

  3. Certifying Auto-Generated Flight Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. The annotation inference algorithm itself is generic, and parametrized with respect to a library of coding patterns that depend on the safety policies and the code generator. The patterns characterize the notions of definitions and uses that are specific to the given safety property. For example, for initialization safety, definitions correspond to variable initializations while uses are statements which read a variable, whereas for array bounds safety, definitions are the array declarations, while uses are statements which access an array variable. The inferred annotations are thus highly dependent on the actual program and the properties being proven. The annotations, themselves, need not be trusted, but are crucial to obtain the automatic formal verification of the safety properties without requiring access to the internals of the code generator. The approach has been applied to both in-house and commercial code generators, but is independent of the particular generator used. It is currently being adapted to flight code generated using MathWorks Real-Time Workshop, an automatic code generator that translates from Simulink/Stateflow models into embedded C code.
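
    The definition/use idea behind the initialization-safety policy can be illustrated with a toy checker; the sketch below is not the AutoCert tool or its pattern library, just a hand-rolled Python example over simplified statements.

        # Toy initialization-safety check in the spirit of definition/use patterns:
        # a "definition" initializes a variable, a "use" reads it. Statements are
        # simplified (assignment target plus read variables); not the AutoCert analysis.
        from typing import NamedTuple

        class Stmt(NamedTuple):
            target: str          # variable written by this statement ('' if none)
            reads: tuple         # variables read by this statement

        program = [
            Stmt("x", ()),            # x := 0
            Stmt("y", ("x",)),        # y := x + 1
            Stmt("",  ("z",)),        # print(z)   <- z is used before any definition
        ]

        def check_init_before_use(stmts):
            defined = set()
            for i, s in enumerate(stmts):
                for v in s.reads:
                    if v not in defined:
                        print(f"statement {i}: variable '{v}' may be used before initialization")
                if s.target:
                    defined.add(s.target)

        check_init_before_use(program)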

  4. Subsystem codes with spatially local generators

    SciTech Connect

    Bravyi, Sergey [IBM T. J. Watson Research Center, Yorktown Heights, New York 10598 (United States)

    2011-01-15

    We study subsystem codes whose gauge group has local generators in two-dimensional (2D) geometry. It is shown that there exists a family of such codes defined on lattices of size L x L with the number of logical qubits k and the minimum distance d both proportional to L. The gauge group of these codes involves only two-qubit generators of type XX and ZZ coupling nearest-neighbor qubits (and some auxiliary one-qubit generators). Our proof is not constructive as it relies on a certain version of the Gilbert-Varshamov bound for classical codes. Along the way, we introduce and study properties of generalized Bacon-Shor codes that might be of independent interest. Secondly, we prove that any 2D subsystem [n,k,d] code with spatially local generators obeys upper bounds kd = O(n) and d^2 = O(n). The analogous upper bound proved recently for 2D stabilizer codes is kd^2 = O(n). Our results thus demonstrate that subsystem codes can be more powerful than stabilizer codes under the spatial locality constraint.

  5. Investigation of the Performance of Various CVD Diamond Crystal Qualities for the Measurement of Radiation Doses from a Low Energy Mammography X-Ray Beam, Compared with MC Code (PENELOPE) Calculations

    NASA Astrophysics Data System (ADS)

    Zakari, Y. I.; Mavunda, R. D.; Nam, T. L.; Keddy, R. J.

    The tissue equivalence of diamond allows for accurate radiation dose determination without large corrections for different attenuation values in biological tissue, but its low Z value limits this advantage to lower-energy photons, such as those in mammography X-ray beams. This paper assays the performance of nine chemical vapour deposition (CVD) diamonds for use as radiation-sensing material. The specimens, fabricated in wafer form, are classified as detector grade, optical grade and single crystal. It is well known that the presence of defects in diamonds, including CVD specimens, not only dictates but also affects the response of diamond to radiation in different ways. In this investigation, tools such as electron spin resonance (ESR), thermoluminescence (TL), Raman spectroscopy and ultraviolet (UV) spectroscopy were used to probe each of the samples. The linearity, sensitivity and other characteristics of the detector response to photon interaction were analyzed from the I-V characteristics. The diamonds, categorized into four each of the so-called detector and optical grades plus a single-crystal CVD specimen, were exposed to low X-ray peak voltages (22 to 27 kVp) with trans-crystal polarizing fields of 0.4 kV.cm-1, 0.66 kV.cm-1 and 0.8 kV.cm-1. The presentation discusses the defects identifiable by the techniques used and correlates the radiation performance of the three types of crystals with their presence. The choice of a wafer as either a spectrometer or an X-ray dosimeter within the selected energy range was made. The analysis was validated with the Monte Carlo code PENELOPE.

  6. An independent hydrogen source

    SciTech Connect

    Kobzenko, G.F.; Chubenko, M.V.; Kobzenko, N.S.; Senkevich, A.I.; Shkola, A.A.

    1985-10-01

    Descriptions are given of the design and operation of an independent hydrogen source used in purifying and storing hydrogen. If LaNi5 or TiFe is used as the sorbent, one can store about 500 liters of chemically bound hydrogen in a vessel of 0.9 liter. Molecular purification of the desorbed hydrogen is used. The IHS is a safe hydrogen source, since the hydrogen is trapped in the sorbent in the chemically bound state and in equilibrium with LaNi5Hx at room temperature. If necessary, the IHS can serve as a compressor and provide higher hydrogen pressures. The device is compact and transportable.

  7. Code Attestation with Compressed Instruction Code

    E-print Network

    Vetter, Benjamin

    2011-01-01

    Available purely software-based code attestation protocols have recently been shown to be cheatable. In this work we propose to upload compressed instruction code to make the code attestation protocol robust against a so-called compression attack. The described secure code attestation protocol makes use of recently proposed microcontroller architectures for reading out compressed instruction code. We point out that the proposed concept only makes sense if the provided cost/benefit ratio for the aforementioned microcontroller is higher than an alternative hardware-based solution requiring a tamper-resistant hardware module.

  8. Independent Lens: Butte, America

    NSDL National Science Digital Library

    Butte, Montana was a hard-rock mining town that supplied the United States with the copper much needed for the electrification of the nation. The documentary created by Independent Lens of PBS shows the hardship the miners and their families encountered. The Independent Lens website has a multitude of interactive features that add depth and increased understanding to the film. To find when and on what PBS station the film is playing, visitors can click the link "Check Local Listings". Under the "The Film" tab, three clips of the film are available, and under the "The Making of" tab, visitors can find details of the difficulties the film crew faced in filming the underground mining tunnels. The filmmaker also addresses the challenges of working in 16mm film, and the painful decisions of what scenes to cut. "Related Links" can also be found at the bottom of "The Film" link and provides links to several articles on the town of Butte, as well as to the filmmaker's website.

  9. Melanism in Peromyscus Is Caused by Independent Mutations in Agouti

    PubMed Central

    Kingsley, Evan P.; Manceau, Marie; Wiley, Christopher D.; Hoekstra, Hopi E.

    2009-01-01

    Identifying the molecular basis of phenotypes that have evolved independently can provide insight into the ways genetic and developmental constraints influence the maintenance of phenotypic diversity. Melanic (darkly pigmented) phenotypes in mammals provide a potent system in which to study the genetic basis of naturally occurring mutant phenotypes because melanism occurs in many mammals, and the mammalian pigmentation pathway is well understood. Spontaneous alleles of a few key pigmentation loci are known to cause melanism in domestic or laboratory populations of mammals, but in natural populations, mutations at one gene, the melanocortin-1 receptor (Mc1r), have been implicated in the vast majority of cases, possibly due to its minimal pleiotropic effects. To investigate whether mutations in this or other genes cause melanism in the wild, we investigated the genetic basis of melanism in the rodent genus Peromyscus, in which melanic mice have been reported in several populations. We focused on two genes known to cause melanism in other taxa, Mc1r and its antagonist, the agouti signaling protein (Agouti). While variation in the Mc1r coding region does not correlate with melanism in any population, in a New Hampshire population, we find that a 125-kb deletion, which includes the upstream regulatory region and exons 1 and 2 of Agouti, results in a loss of Agouti expression and is perfectly associated with melanic color. In a second population from Alaska, we find that a premature stop codon in exon 3 of Agouti is associated with a similar melanic phenotype. These results show that melanism has evolved independently in these populations through mutations in the same gene, and suggest that melanism produced by mutations in genes other than Mc1r may be more common than previously thought. PMID:19649329

  10. Description of ground motion data processing codes: Volume 3

    SciTech Connect

    Sanders, M.L.

    1988-02-01

    Data processing codes developed to process ground motion at the Nevada Test Site for the Weapons Test Seismic Investigations Project are used today as part of the program to process ground motion records for the Nevada Nuclear Waste Storage Investigations Project. The work contained in this report documents and lists the codes and verifies the "PSRV" code. 39 figs.

  11. A space-time coded OFDM with dual Viterbi decoder

    Microsoft Academic Search

    Seog Geun Kang; Chi Chung Ko

    2002-01-01

    In this paper, a space-time coded orthogonal frequency division multiplexing (STC-OFDM) scheme with dual Viterbi decoder is proposed and analyzed. In this scheme, two independent half-rate OFDM symbols are generated after convolutional coding. A dual Viterbi decoder is used for the independent decoding of the recovered sequence, and their path metrics are compared. Accordingly, the recovered binary data is a

  12. Progress in The Semantic Analysis of Scientific Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.

  13. Good quantum error-correcting codes exist

    Microsoft Academic Search

    A. R. Calderbank; Peter W. Shor

    1996-01-01

    A quantum error-correcting code is defined to be a unitary mapping (encoding) of {ital k} qubits (two-state quantum systems) into a subspace of the quantum state space of {ital n} qubits such that if any {ital t} of the qubits undergo arbitrary decoherence, not necessarily independently, the resulting {ital n} qubits can be used to faithfully reconstruct the original quantum

  14. Good quantum error-correcting codes exist

    Microsoft Academic Search

    A. R. Calderbank; Peter W. Shor

    1995-01-01

    A quantum error-correcting code is defined to be a unitary mapping (encod- ing) of k qubits (2-state quantum systems) into a subspace of the quantum state space of n qubits such that if any t of the qubits undergo arbitrary decoherence, not necessarily independently, the resulting n qubits can be used to faithfully reconstruct the original quantum state of the

  15. Speech coding based upon vector quantization

    Microsoft Academic Search

    ANDRES BUZO; R. Gray; J. Markel

    1980-01-01

    With rare exception, all presently available narrow-band speech coding systems implement scalar quantization (independent quantization) of the transmission parameters (such as reflection coefficients or transformed reflection coefficients in LPC systems). This paper presents a new approach called vector quantization. For very low data rates, realistic experiments have shown that vector quantization can achieve a given level of average distortion with
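
    A minimal sketch of the vector-quantization idea is a k-means style codebook trainer that alternates nearest-codeword assignment with centroid updates (an LBG-like iteration). The random data below merely stands in for LPC parameter vectors; this is an illustration, not the paper's system.

        import numpy as np

        def train_vq(vectors, codebook_size, iters=20, seed=0):
            """K-means style codebook training: minimize average distortion."""
            rng = np.random.default_rng(seed)
            codebook = vectors[rng.choice(len(vectors), codebook_size, replace=False)]
            for _ in range(iters):
                # Nearest-codeword assignment (the vector quantization step).
                d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
                idx = d.argmin(axis=1)
                # Centroid update; keep the old codeword if a cell is empty.
                for k in range(codebook_size):
                    if np.any(idx == k):
                        codebook[k] = vectors[idx == k].mean(axis=0)
            return codebook, idx

        data = np.random.default_rng(1).normal(size=(1000, 4))   # stand-in for LPC parameter vectors
        codebook, idx = train_vq(data, codebook_size=8)
        print("average distortion:", ((data - codebook[idx]) ** 2).sum(axis=1).mean())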

  16. An interactive morse code emulation management system

    Microsoft Academic Search

    Cheng-Hong Yang

    2003-01-01

    Assistive technology (AT) is becoming increasingly important in improving the mobility, language, and learning capabilities of persons who have disabilities, enabling them to function independently and to improve their social opportunities. Morse code has been shown to be a valuable tool in assistive technology, augmentative and alternative communication, rehabilitation, and education, as well as in adapted computer access methods via special software
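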

  17. Transionospheric Propagation Code (TIPC)

    SciTech Connect

    Roussel-Dupre, R.; Kelley, T.A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Lab to perform certain tasks related to the detection of vhf signals following propagation through the ionosphere. The code is written in Fortran 77, runs interactively and was designed to be as machine independent as possible. A menu format in which the user is prompted to supply appropriate parameters for a given task has been adopted for the input while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, DTOA study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise of spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of delta-times-of-arrival (DTOAs) vs TECs for a specified pair of receivers.
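
    The first-order characterization of the ionosphere in terms of TEC can be illustrated with the standard 40.3*TEC/f^2 group-delay relation; the sketch below builds a small delay-versus-TEC table in Python and is an independent illustration, not code taken from TIPC.

        # First-order ionospheric group delay: extra path length ~ 40.3 * TEC / f^2 (metres),
        # with TEC in electrons/m^2 and f in Hz; divide by c for a time delay. This is the
        # standard first-order relation, not the TIPC implementation.
        C = 299_792_458.0                      # speed of light, m/s

        def group_delay(tec_el_per_m2, freq_hz):
            return 40.3 * tec_el_per_m2 / (freq_hz ** 2) / C   # seconds

        def dtoa_table(tec_values, freq_hz, baseline_delay_s=0.0):
            """Delta-time-of-arrival contribution versus TEC along the propagation path."""
            return [(tec, baseline_delay_s + group_delay(tec, freq_hz)) for tec in tec_values]

        for tec, dtoa in dtoa_table([1e16, 5e16, 1e17], freq_hz=100e6):
            print(f"TEC {tec:.1e} el/m^2 -> extra delay {dtoa * 1e6:.2f} microseconds")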

  18. Can gesture establish an independent communication channel?

    Microsoft Academic Search

    Yong Xu I; Kazuhiro Ueda; Takanori Komatsu; Takeshi Okadome

    2007-01-01

    There exist two types of communication channels in human communication: verbal channel and nonverbal channel. Gesture is one of the most often-used channels in nonverbal communication since people frequently use gestures to communicate. In order to investigate whether gestures can establish an independent communication channel in human dyadic communication, authors conducted an experiment using a maze exploration task for observing

  19. Reusable State Machine Code Generator

    NASA Astrophysics Data System (ADS)

    Hoffstadt, A. A.; Reyes, C.; Sommer, H.; Andolfato, L.

    2010-12-01

    The State Machine model is frequently used to represent the behaviour of a system, allowing one to express and execute this behaviour in a deterministic way. A graphical representation such as a UML State Chart diagram tames the complexity of the system, thus facilitating changes to the model and communication between developers and domain experts. We present a reusable state machine code generator, developed by the Universidad Técnica Federico Santa María and the European Southern Observatory. The generator itself is based on the open source project architecture, and uses UML State Chart models as input. This allows for a modular design and a clean separation between generator and generated code. The generated state machine code has well-defined interfaces that are independent of the implementation artefacts such as the middle-ware. This allows using the generator in the substantially different observatory software of the Atacama Large Millimeter Array and the ESO Very Large Telescope. A project-specific mapping layer for event and transition notification connects the state machine code to its environment, which can be the Common Software of these projects, or any other project. This approach even allows to automatically create tests for a generated state machine, using techniques from software testing, such as path-coverage.

  20. Bit Interleaved Coded Multiple Beamforming

    Microsoft Academic Search

    Enis Akay; Ersin Sengul; Ender Ayanoglu

    2007-01-01

    In this paper, we investigate the performance of bit-interleaved coded multiple beamforming (BICMB). We provide interleaver design criteria such that BICMB achieves full spatial multiplexing of min( N, M) and full spatial diversity of NM with N transmit and M receive antennas over quasi-static Rayleigh flat fading channels. If the channel is frequency selective, then BICMB is combined with orthogonal

  1. Performance bounds for fractal coding

    Microsoft Academic Search

    Bernd Hiirtgen; Rwth Aachen

    1995-01-01

    Reports on investigations concerning the performance of fractal transforms. Emerging from the structural constraints of fractal coding schemes, lower bounds for the reconstruction error are given without regarding quantization noise. This implies finding an at least locally optimal transformation matrix. A full search approach is by definition optimal but also intractable for practical implementations. In order to simplify the calculation

  2. Publications NODC Taxonomic Code and

    E-print Network

    of worldwide flora and fauna from viruses to mammals. The code was developed to simplify and systematize computer processing of data about marine organisms. Because of its flexibility and scope, however, it can facilitate the exchange of data collected by different investigators and the production of uniform computer-generated data

  3. Concatenated Coding Using Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Thompson, Michael W.

    1997-01-01

    In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage, thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted toward developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and RS coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, we see that TCM-based concatenated coding results in roughly 10-50% bandwidth expansion, compared to 70-150% expansion for similar concatenated schemes which use convolutional codes. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.

  4. Coset Codes Viewed as Terminated Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1996-01-01

    In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C(sub 1), of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C(sub 1), while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

  5. Frame independent cosmological perturbations

    SciTech Connect

    Prokopec, Tomislav; Weenink, Jan, E-mail: t.prokopec@uu.nl, E-mail: j.g.weenink@uu.nl [Institute for Theoretical Physics and Spinoza Institute, Utrecht University, Leuvenlaan 4, 3585 CE Utrecht (Netherlands)

    2013-09-01

    We compute the third order gauge invariant action for scalar-graviton interactions in the Jordan frame. We demonstrate that the gauge invariant action for scalar and tensor perturbations on one physical hypersurface only differs from that on another physical hypersurface via terms proportional to the equation of motion and boundary terms, such that the evolution of non-Gaussianity may be called unique. Moreover, we demonstrate that the gauge invariant curvature perturbation and graviton on uniform field hypersurfaces in the Jordan frame are equal to their counterparts in the Einstein frame. These frame independent perturbations are therefore particularly useful in relating results in different frames at the perturbative level. On the other hand, the field perturbation and graviton on uniform curvature hypersurfaces in the Jordan and Einstein frame are non-linearly related, as are their corresponding actions and n-point functions.

  6. Omnidirectional coded loudspeaker arrays

    Microsoft Academic Search

    S. El-Khamy; O. Abdel-Alim

    1983-01-01

    The feeding of loudspeaker arrays by special sequences, or codes, with the purpose of obtaining isotropic radiation intensity patterns is considered in this paper. In particular, new codes of the Huffman codes type, which are generated by combination of Barker codes, are considered. This type of feeding is shown to result in almost isotropic patterns which are superior to those

  7. Decoding of scroll codes

    E-print Network

    Hitching, George H

    2007-01-01

    We define and study a class of codes obtained from scrolls over curves of any genus over finite fields. These codes generalize Goppa codes in a natural way, and the orthogonal complements of these codes belong to the same class. We show how syndromes of error vectors correspond to certain vector bundle extensions, and how decoding is associated to finding destabilizing subbundles.

  8. Bit-Wise Arithmetic Coding For Compression Of Data

    NASA Technical Reports Server (NTRS)

    Kiely, Aaron

    1996-01-01

    Bit-wise arithmetic coding is a data-compression scheme intended especially for use with uniformly quantized data from a source with a Gaussian, Laplacian, or similar probability distribution function. Code words are of fixed length, and bits are treated as being independent. The scheme serves as a means of progressive transmission or of overcoming buffer-overflow or rate-constraint limitations that sometimes arise when data compression is used.

  9. Multimodal authentication based on random projections and source coding

    Microsoft Academic Search

    Sviatoslav Voloshynovskiy; Oleksiy J. Koval; Thierry Pun

    2008-01-01

    In this paper, we consider an authentication framework for independent modalities based on binary hypothesis testing using source coding jointly with the random projections. The source coding ensures the multimodal signals reconstruction at the decoder based on the authentication data. The random projections are used to cope with the security, privacy, robustness and complexity issues. Finally, the authentication

  10. Parafermion stabilizer codes

    E-print Network

    Utkan Güngördü; Rabindra Nepal; Alexey A. Kovalev

    2014-10-29

    We define and study parafermion stabilizer codes which can be viewed as generalizations of Kitaev's one dimensional model of unpaired Majorana fermions. Parafermion stabilizer codes can protect against low-weight errors acting on a small subset of parafermion modes in analogy to qudit stabilizer codes. Examples of several smallest parafermion stabilizer codes are given. A locality preserving embedding of qudit operators into parafermion operators is established which allows one to map known qudit stabilizer codes to parafermion codes. We also present a local 2D parafermion construction that combines topological protection of Kitaev's toric code with additional protection relying on parity conservation.

  11. Source Code Plagiarism--A Student Perspective

    ERIC Educational Resources Information Center

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  12. Studying the Independent School Library

    ERIC Educational Resources Information Center

    Cahoy, Ellysa Stern; Williamson, Susan G.

    2008-01-01

    In 2005, the American Association of School Librarians' Independent Schools Section conducted a national survey of independent school libraries. This article analyzes the results of the survey, reporting specialized data and information regarding independent school library budgets, collections, services, facilities, and staffing. Additionally, the…

  13. Software for universal noiseless coding

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Schlutsmeyer, A. P.

    1981-01-01

    An overview is provided of the universal noiseless coding algorithms as well as their relationship to the now available FORTRAN implementations. It is suggested that readers considering investigating the utility of these algorithms for actual applications should consult both NASA's Computer Software Management and Information Center (COSMIC) and descriptions of coding techniques provided by Rice (1979). Examples of applying these techniques have also been given by Rice (1975, 1979, 1980). Attention is given to reversible preprocessing, general implementation instructions, naming conventions, and calling arguments. A general applicability of the considered algorithms to solving practical problems is obtained because most real data sources can be simply transformed into the required form by appropriate preprocessing.
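
    The coding algorithms in this family are built around Golomb-Rice codes (a unary quotient followed by a k-bit remainder). The sketch below is a minimal Python Rice coder for non-negative integers; it assumes reversible preprocessing has already mapped the data to non-negative residuals, and it omits the adaptive selection of k used by the published software.

        def rice_encode(value, k):
            """Golomb-Rice code with parameter k: unary quotient + k-bit remainder."""
            q, r = value >> k, value & ((1 << k) - 1)
            return "1" * q + "0" + format(r, f"0{k}b") if k else "1" * q + "0"

        def rice_decode(bits, k):
            q = bits.index("0")                       # length of the unary prefix
            r = int(bits[q + 1:q + 1 + k], 2) if k else 0
            return (q << k) | r

        # Non-negative mapped residuals (reversible preprocessing is assumed to have
        # produced them); in practice k would be chosen from the data statistics.
        for v in [0, 3, 9, 17]:
            code = rice_encode(v, k=2)
            assert rice_decode(code, 2) == v
            print(v, "->", code)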

  14. Graph concatenation for quantum codes

    E-print Network

    Beigi, Salman

    Graphs are closely related to quantum error-correcting codes: every stabilizer code is locally equivalent to a graph code and every codeword stabilized code can be described by a graph and a classical code. For the ...

  15. Independent Lens Strange Fruit

    NSDL National Science Digital Library

    The accompanying website for the Independent Lens film "Strange Fruit", about the famous protest song, allows visitors to hear a clip, or the entire song, of a famous rendition sung by Billie Holiday. Strange Fruit is a phrase that actually comes from a poem that was turned into a song, and the song became the most renowned protest song of the 1940s. Visitors unfamiliar with the song will find that the link "The Film" on the homepage gives an informative several-paragraph synopsis and history. It also explains the unusual turns the life of the poet/songwriter took. Visitors should not miss the "Protest Music Overview" link, which provides clips of other protest songs. These protest songs are grouped by time period and the topic of protest for the period. Visitors should start at the beginning with 1776 and slavery, and then just wander through the centuries of music. Some of the clips featured within the different time periods include "Fight The Power" by Public Enemy, "Ohio" by Neil Young, and "We Shall Overcome" sung by Mahalia Jackson.

  16. Bit-wise arithmetic coding for data compression

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.

    1994-01-01

    This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.
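
    The cost of treating codeword bits as independent can be estimated without a full coder: the sketch below quantizes Laplacian samples to 4-bit fixed-length codewords and compares the per-bit-position entropy (the rate an ideal bit-wise arithmetic coder would approach) with the joint codeword entropy. The quantizer design and sizes are illustrative assumptions, not the article's coder.

        import numpy as np

        rng = np.random.default_rng(0)
        samples = rng.laplace(scale=1.0, size=20000)

        # Uniform quantizer producing 4-bit fixed-length codewords (illustrative design only).
        edges = np.linspace(-4, 4, 15)
        idx = np.digitize(samples, edges)                 # codeword indices 0..15
        bits = (idx[:, None] >> np.arange(4)) & 1         # codeword bits, one column per position

        def entropy(p):
            p = p[p > 0]
            return -(p * np.log2(p)).sum()

        # Ideal rate if each bit position is coded independently (bit-wise model)...
        bitwise_rate = sum(entropy(np.bincount(bits[:, b], minlength=2) / len(bits)) for b in range(4))
        # ...versus the joint entropy of the whole 4-bit codeword.
        joint_rate = entropy(np.bincount(idx, minlength=16) / len(idx))
        print(f"bit-wise independent model: {bitwise_rate:.3f} bits/sample, joint: {joint_rate:.3f}")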

  17. Validation of a new computer program for Minnesota coding.

    PubMed

    Kors, J A; van Herpen, G; Wu, J; Zhang, Z; Prineas, R J; van Bemmel, J H

    1996-01-01

    The Minnesota code (MC) is a classification system for electrocardiograms (ECGs) that is used for ECG coding in epidemiologic studies. As the MC measurement procedures and rules are complex, visual coding is time-consuming and error-prone. Automation should reduce measurement and coding errors. The authors developed an MC program, closely adhering to the MC regulations. To validate the program, a test set of 300 ECGs containing a wide variety of codable patterns was collected. The ECGs were coded independently by the program and by an experienced human reader. A reference code ("truth") was established by resolving disagreements through a consensus procedure. If the computer and human agreed, they were considered to be correct. Sensitivity and specificity were computed for each of the nine main code categories of the MC, both for the computer and for visual coding. The results show that the program is as good as or better than the human reader for sensitivity and specificity of all MC categories. Particularly noteworthy is the good program performance for arrhythmia coding. Most coding differences between the program and truth arise from small, borderline measurement differences in combination with the all-or-none character of the coding criteria. In conclusion, computerized Minnesota coding is a valuable alternative or supplement to visual coding. PMID:9238383
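
    Per-category sensitivity and specificity against the consensus reference can be computed directly from agreement counts; the sketch below shows the arithmetic on a few hypothetical ECG codes, not the study's data.

        def sensitivity_specificity(reference, predicted, category):
            """Sensitivity and specificity of predicted codes against the reference for one MC category."""
            pairs = list(zip(reference, predicted))
            tp = sum(r == category and p == category for r, p in pairs)
            fn = sum(r == category and p != category for r, p in pairs)
            tn = sum(r != category and p != category for r, p in pairs)
            fp = sum(r != category and p == category for r, p in pairs)
            return tp / (tp + fn) if tp + fn else None, tn / (tn + fp) if tn + fp else None

        # Hypothetical codes for five ECGs (reference = consensus "truth", predicted = program output).
        truth   = ["1-2", "none", "3-1", "none", "1-2"]
        program = ["1-2", "none", "none", "none", "1-2"]
        print(sensitivity_specificity(truth, program, "1-2"))   # (1.0, 1.0)
        print(sensitivity_specificity(truth, program, "3-1"))   # (0.0, 1.0)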

  18. Energy aware network coding in wireless networks

    E-print Network

    Shi, Xiaomeng, Ph. D. Massachusetts Institute of Technology

    2012-01-01

    Energy is one of the most important considerations in designing reliable low-power wireless communication networks. We focus on the problem of energy aware network coding. In particular, we investigate practical energy ...

  19. Character coding of secondary chemical variation for use in phylogenetic analyses.

    PubMed

    Barkman

    2001-01-01

    A coding procedure is presented for secondary chemical data whereby putative biogenetic pathways are coded as phylogenetic characters with enzymatic conversions between compounds representing the corresponding character states. A character state tree or stepmatrix allows direct representation of the secondary chemical biogenetic pathway and avoids problems of non-independence associated with coding schemes that score presence/absence of individual compounds. Stepmatrices are the most biosynthetically realistic character definitions because individual and population level polymorphisms can be scored, reticulate enzymatic conversions within pathways may be represented, and down-weighting of pathway loss versus gain is possible. The stepmatrix approach unifies analyses of secondary chemicals, allozymes, and developmental characters because the biological unity of the pathway, locus, or character ontogeny is preserved. Empirical investigation of the stepmatrix and character state tree coding methods using floral fragrance data in Cypripedium (Orchidaceae) resulted in cladistic relationships which were largely congruent with those suggested from recent DNA and allozyme studies. This character coding methodology provides an effective means for including secondary compound data in total evidence studies. Furthermore, ancestral state reconstructions provide a phylogenetic context within which biochemical pathway evolution may be studied. PMID:11068120
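
    A stepmatrix for a pathway simply tabulates the cost of every state-to-state change, with loss steps weighted less than gain steps. The sketch below encodes a hypothetical linear three-compound pathway in Python; the states and weights are illustrative assumptions, not those used for the Cypripedium fragrance data.

        # Stepmatrix for a toy pathway absent -> A -> B -> C.
        # Each cell is the cost of changing from the row state to the column state;
        # losses are cheaper than gains (weights are hypothetical, for illustration).
        GAIN, LOSS = 2, 1

        step = {
            "absent": {"absent": 0,        "A": GAIN,     "B": 2 * GAIN, "C": 3 * GAIN},
            "A":      {"absent": LOSS,     "A": 0,        "B": GAIN,     "C": 2 * GAIN},
            "B":      {"absent": 2 * LOSS, "A": LOSS,     "B": 0,        "C": GAIN},
            "C":      {"absent": 3 * LOSS, "A": 2 * LOSS, "B": LOSS,     "C": 0},
        }

        # Cost of a hypothesized change along one branch of a tree:
        print(step["absent"]["B"], step["C"]["A"])   # gaining two steps vs. losing two steps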

  20. A distributed code for color in natural scenes derived from center-surround filtered cone signals

    PubMed Central

    Kellner, Christian J.; Wachtler, Thomas

    2013-01-01

    In the retina of trichromatic primates, chromatic information is encoded in an opponent fashion and transmitted to the lateral geniculate nucleus (LGN) and visual cortex via parallel pathways. Chromatic selectivities of neurons in the LGN form two separate clusters, corresponding to two classes of cone opponency. In the visual cortex, however, the chromatic selectivities are more distributed, which is in accordance with a population code for color. Previous studies of cone signals in natural scenes typically found opponent codes with chromatic selectivities corresponding to two directions in color space. Here we investigated how the non-linear spatio-chromatic filtering in the retina influences the encoding of color signals. Cone signals were derived from hyper-spectral images of natural scenes and preprocessed by center-surround filtering and rectification, resulting in parallel ON and OFF channels. Independent Component Analysis (ICA) on these signals yielded a highly sparse code with basis functions that showed spatio-chromatic selectivities. In contrast to previous analyses of linear transformations of cone signals, chromatic selectivities were not restricted to two main chromatic axes, but were more continuously distributed in color space, similar to the population code of color in the early visual cortex. Our results indicate that spatio-chromatic processing in the retina leads to a more distributed and more efficient code for natural scenes. PMID:24098289
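
    A compressed version of that preprocessing-plus-ICA chain can be sketched on a stand-in image: difference-of-Gaussians center-surround filtering, half-wave rectification into ON and OFF channels, and FastICA on small patches. The sketch assumes SciPy and scikit-learn, uses random data in place of hyperspectral cone signals, and is not the study's pipeline.

        import numpy as np
        from scipy.ndimage import gaussian_filter
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        cone = rng.random((64, 64))                       # stand-in for one cone-signal plane

        # Center-surround filtering (difference of Gaussians), then half-wave
        # rectification into parallel ON and OFF channels.
        cs = gaussian_filter(cone, 1.0) - gaussian_filter(cone, 3.0)
        on, off = np.maximum(cs, 0.0), np.maximum(-cs, 0.0)

        # Cut small patches, stack ON/OFF as features, and learn a sparse basis with ICA.
        def patches(img, size=8):
            return np.array([img[i:i + size, j:j + size].ravel()
                             for i in range(0, img.shape[0] - size, size)
                             for j in range(0, img.shape[1] - size, size)])

        X = np.hstack([patches(on), patches(off)])
        ica = FastICA(n_components=10, random_state=0, max_iter=1000)
        sources = ica.fit_transform(X)                    # rows: patches, columns: components
        print(sources.shape, ica.mixing_.shape)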

  1. Adaptive coding of reward prediction errors is gated by striatal coupling.

    PubMed

    Park, Soyoung Q; Kahnt, Thorsten; Talmi, Deborah; Rieskamp, Jörg; Dolan, Raymond J; Heekeren, Hauke R

    2012-03-13

    To efficiently represent all of the possible rewards in the world, dopaminergic midbrain neurons dynamically adapt their coding range to the momentarily available rewards. Specifically, these neurons increase their activity for an outcome that is better than expected and decrease it for an outcome worse than expected, independent of the absolute reward magnitude. Although this adaptive coding is well documented, it remains unknown how this rescaling is implemented. To investigate the adaptive coding of prediction errors and its underlying rescaling process, we used human functional magnetic resonance imaging (fMRI) in combination with a reward prediction task that involved different reward magnitudes. We demonstrate that reward prediction errors in the human striatum are expressed according to an adaptive coding scheme. Strikingly, we show that adaptive coding is gated by changes in effective connectivity between the striatum and other reward-sensitive regions, namely the midbrain and the medial prefrontal cortex. Our results provide evidence that striatal prediction errors are normalized by a magnitude-dependent alteration in the interregional connectivity within the brain's reward system. PMID:22371590

  2. Adaptive coding of reward prediction errors is gated by striatal coupling

    PubMed Central

    Park, Soyoung Q.; Kahnt, Thorsten; Talmi, Deborah; Rieskamp, Jörg; Dolan, Raymond J.; Heekeren, Hauke R.

    2012-01-01

    To efficiently represent all of the possible rewards in the world, dopaminergic midbrain neurons dynamically adapt their coding range to the momentarily available rewards. Specifically, these neurons increase their activity for an outcome that is better than expected and decrease it for an outcome worse than expected, independent of the absolute reward magnitude. Although this adaptive coding is well documented, it remains unknown how this rescaling is implemented. To investigate the adaptive coding of prediction errors and its underlying rescaling process, we used human functional magnetic resonance imaging (fMRI) in combination with a reward prediction task that involved different reward magnitudes. We demonstrate that reward prediction errors in the human striatum are expressed according to an adaptive coding scheme. Strikingly, we show that adaptive coding is gated by changes in effective connectivity between the striatum and other reward-sensitive regions, namely the midbrain and the medial prefrontal cortex. Our results provide evidence that striatal prediction errors are normalized by a magnitude-dependent alteration in the interregional connectivity within the brain's reward system. PMID:22371590

  3. Performance of concatenated Reed-Solomon trellis-coded modulation over Rician fading channels

    NASA Technical Reports Server (NTRS)

    Moher, Michael L.; Lodge, John H.

    1990-01-01

    A concatenated coding scheme for providing very reliable data over mobile-satellite channels at power levels similar to those used for vocoded speech is described. The outer code is a shorter Reed-Solomon code which provides error detection as well as error correction capabilities. The inner code is a 1-D 8-state trellis code applied independently to both the inphase and quadrature channels. To achieve the full error correction potential of this inner code, the code symbols are multiplexed with a pilot sequence which is used to provide dynamic channel estimation and coherent detection. The implementation structure of this scheme is discussed and its performance is estimated.

  4. Coding AuthentiCity

    E-print Network

    Mercier, Rachel Havens

    2008-01-01

    This thesis analyzes the impact of form-based codes, focusing on two research questions: (1) What is the underlying motivation for adopting a form-based code? (2) What motivations have the most significant impact on ...

  5. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang (Athens, GA); Ljungdahl, Lars G. (Athens, GA); Chen, Huizhong (Lawrenceville, GA)

    2001-01-01

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  6. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang (Athens, GA); Ljungdahl, Lars G. (Athens, GA); Chen, Huizhong (Lawrenceville, GA)

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  7. Concentric Permutation Source Codes

    E-print Network

    Nguyen, Ha Q.

    Permutation codes are a class of structured vector quantizers with a computationally-simple encoding procedure based on sorting the scalar components. Using a codebook comprising several permutation codes as subcodes ...
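
    Encoding with a Variant-I permutation code reduces to a sort: the codeword values, in order, are placed at the positions of the correspondingly ranked source components. The sketch below illustrates this in Python; the initial codeword values are arbitrary examples, not an optimized codebook.

        import numpy as np

        def encode_permutation(x, initial_codeword):
            """Nearest codeword in a Variant-I permutation code: place the sorted codeword
            values at the positions of the correspondingly ranked components of x."""
            mu = np.sort(np.asarray(initial_codeword, dtype=float))  # codeword values, ascending
            order = np.argsort(x)                      # positions of x from smallest to largest
            y = np.empty_like(mu)
            y[order] = mu                              # largest value goes to the largest component
            return y

        x = np.array([0.3, -1.2, 2.0, 0.1])
        codeword = [-1.0, -0.2, 0.2, 1.0]              # multiset defining the codebook (example values)
        print(encode_permutation(x, codeword))         # a permutation of `codeword`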

  8. International Code Council

    NSDL National Science Digital Library

    The International Code Council is "a membership association dedicated to building safety and fire prevention" that develops the codes used to construct residential and commercial buildings, including homes and schools. "Most U.S. cities, counties and states that adopt codes choose the International Codes developed by the International Code Council." Although some sections of the site are reserved for members only (which requires a fee), there is a remarkable amount of material available to non-members. Available on the website are details about codes development, how to acquire an opinion on a code from multiple sources, and how to reach a building code liaison for your locality. Under the "Certification and Testing" tab, users can find sample certification exam questions as well as outlines. The site also provides links to various periodicals and ICC meetings, and includes an event calendar with dates for industry conferences and upcoming trade shows.

  9. A robust low-rate coding scheme for packet video

    NASA Technical Reports Server (NTRS)

    Chen, Y. C.; Sayood, Khalid; Nelson, D. J.; Arikan, E. (editor)

    1991-01-01

    Due to the rapidly evolving fields of image processing and networking, video information promises to be an important part of telecommunication systems. Although up to now video has been transported mainly over circuit-switched networks, it is likely that packet-switched networks will dominate the communication world in the near future. Asynchronous transfer mode (ATM) techniques in broadband ISDN can provide a flexible, independent and high performance environment for video communication. In this paper, the network simulator was used only as a channel. Mixture block coding with progressive transmission (MBCPT) has been investigated for use over packet networks and has been found to provide a high compression rate with good visual performance, robustness to packet loss, tractable integration with network mechanics, and simplicity in parallel implementation.

  10. INVESTIGATION OF FISCALLY INDEPENDENT AND DEPENDENT CITY SCHOOL DISTRICTS.

    ERIC Educational Resources Information Center

    GITTELL, MARILYN; AND OTHERS

    A TWO-PART COMPARATIVE ANALYSIS IS MADE OF LARGE AND SMALL CITY SCHOOL SYSTEMS. PART I ANALYZES A WIDE RANGE OF FISCAL AND NON-FISCAL VARIABLES ASSOCIATED WITH FISCAL STATUS OF CITY SCHOOL SYSTEMS. IT COVERS THE 2,788 CITY SCHOOL DISTRICTS IN THE UNITED STATES WITH ENROLLMENTS OVER 3,000. COMPLEX INTERRELATIONSHIPS SURROUNDING FISCAL STATUS IN…

  11. Growing Salt: An Independent Course Research Project Investigating Chemical Sediments

    NSDL National Science Digital Library

    Kathy Benison

    To prepare for this project, students read a journal article about the processes and products of chemical sedimentation and early diagenesis in saline pan environments (Lowenstein and Hardie, 1985). In class, students are given some handouts that tabulate various evaporite minerals and how water chemistry affects their formation and dissolution. A short slide show and video illustrate some different types of saline environments. Photos and samples guide a lecture on the different types of evaporite minerals and how they form. For example, chevron halite crystals are generally large (cm-scale) and grow upward from the floor of a shallow (less than ~0.5 m) surface water body; cumulate halite crystals are smaller (typically mm-scale), grow on the water-air interface, and settle to the bottom, regardless of water depth. Randomly-oriented halite crystals can grow displacively from groundwater in mud or sand. The students learn that the specific sedimentology of halite can be used to trace past surface water depth and groundwater salinity. I also give examples of how past quantitative climate data, past chemical data and even past microbiological data can be interpreted from evaporites. I emphasize how, in order to understand evaporites, one must think critically about sedimentology and geochemistry. The students are told, at the end of this lecture, that their next lab period will focus on designing and setting up a research project on growing salt. They are encouraged to start thinking about a research question they can pose about evaporite sedimentology. At this time, I also tell them what materials are available for their use: tap water, distilled water, seawater, various types of saline water I have collected during field trips, and various types of store-bought table and road salt (including iodized, non-iodized, and sea salt). A variety of table salts can be purchased cheaply (~$1 - $2/carton) at almost any grocery store. If you live in a cold climate, most grocery stores and hardware stores also sell several types of road salt (~$3-$4/bag). The table salts are mostly Na and Cl; some have lesser amounts of Ca and SO4. Some road salts have Ca, Mg, Na, and Cl. In my experience, one carton and one bag of each type will provide more than enough salt for a class of 15 students. When it is time for lab to begin, I gather my students in my research lab (but this could also be done in a classroom), where I show them the materials available to them: various types of salt, various types of water, and plastic, glass, and metal containers of various shapes (baby food glass jars, plastic take-out food containers, etc.). My lab also contains a variety of other miscellaneous materials, such as sand, gravel, clay, a mortar and pestle, wooden sticks, metal stirring rods, string, plastic tubing, beakers, and food coloring (it shows fluid inclusion bands well and everyone loves playing with food coloring). I remind the students that they have a microwave oven, a freezer, a lab hood, a windowsill with plenty of sunlight, and a heating vent that can be used as well. I make available a few thermometers, pH strips (or a pH meter), and a hand-held refractometer for measuring salinity. These analytical field instruments are not necessary for this assignment to work. However, as instructor, I would encourage you to use anything available to you.
I ask each student to tell me informally about their research question/hypothesis and then I try to help them find any materials they need for their experiments. Here are some examples of student research questions that have been tested with this assignment: (1) Does the temperature of the water affect the rate of halite/gypsum growth? (2) Will evaporite minerals grown from a complex saline fluid form a "bull's-eye" pattern as their textbook claims? (3) Will halite grow preferentially on glass substrates versus wooden and plastic substrates? (4) Will evaporation of salt water make halite cement equally well in a gravel, a sand, and a clay? (5) What conditions best produce large halite crystals? (6) Does pH of

  12. Azerbaijani-Russian Code-Switching and Code-Mixing: Form, Function, and Identity

    ERIC Educational Resources Information Center

    Zuercher, Kenneth

    2009-01-01

    From incorporation into the Russian Empire in 1828 through the collapse of the U.S.S.R. in 1991, governmental language policies and other socio-political forces influenced the Turkic population of the Republic of Azerbaijan to speak Russian. Even with changes since independence, Russian use--including various kinds of code-switching and…

  13. P-code enhanced method for processing encrypted GPS signals without knowledge of the encryption code

    NASA Technical Reports Server (NTRS)

    Meehan, Thomas K. (Inventor); Thomas, Jr., Jess Brooks (Inventor); Young, Lawrence E. (Inventor)

    2000-01-01

    In the preferred embodiment, an encrypted GPS signal is down-converted from RF to baseband to generate two quadrature components for each RF signal (L1 and L2). Separately and independently for each RF signal and each quadrature component, the four down-converted signals are counter-rotated with a respective model phase, correlated with a respective model P code, and then successively summed and dumped over presum intervals substantially coincident with chips of the respective encryption code. Without knowledge of the encryption-code signs, the effect of encryption-code sign flips is then substantially reduced by selected combinations of the resulting presums between associated quadrature components for each RF signal, separately and independently for the L1 and L2 signals. The resulting combined presums are then summed and dumped over longer intervals and further processed to extract amplitude, phase and delay for each RF signal. Precision of the resulting phase and delay values is approximately four times better than that obtained from straight cross-correlation of L1 and L2. This improved method provides the following options: separate and independent tracking of the L1-Y and L2-Y channels; separate and independent measurement of amplitude, phase and delay L1-Y channel; and removal of the half-cycle ambiguity in L1-Y and L2-Y carrier phase.
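
    The processing chain described above (counter-rotation with a model phase, correlation with a model P code, then sum-and-dump over presum intervals aligned with encryption-code chips) can be caricatured in a few lines. The sketch below is only a rough illustration of that single step, assuming complex baseband samples and equal-length model arrays; the array shapes, chip alignment, and the later presum-combination step are placeholders, not the patent's method.

        # Rough sketch of the counter-rotate / correlate / sum-and-dump step described above.
        # Illustrative only; shapes, alignment, and names are assumptions, not the patent's method.
        import numpy as np

        def presum(baseband, model_phase, model_pcode, samples_per_chip):
            """Counter-rotate complex baseband samples with a model phase, wipe off a +/-1 model
            P code, then sum-and-dump over intervals aligned with encryption-code chips."""
            rotated = baseband * np.exp(-1j * model_phase)   # remove the modeled carrier phase
            correlated = rotated * model_pcode               # correlate with (wipe off) the model P code
            n = (len(correlated) // samples_per_chip) * samples_per_chip
            return correlated[:n].reshape(-1, samples_per_chip).sum(axis=1)  # one presum per chip

    In the described method, presums such as these would then be combined across the two quadrature components of each RF signal before the longer summation intervals.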

  14. Auto-blocking matrix-multiplication or tracking BLAS3 performance from source code

    Microsoft Academic Search

    Jeremy D. Frens; David S. Wise

    1997-01-01

    An elementary, machine-independent, recursive algorithm for matrix multiplication C+=A*B provides implicit blocking at every level of the memory hierarchy and tests out faster than classically optimized code, tracking hand-coded BLAS3 routines. Proof of concept is demonstrated by racing the in-place algorithm against manufacturer's hand-tuned BLAS3 routines; it can win. The recursive code bifurcates naturally at the top level into independent block-oriented
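
    The record above sketches the idea of a recursive, cache-oblivious formulation of C += A*B: each matrix is split into quadrants, and the recursion itself supplies the blocking at every level of the memory hierarchy. The following is a minimal illustrative sketch of that idea in Python/NumPy, not the authors' implementation; the base-case threshold and function names are assumptions.

        # Sketch of recursive (cache-oblivious) blocked matrix multiplication, C += A @ B.
        # Illustrative only; the threshold and names are assumptions, not the paper's code.
        import numpy as np

        BASE = 64  # below this size, fall back to a direct multiply (tuning parameter)

        def rec_matmul(A, B, C):
            """Accumulate A @ B into C by splitting all three matrices into quadrants."""
            n, m = A.shape
            _, p = B.shape
            if max(n, m, p) <= BASE:
                C += A @ B  # base case: the block is small enough to multiply directly
                return
            i, j, k = n // 2, m // 2, p // 2
            # Each quadrant of C accumulates two block products; recursion provides the blocking.
            rec_matmul(A[:i, :j], B[:j, :k], C[:i, :k]); rec_matmul(A[:i, j:], B[j:, :k], C[:i, :k])
            rec_matmul(A[:i, :j], B[:j, k:], C[:i, k:]); rec_matmul(A[:i, j:], B[j:, k:], C[:i, k:])
            rec_matmul(A[i:, :j], B[:j, :k], C[i:, :k]); rec_matmul(A[i:, j:], B[j:, :k], C[i:, :k])
            rec_matmul(A[i:, :j], B[:j, k:], C[i:, k:]); rec_matmul(A[i:, j:], B[j:, k:], C[i:, k:])

        # Quick check against a direct product.
        A = np.random.rand(100, 80); B = np.random.rand(80, 90); C = np.zeros((100, 90))
        rec_matmul(A, B, C)
        assert np.allclose(C, A @ B)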

  15. Character coding of secondary chemical variation for use in phylogenetic analyses

    Microsoft Academic Search

    Todd J. Barkman

    2001-01-01

    A coding procedure is presented for secondary chemical data whereby putative biogenetic pathways are coded as phylogenetic characters with enzymatic conversions between compounds representing the corresponding character states. A character state tree or stepmatrix allows direct representation of the secondary chemical biogenetic pathway and avoids problems of non-independence associated with coding schemes that score presence/absence of individual compounds. Stepmatrices are

  16. Code Understanding and Generation

    Microsoft Academic Search

    Andrew Broad; Nick Filer

    This paper briefly reviews the applicability of case-based reasoning (CBR) to code understanding and generation. The paper suggests that case-based techniques are already common in code understanding and generation but are seldom labelled as CBR. Some examples of 'explicit' and 'covert' CBR are briefly examined. It is suggested that extending the existing code understanding and generation methods to make

  17. Morse Code Activity Packet.

    ERIC Educational Resources Information Center

    Clinton, Janeen S.

    This activity packet offers simple directions for setting up a Morse Code system appropriate to interfacing with any of several personal computer systems. Worksheets are also included to facilitate teaching Morse Code to persons with visual or other disabilities including blindness, as it is argued that the code is best learned auditorily. (PB)

  18. Differential space-frequency coding for a multipath fading channel

    Microsoft Academic Search

    Fred C. Kellerman

    2004-01-01

    This paper will investigate differential space frequency coding and its applicability to multipath fading High Frequency (HF) radio channels. Orthogonal Frequency Division Multiplexing (OFDM) will be combined with differential Alamouti space frequency codes to measure performance on the Watterson HF channel model. Differential coding facilitates non-coherent reception and can thus also reduce receiver complexity. Numerical results will be shown for

  19. Code period effects in DS-CDMA systems

    NASA Astrophysics Data System (ADS)

    Giubilei, R.

    1995-03-01

    Code period effects in direct sequence code division multiple access (DS-CDMA) systems are investigated. It is shown that the improvement in the signal-to-interference ratio (SIR) achieved in the spread spectrum receiver is limited by a saturation value proportional to the code period.

  20. America's Journey to Independence

    NSDL National Science Digital Library

    Ms. Nielsen

    2007-10-08

    Objectives: examine the role of leaders that led to United States independence; trace the development of the U.S. Constitution. Learn about the Founding Fathers and Constitution of the United States by reading valuable information and viewing many wonderful pictures (America's Founding Fathers). Investigate the life of Benjamin Franklin, a man very influential in the forming of the American nation; view information, pictures, and video (Benjamin Franklin). Check out facts of the Revolutionary War and how it ...

  1. Occupational Code Assignment System (CodeSearch)

    Cancer.gov

    The Codesearch System is a PC-based system which provides users with an interactive method for assigning standardized industry and occupation codes to related job description titles from specific studies.

  2. Self-Dual Codes

    Microsoft Academic Search

    E. M. Rains; N. J. A. Sloane

    2002-01-01

    Self-dual codes are important because many of the best codes known are of this type and they have a rich mathematical theory. Topics covered in this survey include codes over F_2, F_3, F_4, F_q, Z_4, Z_m, shadow codes, weight enumerators, Gleason-Pierce theorem, invariant theory, Gleason theorems, bounds, mass formulae, enumeration, extremal codes, open problems. There is a comprehensive bibliography.

  3. Coding for Electronic Mail

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  4. REPREL computer code: users guide

    SciTech Connect

    Eslinger, P.W.; Sagar, B.

    1985-06-01

    A major part of the current Basalt Waste Isolation Project (BWIP) effort is directed at analyzing postclosure repository performance. These analyses will determine how well the proposed repository system achieves its design objectives and, in turn, how well the system complies with technical criteria and standards set by federal agencies. One of the performance measures is the fractional rate of mass release at the immediate boundary of the collection of waste containers. Estimation of radionuclide releases from a geologic repository at the boundary of the waste packages is required for two reasons: (1) to judge whether the engineered barrier system complies with the performance regulations prescribed by the US Nuclear Regulatory Commission and the US Environmental Protection Agency and (2) to obtain the value of radionuclide source term needed to model mass transport in the near and far fields of the repository. The REPREL computer code provides a tool for integrating the mass release over the repository based on a random sequence of container failure times as a function of corrosion and the expected release from a single container failing at a given time. The original version of REPREL was developed by BCS Richland, Inc. (BCSR) for use by Rockwell Hanford Operations (Rockwell) at the BWIP site. The original version is currently in use and is designated as version 1.0. The REPREL code is a stand-alone FORTRAN program, features and options of which are machine independent. This document is intended to serve as a guide to operation of version 1.0 of REPREL and is tailored to the code documentation specification suggested by Silling (1983). The mathematical model embodied by REPREL is summarized in 2.0, while detailed instructions for operation of the code are provided in 3.0, and a nomenclature list that defines all terms is provided in 4.0. A detailed treatment of an illustrative example problem is given in Appendix B. 7 refs.

  5. Research of independent component analysis

    Microsoft Academic Search

    Xianchuan Yu; Xiaochun Cheng; Y. Fu; J. Zhou; H. Hao; X. Yang; H. Huang; T. Zhang; L. Fang

    2004-01-01

    Independent component analysis (ICA) is a statistical technique to decompose multivariate data into statistically independent components. It can be applied to mine data from medical, economic, or telecommunication systems, and to analyze data from GIS systems for agriculture or environmental applications. To solve the problem of blind source separation, this paper introduces the theory and developments of ICA. The analyses

  6. Mars Program Independent Assessment Team

    E-print Network

    Leveson, Nancy

    Mars Program Independent Assessment Team Summary Report, March 14, 2000. Mars Climate Orbiter failed to achieve Mars orbit on September 23, 1999. On December 3, 1999, Mars Polar Lander and two Deep Space 2 microprobes failed. As a result, the NASA Administrator established the Mars Program Independent

  7. Conflict in Independent Catholic Schools

    ERIC Educational Resources Information Center

    Guernsey, Dan; Barott, James

    2008-01-01

    Independent Catholic schools are a growing phenomenon in the Catholic Church in America. This article provides a contextualized account of the phenomenon by examining via a field observation the experience of two independent Catholic schools in two different dioceses. These schools were founded in conflict and beset by continued conflict to the…

  8. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian [Sandia National Labs., Albuquerque, NM (United States); Murfin, W.B. [Technadyne Engineering Consultants, Inc., Albuquerque, NM (United States); Johnson, J.D. [Science Applications International Corp., Albuquerque, NM (United States)

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  9. The Electromagnetic Code Consortium

    NASA Astrophysics Data System (ADS)

    Faison, Joseph C.

    1990-02-01

    A decision was made in 1987 to consolidate radar cross section (RCS) code development sponsored by the U.S. armed services and NASA. An RCS code consortium was formed, consisting of a government steering group and members from the industrial/academic community. Since the formation of the consortium, significant progress has been made to advance code development work sponsored by the U.S. government. This paper is intended to make the RCS community aware of the Electromagnetic Code Consortium, so that potential contributors to code development can become involved with its work. It covers the approach taken by the consortium, the acquisition of a government-owned geometry code, validation, language and documentation, the support contractor, a survey of industry codes, and benchmarking.

  10. Light curves for bump Cepheids computed with a dynamically zoned pulsation code

    NASA Astrophysics Data System (ADS)

    Adams, T. F.; Castor, J. I.; Davis, C. G.

    1980-05-01

    The dynamically zoned pulsation code developed by Castor, Davis, and Davison was used to recalculate the Goddard model and to calculate three other Cepheid models with the same period (9.8 days). This family of models shows how the bumps and other features of the light and velocity curves change as the mass is varied at constant period. The use of a code that is capable of producing reliable light curves demonstrates that the light and velocity curves for 9.8 day Cepheid models with standard homogeneous compositions do not show bumps like those that are observed unless the mass is significantly lower than the 'evolutionary mass.' The light and velocity curves for the Goddard model presented here are similar to those computed independently by Fischel, Sparks, and Karp. They should be useful as standards for future investigators.

  11. Combined turbo coding and hierarchical QAM for unequal error protection of H.264 coded video

    Microsoft Academic Search

    B. Barmada; Mohammad Mahdi Ghandi; E. V. Jones; M. Ghanbari

    2006-01-01

    This paper investigates the unequal error protected (UEP) transmission of scalable H.264 bitstreams with two-priority layers, where differentiated turbo coding provides better protection for the high priority (HP) base layer than for the low priority (LP) enhancement layer. The drawback of such a method is the high overhead introduced by the channel coding, which results in a low source data

  12. Distributed detection and coding in information networks

    E-print Network

    Ho, Shan-Yuan

    2006-01-01

    This thesis investigates the distributed information and detection of a binary source through a parallel system of relays. Each relay observes the source output through a noisy channel, and the channel outputs are independent ...

  13. Multiplexed quantification for data-independent acquisition.

    PubMed

    Minogue, Catherine E; Hebert, Alexander S; Rensvold, Jarred W; Westphall, Michael S; Pagliarini, David J; Coon, Joshua J

    2015-03-01

    Data-independent acquisition (DIA) strategies provide a sensitive and reproducible alternative to data-dependent acquisition (DDA) methods for large-scale quantitative proteomic analyses. Unfortunately, DIA methods suffer from incompatibility with common multiplexed quantification methods, specifically stable isotope labeling approaches such as isobaric tags and stable isotope labeling of amino acids in cell culture (SILAC). Here we expand the use of neutron-encoded (NeuCode) SILAC to DIA applications (NeuCoDIA), producing a strategy that enables multiplexing within DIA scans without further convoluting the already complex MS(2) spectra. We demonstrate duplex NeuCoDIA analysis of both mixed-ratio (1:1 and 10:1) yeast and mouse embryo myogenesis proteomes. Analysis of the mixed-ratio yeast samples revealed the strong accuracy and precision of our NeuCoDIA method, both of which were comparable to our established MS(1)-based quantification approach. NeuCoDIA also uncovered the dynamic protein changes that occur during myogenic differentiation, demonstrating the feasibility of this methodology for biological applications. We consequently establish DIA quantification of NeuCode SILAC as a useful and practical alternative to DDA-based approaches. PMID:25621425

  14. Optimal source codes for geometrically distributed integer alphabets

    NASA Technical Reports Server (NTRS)

    Gallager, R. G.; Van Voorhis, D. C.

    1975-01-01

    An approach is shown for using the Huffman algorithm indirectly to prove the optimality of a code for an infinite alphabet if an estimate concerning the nature of the code can be made. Attention is given to nonnegative integers with a geometric probability assignment. The particular distribution considered arises in run-length coding and in encoding protocol information in data networks. Questions of redundancy of the optimal code are also investigated.
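
    The record above concerns optimal prefix codes for nonnegative integers under a geometric probability assignment, the setting in which Golomb-style codes (a unary quotient followed by a truncated-binary remainder) are the classical optimal construction. As an illustration only, here is a minimal Golomb encoder; the parameter choice, names, and example are assumptions, not the paper's derivation.

        # Illustrative Golomb encoder for nonnegative integers; codes of this form are the
        # classical optimal prefix codes for geometric distributions. Sketch only.
        def golomb_encode(n: int, m: int) -> str:
            """Return the Golomb codeword (a bit string) for integer n >= 0 with parameter m >= 1."""
            q, r = divmod(n, m)
            unary = "1" * q + "0"              # quotient in unary, terminated by a 0
            b = m.bit_length() - 1             # floor(log2(m))
            if (1 << b) == m:                  # m is a power of two: plain b-bit remainder (Rice code)
                binary = format(r, "0{}b".format(b)) if b > 0 else ""
            else:                              # general m: truncated binary code for the remainder
                cutoff = (1 << (b + 1)) - m
                if r < cutoff:
                    binary = format(r, "0{}b".format(b))
                else:
                    binary = format(r + cutoff, "0{}b".format(b + 1))
            return unary + binary

        # Example with parameter m = 4: short codewords go to the most probable (small) integers.
        print([golomb_encode(n, 4) for n in range(6)])  # ['000', '001', '010', '011', '1000', '1001']

    Roughly speaking, the parameter m is matched to the decay rate of the geometric distribution; the paper's contribution is proving optimality of such codes via the Huffman algorithm.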

  15. The reliability of the functional independence measure: A quantitative review

    Microsoft Academic Search

    Kenneth J. Ottenbacher; Yungwen Hsu; Carl V. Granger; Roger C. Fiedler

    1996-01-01

    Objective: The reliability of the Functional Independence Measure (FIMSM) for adults was examined using procedures of meta-analysis.Data Sources: Eleven published studies reporting estimates of reliability for the FIM were located using computer searches of Index Medicus, Psychological Abstracts, the Functional Assessment Information Service, and citation tracking.Study Selection: Studies were identified and coded based on type of reliability (interrater, test-retest, or

  16. Multi-level bandwidth efficient block modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1989-01-01

    The multilevel technique is investigated for combining block coding and modulation. There are four parts. In the first part, a formulation is presented for signal sets on which modulation codes are to be constructed. Distance measures on a signal set are defined and their properties are developed. In the second part, a general formulation is presented for multilevel modulation codes in terms of component codes with appropriate Euclidean distances. The distance properties, Euclidean weight distribution and linear structure of multilevel modulation codes are investigated. In the third part, several specific methods for constructing multilevel block modulation codes with interdependency among component codes are proposed. Given a multilevel block modulation code C with no interdependency among the binary component codes, the proposed methods give a multilevel block modulation code C' which has the same rate as C, a minimum squared Euclidean distance not less than that of code C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest neighbor codewords than that of C. In the last part, error performance of block modulation codes is analyzed for an AWGN channel based on soft-decision maximum likelihood decoding. Error probabilities of some specific codes are evaluated based on their Euclidean weight distributions and simulation results.

  17. Sequence independent amplification of DNA

    DOEpatents

    Bohlander, S.K.

    1998-03-24

    The present invention is a rapid sequence-independent amplification procedure (SIA). Even minute amounts of DNA from various sources can be amplified independent of any sequence requirements of the DNA or any a priori knowledge of any sequence characteristics of the DNA to be amplified. This method allows, for example, the sequence independent amplification of microdissected chromosomal material and the reliable construction of high quality fluorescent in situ hybridization (FISH) probes from YACs or from other sources. These probes can be used to localize YACs on metaphase chromosomes but also--with high efficiency--in interphase nuclei. 25 figs.

  18. Universal Features for the Classification of Coding and Non-coding DNA Sequences

    PubMed Central

    Carels, Nicolas; Vidal, Ramon; Frías, Diego

    2009-01-01

    In this report, we revisited simple features that allow the classification of coding sequences (CDS) from non-coding DNA. The spectrum of codon usage of our sequence sample is large and suggests that these features are universal. The features that we investigated combine (i) the stop codon distribution, (ii) the product of purine probabilities in the three positions of nucleotide triplets, (iii) the product of Cytosine, Guanine, Adenine probabilities in 1st, 2nd, 3rd position of triplets, respectively, (iv) the product of G and C probabilities in 1st and 2nd position of triplets. These features are a natural consequence of the physico-chemical properties of proteins and their combination is successful in classifying CDS and non-coding DNA (introns) with a success rate >95% above 350 bp. The coding strand and coding frame are implicitly deduced when the sequences are classified as coding. PMID:20140069
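
    The features listed above are all simple statistics over nucleotide triplets in a fixed reading frame. The sketch below computes rough versions of them for an uppercase DNA string; the exact feature definitions, dictionary keys, and the example sequence are illustrative approximations, not the authors' code.

        # Sketch of the triplet-position features mentioned above, computed for reading frame 0.
        # Feature definitions and names here are illustrative, not the paper's implementation.
        from collections import Counter

        STOPS = {"TAA", "TAG", "TGA"}

        def triplet_features(seq):
            """Simple codon-position statistics for an uppercase DNA string, read in frame 0."""
            codons = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
            n = len(codons)
            pos_counts = [Counter(c[k] for c in codons) for k in range(3)]
            freq = lambda k, bases: sum(pos_counts[k][b] for b in bases) / n
            return {
                "stop_fraction": sum(c in STOPS for c in codons) / n,             # (i) stop codon usage
                "purine_product": freq(0, "AG") * freq(1, "AG") * freq(2, "AG"),  # (ii) purines at positions 1-3
                "CGA_product": freq(0, "C") * freq(1, "G") * freq(2, "A"),        # (iii) C, G, A at positions 1, 2, 3
                "GC12_product": freq(0, "GC") * freq(1, "GC"),                    # (iv) one reading of the G+C feature
            }

        print(triplet_features("ATGGCCGCTAAAGGCTGA"))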

  19. Population coding of affect across stimuli, modalities and individuals

    PubMed Central

    Chikazoe, Junichi; Lee, Daniel H.; Kriegeskorte, Nikolaus; Anderson, Adam K.

    2014-01-01

    It remains unclear how the brain represents external objective sensory events alongside our internal subjective impressions of them—affect. Representational mapping of population level activity evoked by complex scenes and basic tastes uncovered a neural code supporting a continuous axis of pleasant-to-unpleasant valence. This valence code was distinct from low-level physical and high-level object properties. While ventral temporal and anterior insular cortices supported valence codes specific to vision and taste, both the medial and lateral orbitofrontal cortices (OFC), maintained a valence code independent of sensory origin. Further only the OFC code could classify experienced affect across participants. The entire valence spectrum is represented as a collective pattern in regional neural activity as sensory-specific and abstract codes, whereby the subjective quality of affect can be objectively quantified across stimuli, modalities, and people. PMID:24952643

  20. Multiple description image coding based on Lagrangian rate allocation.

    PubMed

    Tillo, Tammam; Grangetto, Marco; Olmo, Gabriella

    2007-03-01

    In this paper, a novel multiple description coding technique is proposed, based on optimal Lagrangian rate allocation. The method assumes the coded data consists of independently coded blocks. Initially, all the blocks are coded at two different rates. Then blocks are split into two subsets with similar rate distortion characteristics; two balanced descriptions are generated by combining code blocks belonging to the two subsets encoded at opposite rates. A theoretical analysis of the approach is carried out, and the optimal rate distortion conditions are worked out. The method is successfully applied to the JPEG 2000 standard and simulation results show a noticeable performance improvement with respect to state-of-the art algorithms. The proposed technique enables easy tuning of the required coding redundancy. Moreover, the generated streams are fully compatible with Part 1 of the standard. PMID:17357728
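
    The assembly step described above (each block encoded at two rates, with two subsets contributing opposite-rate versions to the two descriptions) can be illustrated with a small sketch. The block encodings, the subset split, and all names below are placeholders; the paper's actual contribution, the Lagrangian rate allocation over JPEG 2000 code blocks, is not shown here.

        # Toy sketch of assembling two balanced descriptions from independently coded blocks.
        # Each block is assumed available at a high and a low rate; placeholders, not the paper's code.
        def build_descriptions(blocks_high, blocks_low, subset_a):
            """blocks_high/blocks_low: per-block bitstreams at two rates.
            subset_a: indices carried at high rate in description 1 (and low rate in description 2)."""
            desc1, desc2 = [], []
            for i in range(len(blocks_high)):
                if i in subset_a:
                    desc1.append(blocks_high[i])  # description 1 carries this block at high rate
                    desc2.append(blocks_low[i])   # description 2 carries it at low rate
                else:
                    desc1.append(blocks_low[i])
                    desc2.append(blocks_high[i])
            return desc1, desc2

        # Example with 4 blocks: blocks 0 and 2 go high-rate into description 1.
        high = ["H0", "H1", "H2", "H3"]; low = ["L0", "L1", "L2", "L3"]
        print(build_descriptions(high, low, {0, 2}))  # (['H0','L1','H2','L3'], ['L0','H1','L2','H3'])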

  1. Research on Universal Combinatorial Coding

    PubMed Central

    Lu, Jun; Zhang, Zhuo; Mo, Juan

    2014-01-01

    The concept of universal combinatorial coding is proposed. Relations exist, to varying degrees, among many coding methods, which suggests that a universal coding method objectively exists and can serve as a bridge connecting them. Universal combinatorial coding is lossless and is based on combinatorics theory. Its combinational and exhaustive properties make it closely related to existing coding methods. Universal combinatorial coding does not depend on the probability statistics of the information source, and it has characteristics that span the three branches of coding. The paper analyzes the relationship between universal combinatorial coding and a variety of coding methods and investigates several application technologies for this coding method. In addition, the efficiency of universal combinatorial coding is analyzed theoretically. The multiple characteristics and applications of universal combinatorial coding are unique among existing coding methods, and the method has both theoretical research and practical application value. PMID:24772019

  2. Mechanical code comparator

    DOEpatents

    Peter, Frank J. (Albuquerque, NM); Dalton, Larry J. (Bernalillo, NM); Plummer, David W. (Albuquerque, NM)

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  3. Updating the Read Codes

    PubMed Central

    Robinson, David; Comp, Dip; Schulz, Erich; Brown, Philip; Price, Colin

    1997-01-01

    Abstract The Read Codes are a hierarchically-arranged controlled clinical vocabulary introduced in the early 1980s and now consisting of three maintained versions of differing complexity. The code sets are dynamic, and are updated quarterly in response to requests from users including clinicians in both primary and secondary care, software suppliers, and advice from a network of specialist healthcare professionals. The codes' continual evolution of content, both across and within versions, highlights tensions between different users and uses of coded clinical data. Internal processes, external interactions and new structural features implemented by the NHS Centre for Coding and Classification (NHSCCC) for user interactive maintenance of the Read Codes are described, and over 2000 items of user feedback episodes received over a 15-month period are analysed. PMID:9391934

  4. Validation of the reactor dynamics code TRAB

    NASA Astrophysics Data System (ADS)

    Raety, Hanna; Kyrki-Rajamaeki, Riitta; Rajamaeki, Markku

    1991-05-01

    The validation of the one dimensional reactor dynamics code TRAB (Transient Analysis code for BWR's) is summarized. TRAB was validated with benchmark problems, comparative calculations against independent analyses, analyses of start up experiments of nuclear power plants, and real plant transients. The initial power excursion of the Chernobyl reactor accident was calculated with TRAB. TRAB was originally designed for BWR analyses, but it can in its present version be used for various modeling purposes. The core model of TRAB can be used separately for LWR calculations. For PWR modeling the core model of TRAB was coupled to circuit model SMABRE to form the SMATRA code. The versatile modeling capabilities of TRAB were used in analyses of e.g., the heating reactor SECURE and the RBMK type reactor (Chernobyl).

  5. Benchmarking the democritus code

    Microsoft Academic Search

    N. Arinaminpathy; C. Fichtl; G. Lapenta; G. L. Delzanno

    2006-01-01

    Summary form only given. The DEMOCRITUS code is a particle-based code for plasma-material interaction simulation. The code makes use of particle-in-cell (PIC) method to simulate each plasma species, the material, and their interaction. In this study, we concentrate on a dust particle immersed in a plasma. We start with the simplest case, in which the dust particle is not allowed

  6. Gauge Color Codes

    E-print Network

    H. Bombin

    2014-12-16

    Color codes are topological stabilizer codes with unusual transversality properties. Here I show that their group of transversal gates only depends on the spatial dimension, not the local geometry. I also introduce a generalized, gauge version of color codes. In 3D they allow the transversal implementation of a universal set of gates by gauge fixing, while error-detecting measurements involve only 4 or 6 qubits.

  7. Distributed Video Coding

    Microsoft Academic Search

    Bernd Girod; ANNE MARGOT AARON; Shantanu Rane; David Rebollo-Monedero

    2005-01-01

    Distributed coding is a new paradigm for video compression, based on Slepian and Wolf's and Wyner and Ziv's information-theoretic results from the 1970s. This paper reviews the recent development of practical distributed video coding schemes. Wyner-Ziv coding, i.e., lossy compression with receiver side information, enables low-complexity video encoding where the bulk of the computation is shifted to the decoder. Since

  8. Codes of Ethics Online

    NSDL National Science Digital Library

    The Center for the Study of Ethics in the Professions at the Illinois Institute of Technology maintains the Codes of Ethics Online Web site. The Center writes: "With the advent of the Internet, it seemed clear that digitizing the codes and making them accessible over the World-Wide Web would benefit researchers, students, and professionals alike." The science page contains links to over fifty organizations' ethical codes, including the American Institute of Chemists, the American Physical Society, the Water Quality Association, etc.

  9. Green Construction Codes

    E-print Network

    Blake, S.

    2011-01-01

    CATEE 2011, Dallas, Texas, Nov. 7-9, 2011 (ESL-KT-11-11-41). Houston Code Enforcement: permits sold FY11 = 108,301; staff = 381; inspections performed FY11 = 614,925; construction valuation FY11 = $1.86 B. Services: plan review in 11 days; inspections the next day; licensing... Participants include Lights Out Houston. Code adoption in Houston follows a typical 3-year code cycle: Step 1, Building Official request to the CIC; Step 2, Construction Industry Council committees; AIA...

  10. The Gli code

    PubMed Central

    Ruiz i Altaba, Ariel; Mas, Christophe; Stecca, Barbara

    2008-01-01

    The Gli code hypothesis postulates that the three vertebrate Gli transcription factors act together in responding cells to integrate intercellular Hedgehog (Hh) and other signaling inputs, resulting in the regulation of tissue pattern, size and shape. Hh and other inputs are then just ways to modify the Gli code. Recent data confirm this idea and suggest that the Gli code regulates stemness and also tumor progression and metastatic growth, opening exciting possibilities for both regenerative medicine and novel anticancer therapies. PMID:17845852

  11. Topological subsystem codes

    SciTech Connect

    Bombin, H. [Department of Physics, Massachusetts Institute of Technology, Cambridge, Massachusetts, 02139 (United States) and Perimeter Institute for Theoretical Physics, 31 Caroline St. N., Waterloo, Ontario N2L 2Y5 (Canada)

    2010-03-15

    We introduce a family of two-dimensional (2D) topological subsystem quantum error-correcting codes. The gauge group is generated by two-local Pauli operators, so that two-local measurements are enough to recover the error syndrome. We study the computational power of code deformation in these codes and show that boundaries cannot be introduced in the usual way. In addition, we give a general mapping connecting suitable classical statistical mechanical models to optimal error correction in subsystem stabilizer codes that suffer from depolarizing noise.

  12. Resistor Color-Code

    NSDL National Science Digital Library

    "Resistor manufactures implement the standard EIA color-code using three, four and five color bands to identify nominal resistor values. It is imperative that engineers and technicians know how to interpret the color markings on resistors in order to perform analysis and repairs on electronic products." On this page, visitors will find a key to the code for three, four, and five band resistors and exercises to check for understanding. A Resistor Color-Code chart can also be downloaded and printed from this site, as well as a Resistor Color-Code Converter.

  13. Ideology Among Independent Voter Groups

    E-print Network

    Berry, Meagan

    2007-07-14

    A Senior Honors Thesis by Meagan Berry, "Ideology Among Independent Voter Groups," submitted to the Office of Honors Programs & Academic Scholarships, Texas A&M University, in partial fulfillment of the requirements of the University Undergraduate Research Fellows, April 2007. Majors: Political Science and Economics. Abstract: Ideology Among Independent Voter Groups (April 2007), Meagan Berry, Department of Political Science, Texas A&M University...

  14. Parallel CARLOS-3D code development

    SciTech Connect

    Putnam, J.M. [McDonnell Douglas Corp., St. Louis, MO (United States); Kotulski, J.D. [Sandia National Labs., Albuquerque, NM (United States)

    1996-02-01

    CARLOS-3D is a three-dimensional scattering code which was developed under the sponsorship of the Electromagnetic Code Consortium, and is currently used by over 80 aerospace companies and government agencies. The code has been extensively validated and runs on both serial workstations and parallel supercomputers such as the Intel Paragon. CARLOS-3D is a three-dimensional surface integral equation scattering code based on a Galerkin method of moments formulation employing Rao-Wilton-Glisson roof-top basis for triangular faceted surfaces. Fully arbitrary 3D geometries composed of multiple conducting and homogeneous bulk dielectric materials can be modeled. This presentation describes some of the extensions to the CARLOS-3D code, and how the operator structure of the code facilitated these improvements. Body of revolution (BOR) and two-dimensional geometries were incorporated by simply including new input routines, and the appropriate Galerkin matrix operator routines. Some additional modifications were required in the combined field integral equation matrix generation routine due to the symmetric nature of the BOR and 2D operators. Quadrilateral patched surfaces with linear roof-top basis functions were also implemented in the same manner. Quadrilateral facets and triangular facets can be used in combination to more efficiently model geometries with both large smooth surfaces and surfaces with fine detail such as gaps and cracks. Since the parallel implementation in CARLOS-3D is at a high level, these changes were independent of the computer platform being used. This approach minimizes code maintenance, while providing capabilities with little additional effort. Results are presented showing the performance and accuracy of the code for some large scattering problems. Comparisons between triangular faceted and quadrilateral faceted geometry representations will be shown for some complex scatterers.

  15. Turbo-coded APSK for aeronautical telemetry

    Microsoft Academic Search

    Christopher Shaw; Michael Rice

    2009-01-01

    The performance of turbo-coded amplitude-phase shift keying (APSK) in an aeronautical telemetry system with a non-linear power amplifier is investigated. The AM/AM curves for four L-band power amplifiers are modeled and used to simulate the performance of the turbo-coded APSK system over a nonlinear channel. Spectral efficiency and bit error rate performance as a function of back-off are quantified and compared

  16. Turbo-Coded APSK for aeronautical telemetry

    Microsoft Academic Search

    Christopher Shaw; Michael Rice

    2010-01-01

    The performance of turbo-coded Amplitude-Phase Shift Keying (APSK) in an aeronautical telemetry system with a non-linear power amplifier is investigated. The AM/AM curves for four L-band power amplifiers are modeled and used to simulate the performance of the turbo-coded APSK system over a non-linear channel. Spectral efficiency and bit error rate performance as a function of back-off are quantified and compared

  17. A cascaded coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Shu, L.; Kasami, T.

    1985-01-01

    A cascade coding scheme for error control is investigated. The scheme employs a combination of hard and soft decisions in decoding. Error performance is analyzed. If the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit-error-rate. Some example schemes are evaluated. They seem to be quite suitable for satellite down-link error control.

  18. The College Station Residential Energy Compliance Code

    E-print Network

    Claridge, D. E.; Schrock, D.

    1988-01-01

    an investigation of the thermal characteristics of residential construction in College Station for 1981-1986. The major results of that study were presented earlier [2]. It was found, based on sample inspections, that Code enforcement was thorough: the code... key groups. These groups included the College Station ... Division, the College Station Energy Management Committee, the Bryan/College Station Home Builders Association, the College Station Building Inspection Division, a local HVAC contractor, Lone...

  19. Final Independent External Peer Review Report Independent External Peer Review (IEPR),

    E-print Network

    US Army Corps of Engineers

    Final Independent External Peer Review Report: Independent External Peer Review (IEPR), Delta Islands and Levees Feasibility

  20. Qudit Colour Codes and Gauge Colour Codes in All Spatial Dimensions

    E-print Network

    Fern H. E. Watson; Earl T. Campbell; Hussain Anwar; Dan E. Browne

    2015-03-30

    Two-level quantum systems, qubits, are not the only basis for quantum computation. Advantages exist in using qudits, d-level quantum systems, as the basic carrier of quantum information. We show that colour codes---a class of topological quantum codes with remarkable transversality properties---can be generalised to the qudit paradigm. In recent developments it was found that in three spatial dimensions a qubit colour code can support a transversal non-Clifford gate, and that in higher spatial dimensions additional non-Clifford gates can be found, saturating Bravyi and König's bound [Phys. Rev. Lett. 110, 170503 (2013)]. Furthermore, by using gauge fixing techniques, an effective set of Clifford gates can be achieved, removing the need for state distillation. We show that the qudit colour code can support the qudit analogues of these gates, and show that in higher spatial dimensions a colour code can support a phase gate from higher levels of the Clifford hierarchy which can be proven to saturate Bravyi and König's bound in all but a finite number of special cases. The methodology used is a generalisation of Bravyi and Haah's method of triorthogonal matrices [Phys. Rev. A 86 052329 (2012)], which may be of independent interest. For completeness, we show explicitly that the qudit colour codes generalise to gauge colour codes, and share many of the favourable properties of their qubit counterparts.

  1. The Sign Rule and Beyond: Boundary Effects, Flexibility, and Noise Correlations in Neural Population Codes

    PubMed Central

    Hu, Yu; Zylberberg, Joel; Shea-Brown, Eric

    2014-01-01

    Over repeat presentations of the same stimulus, sensory neurons show variable responses. This “noise” is typically correlated between pairs of cells, and a question with rich history in neuroscience is how these noise correlations impact the population's ability to encode the stimulus. Here, we consider a very general setting for population coding, investigating how information varies as a function of noise correlations, with all other aspects of the problem – neural tuning curves, etc. – held fixed. This work yields unifying insights into the role of noise correlations. These are summarized in the form of theorems, and illustrated with numerical examples involving neurons with diverse tuning curves. Our main contributions are as follows. (1) We generalize previous results to prove a sign rule (SR) — if noise correlations between pairs of neurons have opposite signs vs. their signal correlations, then coding performance will improve compared to the independent case. This holds for three different metrics of coding performance, and for arbitrary tuning curves and levels of heterogeneity. This generality is true for our other results as well. (2) As also pointed out in the literature, the SR does not provide a necessary condition for good coding. We show that a diverse set of correlation structures can improve coding. Many of these violate the SR, as do experimentally observed correlations. There is structure to this diversity: we prove that the optimal correlation structures must lie on boundaries of the possible set of noise correlations. (3) We provide a novel set of necessary and sufficient conditions, under which the coding performance (in the presence of noise) will be as good as it would be if there were no noise present at all. PMID:24586128
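
    The sign rule summarized above can be illustrated numerically with the simplest possible case: two neurons with similar tuning (so their signal correlation is positive) and a noise covariance whose correlation coefficient is varied. Using linear Fisher information I = f'^T C^{-1} f' as the performance metric, negative noise correlation beats the independent case and positive correlation hurts it, consistent with the rule. The numbers and the choice of metric below are illustrative, not taken from the paper.

        # Numerical illustration of the sign rule via linear Fisher information I = f'^T C^{-1} f'.
        # Two similarly tuned neurons (positive signal correlation, f' = [1, 1]); values are illustrative.
        import numpy as np

        fprime = np.array([1.0, 1.0])            # stimulus derivatives of the two tuning curves
        for rho in (-0.3, 0.0, 0.3):             # noise correlation between the two neurons
            C = np.array([[1.0, rho], [rho, 1.0]])
            info = fprime @ np.linalg.solve(C, fprime)
            print(f"rho = {rho:+.1f}  ->  Fisher information = {info:.3f}")
        # rho = -0.3 gives ~2.857, rho = 0.0 gives 2.000, rho = +0.3 gives ~1.538:
        # noise correlations opposite in sign to the (positive) signal correlation improve coding.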

  2. Axisymmetric generalized harmonic evolution code

    SciTech Connect

    Sorkin, Evgeny [Max Planck Institute for Gravitational Physics (Albert Einstein Institute), Am Muehlenberg 1, D-14476, Golm (Germany)

    2010-04-15

    We describe the first axisymmetric numerical code based on the generalized harmonic formulation of the Einstein equations, which is regular at the axis. We test the code by investigating gravitational collapse of distributions of complex scalar field in a Kaluza-Klein spacetime. One of the key issues of the harmonic formulation is the choice of the gauge source functions, and we conclude that a damped-wave gauge is remarkably robust in this case. Our preliminary study indicates that evolution of regular initial data leads to formation both of black holes with spherical and cylindrical horizon topologies. Intriguingly, we find evidence that near threshold for black hole formation the number of outcomes proliferates. Specifically, the collapsing matter splits into individual pulses, two of which travel in the opposite directions along the compact dimension and one which is ejected radially from the axis. Depending on the initial conditions, a curvature singularity develops inside the pulses.

  3. Company profile: Big changes revive independent's profits

    SciTech Connect

    Tippee, B.

    1996-11-04

    In 4 years' time, American Exploration has changed from an aggressive acquirer and manager of producing properties for institutional investors into a geographically focused independent producer dedicated to making money by finding and producing oil and gas. Through its adaptations to unexpectedly stagnant oil prices, American Exploration reflects the type of top-to-bottom changes many independent producers have made to survive a brutal decade. It also demonstrates that an independent producer can prosper in the absence of ever-rising prices: the company reported net income of $3.9 million last year following a $54.8 million loss--much of it related to an accounting change--in 1994 and a string of losses before that. In an interview with Oil and Gas Journal, Andrews discussed his company's transformation and financial turnaround, his new appreciation for the balance between capital and technology, and future directions of his company and industry.

  4. Call that code! Simulated code practices.

    PubMed

    Ashby, L; Shepherd, B

    1990-01-01

    This article deals with the planning and implementation of simulated code blue practice sessions for licensed staff. Use of situational role play utilizing the actual equipment is emphasized. Strategies for implementing a program as well as strengths and weaknesses of one such program are discussed. PMID:2310470

  5. Pulse Code Modulation

    Microsoft Academic Search

    H. S. Black; J. O. Edson

    1947-01-01

    A radically new modulation technique for multichannel telephony has been developed which involves the conversion of speech waves into coded pulses. An 8-channel system employing pulse code modulation (PCM) and embodying these principles was produced. The method appears to have exceptional possibilities from the standpoint of freedom from interference, but its full significance in connection with future radio and wire
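
    The record above describes the core PCM idea: sample the speech waveform at a fixed rate, quantize each sample uniformly, and transmit a fixed number of bits per sample. The sketch below shows that pipeline for a test tone; the sample rate, bit depth, and names are illustrative, not drawn from the 1947 system.

        # Minimal sketch of the PCM idea: sample a waveform, quantize each sample uniformly,
        # and emit a fixed number of bits per sample. Parameters are illustrative.
        import math

        def pcm_encode(signal_fn, duration_s, fs=8000, bits=8):
            """Sample signal_fn (values in [-1, 1]) at fs Hz and return a list of 'bits'-bit codes."""
            levels = 2 ** bits
            codes = []
            for n in range(int(duration_s * fs)):
                x = signal_fn(n / fs)                          # sample the waveform
                q = int((x + 1.0) / 2.0 * (levels - 1) + 0.5)  # uniform quantization to [0, levels-1]
                codes.append(min(max(q, 0), levels - 1))
            return codes

        # Example: one millisecond of a 1 kHz tone, 8 bits per sample.
        tone = lambda t: math.sin(2 * math.pi * 1000 * t)
        print(pcm_encode(tone, 0.001)[:8])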

  6. Prostate Surgery Codes

    Cancer.gov

    Prostate C619 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Do not code an orchiectomy in this field. For prostate primaries, orchiectomies are coded in the data item “Hematologic Transplant and

  7. Cervix Uteri Surgery Codes

    Cancer.gov

    Cervi x Uteri C530–C539 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) [SEER Note: Do not code dilation and curettage (D&C) as Surgery of Primary Site for invasive cancers] Codes 00 None; no surgery

  8. Corpus Uteri Surgery Codes

    Cancer.gov

    Corpus Uteri C540–C559 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) [SEER Note: Do not code dilation and curettage (D&C) as Surgery of Primary Site for invasive cancers] Codes 00 None; no surgery

  9. Anus Surgery Codes

    Cancer.gov

    Anus C210–C218 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) [SEER Note: Do not code infrared coagulation as treatment.] Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor

  10. Code of Ethics

    ERIC Educational Resources Information Center

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  11. A Turbo Code Tutorial

    Microsoft Academic Search

    William E. Ryan

    1997-01-01

    We give a tutorial exposition of turbo codes and the associated algorithms. Included are a simple derivation for the performance of turbo codes, and a straightforward presentation of the iterative decoding algorithm. The derivations of both the performance estimate and the modified BCJR decoding algorithm are novel. The treatment is intended to be a launching point for further study in

  12. Lichenase and coding sequences

    DOEpatents

    Li, Xin-Liang (Athens, GA); Ljungdahl, Lars G. (Athens, GA); Chen, Huizhong (Lawrenceville, GA)

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-.beta.-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  13. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction, and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts. 3. Development of a code generator for performance prediction. 4. Automated partitioning. 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  14. RFQ simulation code

    SciTech Connect

    Lysenko, W.P.

    1984-04-01

    We have developed the RFQLIB simulation system to provide a means to systematically generate the new versions of radio-frequency quadrupole (RFQ) linac simulation codes that are required by the constantly changing needs of a research environment. This integrated system simplifies keeping track of the various versions of the simulation code and makes it practical to maintain complete and up-to-date documentation. In this scheme, there is a certain standard version of the simulation code that forms a library upon which new versions are built. To generate a new version of the simulation code, the routines to be modified or added are appended to a standard command file, which contains the commands to compile the new routines and link them to the routines in the library. The library itself is rarely changed. Whenever the library is modified, however, this modification is seen by all versions of the simulation code, which actually exist as different versions of the command file. All code is written according to the rules of structured programming. Modularity is enforced by not using COMMON statements, simplifying the relation of the data flow to a hierarchy diagram. Simulation results are similar to those of the PARMTEQ code, as expected, because of the similar physical model. Different capabilities, such as those for generating beams matched in detail to the structure, are available in the new code for help in testing new ideas in designing RFQ linacs.

  15. Operational RNA code for amino acids: species-specific aminoacylation of minihelices switched by a single nucleotide.

    PubMed Central

    Hipps, D; Shiba, K; Henderson, B; Schimmel, P

    1995-01-01

    The genetic code is based on aminoacylation reactions where specific amino acids are attached to tRNAs bearing anticodon trinucleotides. However, the anticodon-independent specific aminoacylation of RNA minihelix substrates by bacterial and yeast tRNA synthetases suggested an operational RNA code for amino acids whereby specific RNA sequences/structures in tRNA acceptor stems correspond to specific amino acids. Because of the possible significance of the operational RNA code for the development of the genetic code, we investigated aminoacylation of synthetic RNA minihelices with a human enzyme to understand the sequences needed for that aminoacylation compared with those needed for a microbial system. We show here that the species-specific aminoacylation of glycine tRNAs is recapitulated by a species-specific aminoacylation of minihelices. Although the mammalian and Escherichia coli minihelices differ at 6 of 12 base pairs, two of the three nucleotides essential for aminoacylation by the E. coli enzyme are conserved in the mammalian minihelix. The two conserved nucleotides were shown to be also important for aminoacylation of the mammalian minihelix by the human enzyme. A simple interchange of the differing nucleotide enabled the human enzyme to now charge the bacterial substrate and not the mammalian minihelix. Conversely, this interchange made the bacterial enzyme specific for the mammalian substrate. Thus, the positional locations (if not the actual nucleotides) for the operational RNA code for glycine appear conserved from bacteria to mammals. PMID:7539919

  16. Operational RNA code for amino acids: species-specific aminoacylation of minihelices switched by a single nucleotide.

    PubMed

    Hipps, D; Shiba, K; Henderson, B; Schimmel, P

    1995-06-01

    The genetic code is based on aminoacylation reactions where specific amino acids are attached to tRNAs bearing anticodon trinucleotides. However, the anticodon-independent specific aminoacylation of RNA minihelix substrates by bacterial and yeast tRNA synthetases suggested an operational RNA code for amino acids whereby specific RNA sequences/structures in tRNA acceptor stems correspond to specific amino acids. Because of the possible significance of the operational RNA code for the development of the genetic code, we investigated aminoacylation of synthetic RNA minihelices with a human enzyme to understand the sequences needed for that aminoacylation compared with those needed for a microbial system. We show here that the species-specific aminoacylation of glycine tRNAs is recapitulated by a species-specific aminoacylation of minihelices. Although the mammalian and Escherichia coli minihelices differ at 6 of 12 base pairs, two of the three nucleotides essential for aminoacylation by the E. coli enzyme are conserved in the mammalian minihelix. The two conserved nucleotides were shown to be also important for aminoacylation of the mammalian minihelix by the human enzyme. A simple interchange of the differing nucleotide enabled the human enzyme to now charge the bacterial substrate and not the mammalian minihelix. Conversely, this interchange made the bacterial enzyme specific for the mammalian substrate. Thus, the positional locations (if not the actual nucleotides) for the operational RNA code for glycine appear conserved from bacteria to mammals. PMID:7539919

  17. Seismic analysis of piping systems subjected to independent-support excitation

    SciTech Connect

    Subudhi, M.; Bezler, P.

    1983-01-01

    This paper presents a comparison of dynamic responses of piping systems subject to independent-support excitation using the response spectrum and time-history methods. The BNL finite-element computer code PSAFE2 has been used to perform all the analyses. The time-history method combines both the inertia as well as static effect on the piping responses due to independent-support excitations at each time point, thus representing the actual responses. A sample problem is analyzed subjected to two independent support excitations and the results are presented in comparison with the response spectrum methods with uniform or independent-support motion.

  18. Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; DuPrie, K.; Berriman, B.; Hanisch, R. J.; Mink, J.; Teuben, P. J.

    2013-10-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, is a free on-line registry for source codes of interest to astronomers and astrophysicists. The library is housed on the discussion forum for Astronomy Picture of the Day (APOD) and can be accessed at http://ascl.net. The ASCL has a comprehensive listing that covers a significant number of the astrophysics source codes used to generate results published in or submitted to refereed journals and continues to grow. The ASCL currently has entries for over 500 codes; its records are citable and are indexed by ADS. The editors of the ASCL and members of its Advisory Committee were on hand at a demonstration table in the ADASS poster room to present the ASCL, accept code submissions, show how the ASCL is starting to be used by the astrophysics community, and take questions on and suggestions for improving the resource.

  19. Evolving genetic code

    PubMed Central

    OHAMA, Takeshi; INAGAKI, Yuji; BESSHO, Yoshitaka; OSAWA, Syozo

    2008-01-01

    In 1985, we reported that a bacterium, Mycoplasma capricolum, used a deviant genetic code, namely UGA, a “universal” stop codon, was read as tryptophan. This finding, together with the deviant nuclear genetic codes in not a few organisms and a number of mitochondria, shows that the genetic code is not universal, and is in a state of evolution. To account for the changes in codon meanings, we proposed the codon capture theory stating that all the code changes are non-disruptive without accompanied changes of amino acid sequences of proteins. Supporting evidence for the theory is presented in this review. A possible evolutionary process from the ancient to the present-day genetic code is also discussed. PMID:18941287

  20. Associations between children’s independent mobility and physical activity

    PubMed Central

    2014-01-01

    Background Independent mobility describes the freedom of children to travel and play in public spaces without adult supervision. The potential benefits for children are significant, such as social interactions with peers, spatial and traffic safety skills and increased physical activity. Yet, the health benefits of independent mobility, particularly on physical activity accumulation, are largely unexplored. This study aimed to investigate associations of children’s independent mobility with light, moderate-to-vigorous, and total physical activity accumulation. Methods In 2011-2012, 375 Australian children aged 8-13 years (62% girls) were recruited into a cross-sectional study. Children’s independent mobility (i.e. independent travel to school and non-school destinations, independent outdoor play) and socio-demographics were assessed through child and parent surveys. Physical activity intensity was measured objectively through an Actiheart monitor worn on four consecutive days. Associations between independent mobility and physical activity variables were analysed using generalized linear models, accounting for clustered sampling, Actiheart wear time, socio-demographics, and assessing interactions by sex. Results Independent travel (walking, cycling, public transport) to school and non-school destinations was not associated with light, moderate-to-vigorous and total physical activity. However, sub-analyses revealed a positive association between independent walking and cycling (excluding public transport) to school and total physical activity, but only in boys (b = 36.03, p …). Independent outdoor play (three or more days per week) was positively associated with light and total physical activity (b = 29.76, p …); no association was found between independent outdoor play and moderate-to-vigorous physical activity. When assessing differences by sex, the observed significant associations of independent outdoor play with light and total physical activity remained in girls but not in boys. All other associations showed no significant differences by sex. Conclusions Independent outdoor play may boost children’s daily physical activity levels, predominantly at light intensity. Hence, facilitating independent outdoor play could be a viable intervention strategy to enhance physical activity in children, particularly in girls. Associations between independent travel and physical activity are inconsistent overall and require further investigation. PMID:24476363

  1. An eye-tracking study of how color coding affects multimedia learning

    Microsoft Academic Search

    Erol Ozcelik; Türkan Karakus; Engin Kursun; Kursat Cagiltay

    2009-01-01

    Color coding has been proposed to promote more effective learning. However, insufficient evidence currently exists to show how color coding leads to better learning. The goal of this study was to investigate the underlying cause of the color coding effect by utilizing eye movement data. Fifty-two participants studied either a color-coded or conventional format of multimedia instruction. Eye movement data

  2. An Eye-Tracking Study of How Color Coding Affects Multimedia Learning

    ERIC Educational Resources Information Center

    Ozcelik, Erol; Karakus, Turkan; Kursun, Engin; Cagiltay, Kursat

    2009-01-01

    Color coding has been proposed to promote more effective learning. However, insufficient evidence currently exists to show how color coding leads to better learning. The goal of this study was to investigate the underlying cause of the color coding effect by utilizing eye movement data. Fifty-two participants studied either a color-coded or…

  3. Experimental Measurement-Device-Independent Entanglement Detection

    NASA Astrophysics Data System (ADS)

    Nawareg, Mohamed; Muhammad, Sadiq; Amselem, Elias; Bourennane, Mohamed

    2015-02-01

    Entanglement is one of the most puzzling features of quantum theory and of great importance for the new field of quantum information. Determining whether a given state is entangled or not is one of the most challenging open problems of the field. Here we report on the experimental demonstration of measurement-device-independent (MDI) entanglement detection using the witness method for general two-qubit photon polarization systems. In the MDI setting, there is no requirement to assume perfect implementations, nor to trust the measurement devices. This experimental demonstration can be generalized for the investigation of properties of quantum systems and for the realization of cryptography and communication protocols.

  4. Experimental measurement-device-independent entanglement detection.

    PubMed

    Nawareg, Mohamed; Muhammad, Sadiq; Amselem, Elias; Bourennane, Mohamed

    2015-01-01

    Entanglement is one of the most puzzling features of quantum theory and of great importance for the new field of quantum information. Determining whether a given state is entangled or not is one of the most challenging open problems of the field. Here we report on the experimental demonstration of measurement-device-independent (MDI) entanglement detection using the witness method for general two-qubit photon polarization systems. In the MDI setting, there is no requirement to assume perfect implementations, nor to trust the measurement devices. This experimental demonstration can be generalized for the investigation of properties of quantum systems and for the realization of cryptography and communication protocols. PMID:25649664

  5. Experimental Measurement-Device-Independent Entanglement Detection

    PubMed Central

    Nawareg, Mohamed; Muhammad, Sadiq; Amselem, Elias; Bourennane, Mohamed

    2015-01-01

    Entanglement is one of the most puzzling features of quantum theory and of great importance for the new field of quantum information. Determining whether a given state is entangled or not is one of the most challenging open problems of the field. Here we report on the experimental demonstration of measurement-device-independent (MDI) entanglement detection using the witness method for general two-qubit photon polarization systems. In the MDI setting, there is no requirement to assume perfect implementations, nor to trust the measurement devices. This experimental demonstration can be generalized for the investigation of properties of quantum systems and for the realization of cryptography and communication protocols. PMID:25649664

  6. The Independent Technical Analysis Process

    SciTech Connect

    Duberstein, Corey A.; Ham, Kenneth D.; Dauble, Dennis D.; Johnson, Gary E.

    2007-04-13

    The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. In the past, regional parties have interacted with a single entity, the Fish Passage Center, to access the data, analyses, and coordination related to fish passage. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities.

  7. Local covariance and background independence

    E-print Network

    Klaus Fredenhagen; Katarzyna Rejzner

    2011-02-11

    One of the many conceptual difficulties in the development of quantum gravity is the role of a background geometry for the structure of quantum field theory. To some extent the problem can be solved by the principle of local covariance. The principle of local covariance was originally imposed in order to restrict the renormalization freedom for quantum field theories on generic spacetimes. It turned out that it can also be used to implement the request of background independence. Locally covariant fields then arise as background independent entities.

  8. Progress in cultivation-independent phyllosphere microbiology

    PubMed Central

    Müller, Thomas; Ruppel, Silke

    2014-01-01

    Most microorganisms of the phyllosphere are nonculturable in commonly used media and culture conditions, as are those in other natural environments. This review queries the reasons for their ‘noncultivability’ and assesses developments in phyllosphere microbiology that have been achieved cultivation-independently over the last 4 years. Analyses of total microbial communities have revealed a comprehensive microbial diversity. 16S rRNA gene amplicon sequencing and metagenomic sequencing were applied to investigate plant species, location and season as variables affecting the composition of these communities. In continuation of culture-based enzymatic and metabolic studies with individual isolates, metaproteogenomic approaches reveal a great potential to study the physiology of microbial communities in situ. Culture-independent microbiological technologies as well as advances in plant genetics and biochemistry provide methodological preconditions for exploring the interactions between plants and their microbiome in the phyllosphere. Improving and combining cultivation and culture-independent techniques can contribute to a better understanding of phyllosphere ecology. This is essential, for example, to avoid human–pathogenic bacteria in plant food. PMID:24003903

  9. Report number codes

    SciTech Connect

    Nelson, R.N. (ed.)

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  10. New quantum MDS codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Wang, Liqi; Zhu, Shixin

    2015-03-01

    Quantum maximum-distance-separable (MDS) codes form an important class of quantum codes. It is very hard to construct quantum MDS codes with relatively large minimum distance. In this paper, based on classical constacyclic codes, we construct two classes of quantum MDS codes with parameters where , and with even, and where , and with odd. The quantum MDS codes exhibited here have parameters better than the ones available in the literature.

  11. Quantum convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Yan, Tingsu; Huang, Xinmei; Tang, Yuansheng

    2014-12-01

    In this paper, three families of quantum convolutional codes are constructed. The first one and the second one can be regarded as a generalization of Theorems 3, 4, 7 and 8 [J. Chen, J. Li, F. Yang and Y. Huang, Int. J. Theor. Phys., doi:10.1007/s10773-014-2214-6 (2014)], in the sense that we drop the constraint q ≡ 1 (mod 4). Furthermore, the second one and the third one attain the quantum generalized Singleton bound.

  12. Huffman coding in advanced audio coding standard

    NASA Astrophysics Data System (ADS)

    Brzuchalski, Grzegorz

    2012-05-01

    This article presents several hardware architectures of Advanced Audio Coding (AAC) Huffman noiseless encoder, its optimisations and working implementation. Much attention has been paid to optimise the demand of hardware resources especially memory size. The aim of design was to get as short binary stream as possible in this standard. The Huffman encoder with whole audio-video system has been implemented in FPGA devices.
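
    A generic software sketch of the Huffman codebook construction that underlies the AAC noiseless coding stage may help fix ideas; it is an illustration in Python, not the hardware architecture described in the article, and the toy coefficient stream is invented.

      import heapq
      from collections import Counter

      def huffman_codebook(symbols):
          """Build a prefix-free code from symbol frequencies (bottom-up merge)."""
          freq = Counter(symbols)
          if len(freq) == 1:  # degenerate case: a single distinct symbol
              return {next(iter(freq)): "0"}
          # Each heap entry: (weight, tie_breaker, {symbol: code_so_far})
          heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
          heapq.heapify(heap)
          counter = len(heap)
          while len(heap) > 1:
              w1, _, c1 = heapq.heappop(heap)
              w2, _, c2 = heapq.heappop(heap)
              merged = {s: "0" + code for s, code in c1.items()}
              merged.update({s: "1" + code for s, code in c2.items()})
              heapq.heappush(heap, (w1 + w2, counter, merged))
              counter += 1
          return heap[0][2]

      quantized = [0, 0, 1, 0, -1, 2, 0, 0, 1, 0]   # toy quantized coefficients
      book = huffman_codebook(quantized)
      bitstream = "".join(book[s] for s in quantized)
      print(book, len(bitstream), "bits")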

  13. Quantum error control codes

    E-print Network

    Abdelhamid Awad Aly Ahmed, Sala

    2008-10-10

    valid codeword in the codespace [30]. Shor demonstrated the first quantum error-correcting code [137]. The code encodes one qubit into nine qubits, and is able to correct one error and detect two errors. Shortly after, Gottesman [58], Steane [144], and... is the added noise. Then one can use the matrix H to exploit the error correction and detection capabilities of the code C: s = rH^T = (v + e)H^T = eH^T. (2.7) Based on the value of the syndrome s, one might be able to correct the received codeword r to the original...
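
    The syndrome relation quoted in the excerpt, s = rH^T = eH^T, can be illustrated with a small classical example; the [7,4] Hamming parity-check matrix below is a standard stand-in chosen for this sketch and is not taken from the thesis.

      import numpy as np

      # Columns of H are the binary representations of 1..7 (row 0 = LSB),
      # so the syndrome directly names the erroneous bit position.
      H = np.array([[1, 0, 1, 0, 1, 0, 1],
                    [0, 1, 1, 0, 0, 1, 1],
                    [0, 0, 0, 1, 1, 1, 1]])

      v = np.array([1, 0, 1, 1, 0, 1, 0])   # a valid codeword: vH^T = 0 (mod 2)
      assert not (H @ v % 2).any()

      e = np.zeros(7, dtype=int)
      e[4] = 1                               # single-bit error at position 5
      r = (v + e) % 2                        # received word

      s = H @ r % 2                          # syndrome depends only on the error
      assert np.array_equal(s, H @ e % 2)
      pos = s[0] * 1 + s[1] * 2 + s[2] * 4   # decode the error position
      print("syndrome:", s, "-> flip bit", pos)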

  14. Coding with side information

    E-print Network

    Cheng, Szeming

    2005-11-01

    Engineering CODING WITH SIDE INFORMATION A Dissertation by SZE MING CHENG Submitted to Texas A&M University in partial fulfillment of the requirements for the degree of DOCTOR OF PHILOSOPHY Approved as to style and content by: Zixiang Xiong (Chair of Committee) ... B.S., University of Hong Kong; M.S., Hong Kong University of Science and Technology; M.S., University of Hawaii Chair of Advisory Committee: Zixiang Xiong Source coding and channel coding are two important problems in communications. Although side information...

  15. Perceptually-Based Adaptive JPEG Coding

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Rosenholtz, Ruth; Null, Cynthia H. (Technical Monitor)

    1996-01-01

    An extension to the JPEG standard (ISO/IEC DIS 10918-3) allows spatial adaptive coding of still images. As with baseline JPEG coding, one quantization matrix applies to an entire image channel, but in addition the user may specify a multiplier for each 8 x 8 block, which multiplies the quantization matrix, yielding the new matrix for the block. MPEG 1 and 2 use much the same scheme, except there the multiplier changes only on macroblock boundaries. We propose a method for perceptual optimization of the set of multipliers. We compute the perceptual error for each block based upon DCT quantization error adjusted according to contrast sensitivity, light adaptation, and contrast masking, and pick the set of multipliers which yield maximally flat perceptual error over the blocks of the image. We investigate the bitrate savings due to this adaptive coding scheme and the relative importance of the different sorts of masking on adaptive coding.
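
    As a toy illustration of "maximally flat perceptual error", the sketch below assumes the perceptual error of a block scales linearly with its multiplier and simply rescales each block to a common target; the real method uses DCT quantization error adjusted for contrast sensitivity, light adaptation, and masking, so both the error model and the multiplier bounds here are assumptions.

      import numpy as np

      rng = np.random.default_rng(1)
      base_err = rng.uniform(0.5, 2.0, size=16)             # per-block perceptual error at m = 1

      target = base_err.mean()                              # common error level to aim for
      multipliers = np.clip(target / base_err, 0.5, 4.0)    # illustrative bounds

      flattened = multipliers * base_err
      print(np.round(multipliers, 2))
      print("error spread before:", round(float(np.ptp(base_err)), 3),
            "after:", round(float(np.ptp(flattened)), 3))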

  16. Fault Tolerance with the Gauge Color Code

    E-print Network

    Benjamin J. Brown; Naomi H. Nickerson; Dan E. Browne

    2015-03-27

    The gauge color code is a quantum error-correcting code with local syndrome measurements that, remarkably, admits a universal transversal gate set without the need for resource-intensive magic state distillation. A result of recent interest, proposed by Bombín, shows that the subsystem structure of the gauge color code admits an error-correction protocol that achieves tolerance to noisy measurements without the need for repeated measurements, so-called single-shot error correction. Here, we demonstrate the promise of single-shot error correction by designing a two-part decoder and investigate its performance. We simulate fault-tolerant error correction with the gauge color code over long durations by repeatedly applying our proposed error-correction protocol. We estimate a sustainable error rate, i.e. the threshold for the long time limit, of ~0.31% for a phenomenological noise model using a simple decoding algorithm.

  17. Time-Independent Gravitational Fields

    E-print Network

    Robert Beig; Bernd G. Schmidt

    2000-05-13

    This article reviews, from a global point of view, rigorous results on time independent spacetimes. Throughout attention is confined to isolated bodies at rest or in uniform rotation in an otherwise empty universe. The discussion starts from first principles and is, as much as possible, self-contained.

  18. IEAB Independent Economic Analysis Board

    E-print Network

    Independent Economic Analysis Board review of the instream water supply components of the Salmon Creek Project, which would allow the district to continue water delivery to its users. The Proposed Project includes (1) improved water control and a water bank until the other project elements can be implemented. Most of the water needed to replace Salmon

  19. Odd Independent Transversals are Odd

    Microsoft Academic Search

    Penny E. Haxell; Tibor Szabó

    2006-01-01

    We put the final piece into a puzzle first introduced by Bollobás, Erdős and Szemerédi in 1975. For arbitrary positive integers n and r we determine the largest integer Δ = Δ(r, n) for which any r-partite graph with partite sets of size n and of maximum degree less than Δ has an independent transversal. This value was

  20. INDEPENDENT SPENT FUEL STORAGE INSTALLATION

    E-print Network

    For The; Diablo Canyon

    The staff of the U.S. Nuclear Regulatory Commission (NRC) has prepared this supplement to the Environmental Assessment (EA) and draft finding of no significant impact (FONSI) for the Diablo Canyon Independent Spent Fuel Storage Installation (ISFSI), at the direction of the Commission, in response

  1. Using Independent Component Analysis to Separate Signals in Climate Data

    SciTech Connect

    Fodor, I K; Kamath, C

    2003-01-28

    Global temperature series have contributions from different sources, such as volcanic eruptions and El Nino Southern Oscillation variations. We investigate independent component analysis as a technique to separate unrelated sources present in such series. We first use artificial data, with known independent components, to study the conditions under which ICA can separate the individual sources. We then illustrate the method with climate data from the National Centers for Environmental Prediction.
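
    The first step described above (testing ICA on artificial data with known independent components) can be mimicked in a few lines; scikit-learn's FastICA is used here only as a convenient ICA implementation, and the "ENSO-like" and "eruption-like" sources are invented stand-ins, not the paper's data.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      t = np.linspace(0, 10, 2000)
      s1 = np.sin(2 * np.pi * 0.3 * t)                      # slow oscillation
      s2 = np.where(rng.random(t.size) > 0.995, 1.0, 0.0)   # rare impulsive spikes
      S = np.c_[s1, s2]

      A = np.array([[1.0, 0.5], [0.7, 1.2]])   # "unknown" mixing matrix
      X = S @ A.T                               # observed mixed series

      S_hat = FastICA(n_components=2, random_state=0).fit_transform(X)

      # Recovery is only up to scale/order, so compare absolute correlations.
      corr = np.abs(np.corrcoef(S.T, S_hat.T)[:2, 2:])
      print(np.round(corr, 2))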

  2. An experimental investigation of clocking effects on turbine aerodynamics using a modern 3-D one and one-half stage high pressure turbine for code verification and flow model development

    NASA Astrophysics Data System (ADS)

    Haldeman, Charles Waldo, IV

    2003-10-01

    This research uses a modern 1 and 1/2 stage high-pressure (HP) turbine operating at the proper design corrected speed, pressure ratio, and gas to metal temperature ratio to generate a detailed data set containing aerodynamic, heat-transfer and aero-performance information. The data was generated using the Ohio State University Gas Turbine Laboratory Turbine Test Facility (TTF), which is a short-duration shock tunnel facility. The research program utilizes an uncooled turbine stage for which all three airfoils are heavily instrumented at multiple spans and on the HPV and LPV endwalls and HPB platform and tips. Heat-flux and pressure data are obtained using the traditional shock-tube and blowdown facility operational modes. Detailed examination shows that the aerodynamic (pressure) data obtained in the blowdown mode is the same as that obtained in the shock-tube mode when the corrected conditions are matched. Various experimental conditions and configurations were investigated, including LPV clocking positions, off-design corrected speed conditions, pressure ratio changes, and Reynolds number changes. The main research for this dissertation is concentrated on the LPV clocking experiments, where the LPV was clocked relative to the HPV at several different passage locations and at different Reynolds numbers. Various methods were used to evaluate the effect of clocking on both the aeroperformance (efficiency) and aerodynamics (pressure loading) on the LPV, including time-resolved measurements, time-averaged measurements and stage performance measurements. A general improvement in overall efficiency of approximately 2% is demonstrated and could be observed using a variety of independent methods. Maximum efficiency is obtained when the time-averaged pressures are highest on the LPV, and the time-resolved data both in the time domain and frequency domain show the least amount of variation. The gain in aeroperformance is obtained by integrating over the entire airfoil as the three-dimensional effects on the LPV surface are significant.

  3. Coding-Spreading Tradeoff in CDMA Systems

    NASA Astrophysics Data System (ADS)

    Bolas, Eduardo J.

    2002-09-01

    In this thesis we investigate the use of low rate codes primarily to provide the total bandwidth expansion required for a Code Division Multiple Access (CDMA) system. Comparing combinations of coding and spreading with a traditional DS-CDMA system, as defined in the IS-95 standard, allows the criteria to be defined for the best coding-spreading tradeoff in CDMA systems. The analysis of the coding-spreading tradeoff is divided into two parts. The first part is dedicated to the study of the deterministic components of the problem. This includes the different factors with non-random behavior that the system's designer can determine. The processing gain, the code characteristics and the number of users are well-defined variables that can determine the overall performance and can consequently affect the tradeoff. The second part of the study is dedicated to analyzing different combinations of coding and spreading with non-ideal channel estimation and interference reduction techniques. Small-scale fading channel conditions are emulated through the Nakagami-m distribution. Large-scale path loss was incorporated through the extended Hata model, while lognormal shadowing considered the fluctuations in the received power at points with the same distance to the transmitter. We assessed the performance of different combinations of coding and spreading considering two cases: a worst-case scenario in which the mobile user was located at the corner of a hexagonal cell in a seven-cell cluster, and a more realistic scenario in which the user could be physically located anywhere in the cell, following a uniform probability distribution function. Furthermore, we investigated the improvement in performance generated by interference reduction techniques, such as sectoring and power control.
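
    The basic arithmetic of the tradeoff is that, for a fixed total bandwidth expansion N (chips per information bit), lowering the code rate r leaves a smaller spreading factor SF = N * r; the numbers below are illustrative, not figures from the thesis.

      TOTAL_EXPANSION = 128          # fixed chips per information bit (assumed)

      for r in (1.0, 1 / 2, 1 / 3, 1 / 4, 1 / 8):
          sf = TOTAL_EXPANSION * r   # spreading factor left after channel coding
          print(f"code rate {r:5.3f} -> spreading factor {sf:6.1f}")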

  4. Soil, An Environmental Investigation.

    ERIC Educational Resources Information Center

    National Wildlife Federation, Washington, DC.

    This environmental unit is one of a series designed for integration within an existing curriculum. The unit is self-contained and requires minimal teacher preparation. The philosophy of the series is based on an experience-oriented process that encourages self-paced independent student work. This particular unit investigates soil in relation to…

  5. The STAGS computer code

    NASA Technical Reports Server (NTRS)

    Almroth, B. O.; Brogan, F. A.

    1978-01-01

    Basic information about the computer code STAGS (Structural Analysis of General Shells) is presented to describe to potential users the scope of the code and the solution procedures that are incorporated. Primarily, STAGS is intended for analysis of shell structures, although it has been extended to more complex shell configurations through the inclusion of springs and beam elements. The formulation is based on a variational approach in combination with local two dimensional power series representations of the displacement components. The computer code includes options for analysis of linear or nonlinear static stress, stability, vibrations, and transient response. Material as well as geometric nonlinearities are included. A few examples of applications of the code are presented for further illustration of its scope.

  6. Climate Code Foundation

    E-print Network

    Barnes, Nick; Jones, David

    2011-07-05

    Climate Code Foundation - who are we? A non-profit organisation founded in August 2010; our goal is to promote the public understanding of climate science, by increasing the visibility and clarity of the software used in climate science...

  7. FAST2 Code validation

    SciTech Connect

    Wilson, R.E.; Freeman, L.N.; Walker, S.N. [Oregon State Univ., Corvallis, OR (United States). Dept. of Mechanical Engineering

    1995-09-01

    The FAST2 Code which is capable of determining structural loads of a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data at two wind speeds for the ESI-80 are given. The FAST2 Code models a two-bladed HAWT with degrees of freedom for blade flap, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffness, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms and azimuth averaged bin plots. It is concluded that agreement between the FAST2 Code and test results is good.

  8. Topological subsystem codes

    E-print Network

    Bombin, Hector

    We introduce a family of two-dimensional (2D) topological subsystem quantum error-correcting codes. The gauge group is generated by two-local Pauli operators, so that two-local measurements are enough to recover the error ...

  9. No Code: Null Programs

    E-print Network

    Montfort, Nick

    2014-06-05

    To continue the productive discussion of uninscribed artworks in Craig Dworkin’s No Medium, this report discusses, in detail, those computer programs that have no code, and are thus empty or null. Several specific examples ...

  10. Electrocutaneous code pairs for artificial sensory communication systems.

    PubMed

    Szeto, A Y

    1982-01-01

    Pairs of electrocutaneous codes suitable for dual-channel sensory communication systems were compared using a dual-channel electrocutaneous tracking task. The tracking task required the test subject to dynamically respond to changes in the tactile sensation being modulated by two independent pseudorandom signals, one for each channel. The rule (or method) by which the signals changed the tactile sensations was called an electrocutaneous code. Four frequency variation codes and two intensity variation codes were paired in different combinations and then checked as to their effectiveness for sensory communications. The experimental protocol used a balanced incomplete block design which involved 24 subjects testing 3 of 8 code pairs each. Although the variance in the tracking performances between subjects was larger than the differences between the code pairs, learning rates for the various pairs were significantly different. The easiest one to learn was the Low Pulse Rate Modulation Code paired with itself. Other findings included the general superiority of monophasic stimulation code pairs over biphasic stimulation code pairs, the need for placement of the two electrodes on different dermatomes in order to achieve satisfactory dual-channel communications, and the greater sensitivity to electrocutaneous stimulation of the ventral side of the forearm versus its dorsal side. PMID:7171152

  11. Automated searching for quantum subsystem codes

    SciTech Connect

    Crosswhite, Gregory M.; Bacon, Dave [Department of Physics, University of Washington, Seattle, Washington 98195 (United States)

    2011-02-15

    Quantum error correction allows for faulty quantum systems to behave in an effectively error-free manner. One important class of techniques for quantum error correction is the class of quantum subsystem codes, which are relevant both to active quantum error-correcting schemes as well as to the design of self-correcting quantum memories. Previous approaches for investigating these codes have focused on applying theoretical analysis to look for interesting codes and to investigate their properties. In this paper we present an alternative approach that uses computational analysis to accomplish the same goals. Specifically, we present an algorithm that computes the optimal quantum subsystem code that can be implemented given an arbitrary set of measurement operators that are tensor products of Pauli operators. We then demonstrate the utility of this algorithm by performing a systematic investigation of the quantum subsystem codes that exist in the setting where the interactions are limited to two-body interactions between neighbors on lattices derived from the convex uniform tilings of the plane.

  12. Automated searching for quantum subsystem codes

    E-print Network

    Gregory M. Crosswhite; Dave Bacon

    2010-09-11

    Quantum error correction allows for faulty quantum systems to behave in an effectively error free manner. One important class of techniques for quantum error correction is the class of quantum subsystem codes, which are relevant both to active quantum error correcting schemes as well as to the design of self-correcting quantum memories. Previous approaches for investigating these codes have focused on applying theoretical analysis to look for interesting codes and to investigate their properties. In this paper we present an alternative approach that uses computational analysis to accomplish the same goals. Specifically, we present an algorithm that computes the optimal quantum subsystem code that can be implemented given an arbitrary set of measurement operators that are tensor products of Pauli operators. We then demonstrate the utility of this algorithm by performing a systematic investigation of the quantum subsystem codes that exist in the setting where the interactions are limited to 2-body interactions between neighbors on lattices derived from the convex uniform tilings of the plane.

  13. Hour of Code

    NSDL National Science Digital Library

    Engineers from Google, Microsoft, Facebook, and Twitter

    2013-01-01

    This website offers students an opportunity to complete a one-hour computer coding tutorial. The tutorial includes twenty mazes with videos dispersed between them to introduce new coding concepts; the videos include personal stories from Bill Gates and Mark Zuckerberg. Once students complete this introduction they can move on to the next level of challenge. Teacher lesson plans are included for use in the classroom.

  14. Subband coding of images

    Microsoft Academic Search

    JOHN W. WOODS; S. O'Neil

    1986-01-01

    Subband coding has become quite popular for the source encoding of speech. This paper presents a simple yet efficient extension of this concept to the source coding of images. We specify the constraints for a set of two-dimensional quadrature mirror filters (QMF's) for a particular frequency-domain partition, and show that these constraints are satisfied by a separable combination of one-dimensional

  15. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  16. Savannah River experience using a Cause Coding Tree to identify the root cause of an incident

    SciTech Connect

    Paradies, M.W.; Busch, D.A.

    1986-01-01

    Incidents (or near misses) provide important information about plant performance and ways to improve that performance. Any particular incident may have several ''root causes'' that need to be addressed to prevent recurrence of the incident and thereby improve the safety of the plant. Also, by reviewing a large number of these incidents, one can identify trends in the root causes and generic concerns. A method has been developed at Savannah River Plant to systematically evaluate incidents, identify their root causes, record these root causes, and analyze the trends of these causes. By providing a systematic method to identify correctable root causes, the system helps the incident investigator to ask the right questions during the investigation. It also provides the independent safety analysis group and management with statistics that indicate existing and developing trouble spots. This paper describes the Savannah River Plant (SRP) Cause Coding Tree, and the differences between the SRP Tree and other systems used to analyze incidents. 2 refs., 14 figs.

  17. Prediction of rho-independent transcriptional terminators in Escherichia coli.

    PubMed

    Lesnik, E A; Sampath, R; Levene, H B; Henderson, T J; McNeil, J A; Ecker, D J

    2001-09-01

    A new algorithm called RNAMotif containing RNA structure and sequence constraints and a thermodynamic scoring system was used to search for intrinsic rho-independent terminators in the Escherichia coli K-12 genome. We identified all 135 reported terminators and 940 putative terminator sequences beginning no more than 60 nt away from the 3'-end of the annotated transcription units (TU). Putative and reported terminators with the scores above our chosen threshold were found for 37 of the 53 non-coding RNA TU and for almost 50% of the 2592 annotated protein-encoding TU, which correlates well with the number of TU expected to contain rho-independent terminators. We also identified 439 terminators that could function in a bi-directional fashion, servicing one gene on the positive strand and a different gene on the negative strand. Approximately 700 additional termination signals in non-coding regions (NCR) far away from the nearest annotated gene were predicted. This number correlates well with the excess number of predicted 'orphan' promoters in the NCR, and these promoters and terminators may be associated with as yet unidentified TU. The significant number of high scoring hits that occurred within the reading frame of annotated genes suggests that either an additional component of rho-independent terminators exists or that a suppressive mechanism to prevent unwanted termination remains to be discovered. PMID:11522828
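
    A greatly simplified stand-in for this kind of search is sketched below: it scans a DNA string for a GC-rich inverted repeat (the hairpin stem) followed by a T-rich tract (the U-tract of the transcript). The stem, loop and tail sizes, the GC threshold and the toy sequence are illustrative choices, not the RNAMotif descriptor or scoring system used in the paper.

      def revcomp(seq):
          return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

      def find_terminator_like(seq, stem=6, loop=4, tail=6, min_gc=0.6):
          """Return start positions of hairpin-plus-T-tract motifs."""
          hits = []
          for i in range(len(seq) - (2 * stem + loop + tail)):
              left = seq[i:i + stem]
              right = seq[i + stem + loop:i + 2 * stem + loop]
              tail_seq = seq[i + 2 * stem + loop:i + 2 * stem + loop + tail]
              gc = (left.count("G") + left.count("C")) / stem
              if right == revcomp(left) and gc >= min_gc and tail_seq.count("T") >= tail - 1:
                  hits.append(i)
          return hits

      demo = "ATATATGGCGCCTTTAGGCGCCTTTTTTATAT"   # toy sequence: one hairpin + T-tract
      print(find_terminator_like(demo))            # -> [6]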

  18. Effects of additional independent noise in binary composite hypothesis-testing problems

    Microsoft Academic Search

    Suat Bayram; Sinan Gezici

    2009-01-01

    Performance of some suboptimal detectors can be improved by adding independent noise to their observations. In this paper, the effects of adding independent noise to observations of a detector are investigated for binary composite hypothesis-testing problems in a generalized Neyman-Pearson framework. Sufficient conditions are derived to determine when performance of a detector can or cannot be improved via additional independent

  19. KENO-V code

    SciTech Connect

    Cramer, S.N.

    1984-01-01

    The KENO-V code is the current release of the Oak Ridge multigroup Monte Carlo criticality code development. The original KENO, with 16 group Hansen-Roach cross sections and P/sub 1/ scattering, was one of the first multigroup Monte Carlo codes and it and its successors have always been a much-used research tool for criticality studies. KENO-V is able to accept large neutron cross section libraries (a 218 group set is distributed with the code) and has a general P/sub N/ scattering capability. A supergroup feature allows execution of large problems on small computers, but at the expense of increased calculation time and system input/output operations. This supergroup feature is activated automatically by the code in a manner which utilizes as much computer memory as is available. The primary purpose of KENO-V is to calculate the system k/sub eff/, from small bare critical assemblies to large reflected arrays of differing fissile and moderator elements. In this respect KENO-V neither has nor requires the many options and sophisticated biasing techniques of general Monte Carlo codes.

  20. Coded Adaptive Linear Precoded Discrete Multitone Over PLC Channel

    E-print Network

    Muhammad, Fahad Syed; Hélard, Jean-François; Crussière, Matthieu

    2008-01-01

    Discrete multitone modulation (DMT) systems exploit the capabilities of orthogonal subcarriers to cope efficiently with narrowband interference, high frequency attenuations and multipath fadings with the help of simple equalization filters. The adaptive linear precoded discrete multitone (LP-DMT) system is based on classical DMT, combined with a linear precoding component. In this paper, we investigate the bit and energy allocation algorithm of an adaptive LP-DMT system taking into account the channel coding scheme. A coded adaptive LP-DMT system is presented in the power line communication (PLC) context with a loading algorithm which accommodates the channel coding gains in bit and energy calculations. The performance of a concatenated channel coding scheme, consisting of an inner Wei 4-dimensional 16-state trellis code and an outer Reed-Solomon code, in combination with the proposed algorithm is analyzed. Theoretical coding gains are derived and simulation results are presented for a fixed target bit error ra...
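
    A hedged sketch of how a coding gain can be folded into a per-subcarrier bit-loading rule is given below; it uses the common gap-approximation form, and the gap, coding gain and SNR values are assumptions, not the algorithm or figures of the paper.

      import math

      SNR_GAP_DB = 9.8        # uncoded SNR gap for the target error rate (assumed)
      CODING_GAIN_DB = 5.0    # gain credited to the concatenated code (assumed)

      def bits_per_subcarrier(snr_db, max_bits=10):
          effective_gap_db = SNR_GAP_DB - CODING_GAIN_DB
          snr_lin = 10 ** ((snr_db - effective_gap_db) / 10)
          b = math.floor(math.log2(1 + snr_lin))
          return max(0, min(b, max_bits))

      subcarrier_snrs_db = [28.0, 21.5, 15.0, 9.0, 3.0]   # illustrative PLC SNRs
      print([bits_per_subcarrier(s) for s in subcarrier_snrs_db])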

  1. Quantum error-correcting codes over mixed alphabets

    NASA Astrophysics Data System (ADS)

    Wang, Zhuo; Yu, Sixia; Fan, Heng; Oh, C. H.

    2013-08-01

    We study the quantum error-correcting codes over mixed alphabets to deal with a more complicated and practical situation in which the physical systems for encoding may have different numbers of energy levels. In particular we investigate their constructions and propose the theory of quantum Singleton bound. Two kinds of code constructions are presented: a projection-based construction for general case and a graphical construction based on a graph-theoretical object composite coding clique dealing with the case of reducible alphabets. We find out some optimal one-error correcting or detecting codes over two alphabets. Our method of composite coding clique also sheds light on constructing standard quantum error-correcting codes, and other families of optimal codes are found.

  2. Ubiquitin-Independent Proteasomal Degradation

    PubMed Central

    Erales, Jenny; Coffino, Philip

    2013-01-01

    Most proteasome substrates are marked for degradation by ubiquitin conjugation, but some are targeted by other means. The properties of these exceptional cases provide insights into the general requirements for proteasomal degradation. Here the focus is on three ubiquitin-independent substrates that have been the subject of detailed study. These are Rpn4, a transcriptional regulator of proteasome homeostasis, thymidylate synthase, an enzyme required for production of DNA precursors and ornithine decarboxylase, the initial enzyme committed to polyamine biosynthesis. It can be inferred from these cases that proteasome association and the presence of an unstructured region are the sole prerequisites for degradation. Based on that inference, artificial substrates have been designed to test the proteasome's capacity for substrate processing and its limitations. Ubiquitin-independent substrates may in some cases be a remnant of the pre-ubiquitome world, but in other cases could provide optimized regulatory solutions. PMID:23684952

  3. Bit-Interleaved Coded Multiple Beamforming with Perfect Coding

    E-print Network

    Ayanoglu, Ender

    ... or to enhance the performance [3]. However, spatial multiplexing without channel coding results in the loss of full diversity. When channel coding is added, both full diversity and full multiplexing can be achieved as long as the code rate satisfies the required condition. A new technique, bit-interleaved coded multiple beamforming with perfect coding (BICMB-PC), is introduced. BICMB-PC transmits perfect space-time block codes (PSTBCs) through convolutional coded SVD systems. Similarly to BICMB-FP, BICMB

  4. Representing Group Codes as Permutation Codes Ezio Biglieri

    E-print Network

    Karlof, John K.

    ... the usual definition of equivalent codes. We show that every group code is (weakly) equivalent to a permutation code ... components each). Define the number of bits per dimension carried by the constellation as r = (log2 M)/N ... (e.g., "orbit codes" [8]) instead of Slepian's "group codes.").

  5. The impact of time step definition on code convergence and robustness

    NASA Technical Reports Server (NTRS)

    Venkateswaran, S.; Weiss, J. M.; Merkle, C. L.

    1992-01-01

    We have implemented preconditioning for multi-species reacting flows in two independent codes, an implicit (ADI) code developed in-house and the RPLUS code (developed at LeRC). The RPLUS code was modified to work on a four-stage Runge-Kutta scheme. The performance of both the codes was tested, and it was shown that preconditioning can improve convergence by a factor of two to a hundred depending on the problem. Our efforts are currently focused on evaluating the effect of chemical sources and on assessing how preconditioning may be applied to improve convergence and robustness in the calculation of reacting flows.

  6. Independent Components of Magnetoencephalography: Localization

    Microsoft Academic Search

    Akaysha C. Tang; Barak A. Pearlmutter; Natalie A. Malaszenko; Dan B. Phung; Bethany C. Reeb

    2002-01-01

    We applied second-order blind identification (SOBI), an independent component analysis (ICA) method, to MEG data collected during cognitive tasks. We explored SOBI's ability to help isolate underlying neuronal sources with relatively poor signal-to-noise ratios, allowing their identification and localization. We compare localization of the SOBI-separated components to localization from unprocessed sensor signals, using an equivalent current dipole (ECD) modeling method.

  7. Caspase-Independent Mitotic Death

    Microsoft Academic Search

    Katsumi Kitagawa

    The spindle checkpoint ensures proper chromosomal segregation by monitoring kinetochore–microtubule attachment. A failure of this checkpoint causes aneuploidy, which leads to tumorigenesis. The cell death that prevents the aneuploidy caused by failure of the spindle checkpoint is yet unknown. We have identified a novel type of mitotic cell death, which we term caspase-independent mitotic death (CIMD). When BUB1 but not

  8. Simulations with the COREDIV code of DEMO discharges

    NASA Astrophysics Data System (ADS)

    Zagórski, R.; Ivanova-Stanik, R. I.; Stankiewicz, R.

    2013-07-01

    The reduction in divertor target power load due to radiation of sputtered and externally seeded impurities in tokamak fusion reactors is investigated. The approach is based on integrated numerical modelling of DEMO discharges using the COREDIV code, which self-consistently solves 1D radial transport equations of plasma and impurities in the core region and 2D multifluid transport in the SOL. Calculations are performed for inductive DEMO scenarios and for DEMO steady-state configurations with tungsten walls and Ar or Ne seeding. For all considered DEMO scenarios significant fusion power can be achieved. Increase in seeded impurity influx leads to the reduction in fusion power and Q-factor (defined as the ratio of fusion power to auxiliary heating power) due to plasma dilution. Total radiation appears to be almost independent of the puffing level and is dominated by core radiation (>90%). The radiation due to seeding impurity is small and the type of seeded impurity weakly affects the results. For pulsed DEMO concepts, the accessible seeding level is limited. There is no steady-state solution for stronger puffing. The solution terminates due to helium accumulation, and if confirmed by more detailed investigations, might strongly affect DEMO design.

  9. TRIM: TR independent multislice imaging.

    PubMed

    Fautz, Hans-Peter; Paul, Dominik; Scheffler, Klaus; Hennig, Jürgen

    2004-06-01

    This article introduces a novel concept to overcome the dependence of image contrast on spatial positioning parameters such as the number of slices and slice separation in multislice measurements: TR-independent multislice (TRIM) acquisition allows the number of slices in a single measurement to remain independent of the repetition time TR. Ramped TRIM (rTRIM) allows the distance between the sections excited in each repetition to remain independent of the distance between the reconstructed slices. Even images from overlapping slices can be acquired without crosstalk between the images of adjacent slices due to spatially overlapping excitation profiles. This concept is based on a special reordering scheme: Within a single TR acquisition, steps are only taken from a fraction of all slices. This necessitates attribution of different phase-encoding steps to different slices within each repetition cycle. The reordering scheme can be derived by the use of a design matrix. The imaging properties of the technique are discussed theoretically and illustrated by a point spread function analysis based on simulations and phantom measurements. Potential sources of artifacts are identified and methods for their prevention are developed. Optimized implementations with different T(1)-weighted sequences such as spin echo (SE), turbo spin echo (TSE), and spoiled gradient echo acquisitions are shown on normal volunteers with imaging parameters used in routine diagnosis. PMID:15170845

  10. CB12319OCT12 MailCode505D

    E-print Network

    Call the Engagement Center to speak with a nurse. Krames StayWell is an independent company partnering with Blue Cross Blue Shield (bcbsm.com; Mail Code 505D, 600 E. Lafayette Blvd., Detroit, MI 48226-2998). To get started, open the Health and Wellness tab and select BlueHealthConnection.

  11. The APS SASE FEL : modeling and code comparison.

    SciTech Connect

    Biedron, S. G.

    1999-04-20

    A self-amplified spontaneous emission (SASE) free-electron laser (FEL) is under construction at the Advanced Photon Source (APS). Five FEL simulation codes were used in the design phase: GENESIS, GINGER, MEDUSA, RON, and TDA3D. Initial comparisons between each of these independent formulations show good agreement for the parameters of the APS SASE FEL.

  12. Adaptive differential pulse-code modulation with adaptive bit allocation

    Microsoft Academic Search

    E. D. Frangoulis; K. Yoshida; L. F. Turner

    1984-01-01

    Studies have been conducted regarding the possibility of obtaining good-quality speech at data rates in the range from 16 kbit/s to 32 kbit/s. The techniques considered are related to adaptive predictive coding (APC) and adaptive differential pulse-code modulation (ADPCM). At 16 kbit/s adaptive transform coding (ATC) has also been used. The present investigation is concerned with a new method

  13. Performance limits of coded diversity methods for transmitter antenna arrays

    Microsoft Academic Search

    Aradhana Narula; Mitchell D. Trott; Gregory W. Wornell

    1999-01-01

    Several aspects of the design and optimization of coded multiple-antenna transmission diversity methods for slowly time-varying channels are explored from an information-theoretic perspective. Both optimized vector-coded systems, which can achieve the maximum possible performance, and suboptimal scalar-coded systems, which reduce complexity by exploiting suitably designed linear precoding, are investigated. The achievable rates and associated outage characteristics of these spatial

  14. Independence Sequences of Well-Covered Graphs: Non-Unimodality and the Roller-Coaster Conjecture

    Microsoft Academic Search

    T. S. Michael; William N. Traves

    2003-01-01

    A graph G is well-covered provided each maximal independent set of vertices has the same cardinality. The term s_k of the independence sequence (s_0, s_1, …, s_a) equals the number of independent k-sets of vertices of G. We investigate constraints on the linear orderings of the terms of the independence sequence of well-covered graphs. In particular,

  15. Architecture independent performance characterization and benchmarking for scientific applications

    SciTech Connect

    Strohmaier, Erich; Shan, Hongzhang

    2004-08-31

    A simple, tunable, synthetic benchmark with a performance directly related to applications would be of great benefit to the scientific computing community. In this paper, we present a novel approach to develop such a benchmark. The initial focus of this project is on data access performance of scientific applications. First a hardware independent characterization of code performance in terms of address streams is developed. The parameters chosen to characterize a single address stream are related to regularity, size, spatial, and temporal locality. These parameters are then used to implement a synthetic benchmark program that mimics the performance of a corresponding code. To test the validity of our approach we performed experiments using five test kernels on six different platforms. The performance of most of our test kernels can be approximated by a single synthetic address stream. However in some cases overlapping two address streams is necessary to achieve a good approximation.
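
    The idea of a tunable synthetic address stream can be sketched as follows; the parameter names (working set, stride, reuse probability) and the generation rule are illustrative assumptions standing in for the paper's characterization, not its actual definition.

      import random

      def synthetic_stream(length, working_set, stride, reuse_prob, seed=0):
          """Yield addresses with tunable spatial (stride) and temporal
          (reuse_prob) locality inside a working set of `working_set` words."""
          rng = random.Random(seed)
          addr = 0
          recent = []
          for _ in range(length):
              if recent and rng.random() < reuse_prob:
                  addr = rng.choice(recent)              # temporal reuse
              else:
                  addr = (addr + stride) % working_set   # regular strided walk
              recent = (recent + [addr])[-32:]           # small reuse window
              yield addr

      stream = list(synthetic_stream(1000, working_set=1 << 16, stride=8, reuse_prob=0.2))
      print(stream[:10])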

  16. TACO: a finite element heat transfer code

    SciTech Connect

    Mason, W.E. Jr.

    1980-02-01

    TACO is a two-dimensional implicit finite element code for heat transfer analysis. It can perform both linear and nonlinear analyses and can be used to solve either transient or steady state problems. Either plane or axisymmetric geometries can be analyzed. TACO has the capability to handle time or temperature dependent material properties and materials may be either isotropic or orthotropic. A variety of time and temperature dependent loadings and boundary conditions are available including temperature, flux, convection, and radiation boundary conditions and internal heat generation. Additionally, TACO has some specialized features such as internal surface conditions (e.g., contact resistance), bulk nodes, enclosure radiation with view factor calculations, and chemical reactive kinetics. A user subprogram feature allows for any type of functional representation of any independent variable. A bandwidth and profile minimization option is also available in the code. Graphical representation of data generated by TACO is provided by a companion post-processor named POSTACO. The theory on which TACO is based is outlined, the capabilities of the code are explained, the input data required to perform an analysis with TACO are described. Some simple examples are provided to illustrate the use of the code.

  17. High-Fidelity Coding with Correlated Neurons

    PubMed Central

    da Silveira, Rava Azeredo; Berry, Michael J.

    2014-01-01

    Positive correlations in the activity of neurons are widely observed in the brain. Previous studies have shown these correlations to be detrimental to the fidelity of population codes, or at best marginally favorable compared to independent codes. Here, we show that positive correlations can enhance coding performance by astronomical factors. Specifically, the probability of discrimination error can be suppressed by many orders of magnitude. Likewise, the number of stimuli encoded—the capacity—can be enhanced more than tenfold. These effects do not necessitate unrealistic correlation values, and can occur for populations with a few tens of neurons. We further show that both effects benefit from heterogeneity commonly seen in population activity. Error suppression and capacity enhancement rest upon a pattern of correlation. Tuning of one or several effective parameters can yield a limit of perfect coding: the corresponding pattern of positive correlation leads to a ‘lock-in’ of response probabilities that eliminates variability in the subspace relevant for stimulus discrimination. We discuss the nature of this pattern and we suggest experimental tests to identify it. PMID:25412463

  18. On decoding of multi-level MPSK modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Gupta, Alok Kumar

    1990-01-01

    The decoding problem of multi-level block modulation codes is investigated. The hardware design of a soft-decision Viterbi decoder for some short length 8-PSK block modulation codes is presented. An effective way to reduce the hardware complexity of the decoder by reducing the branch metric and path metric, using a non-uniform floating-point to integer mapping scheme, is proposed and discussed. The simulation results of the design are presented. The multi-stage decoding (MSD) of multi-level modulation codes is also investigated. The cases of soft-decision and hard-decision MSD are considered and their performance is evaluated for several codes of different lengths and different minimum squared Euclidean distances. It is shown that the soft-decision MSD reduces the decoding complexity drastically and is suboptimum. The hard-decision MSD further simplifies the decoding while still maintaining a reasonable coding gain over the uncoded system, if the component codes are chosen properly. Finally, some basic 3-level 8-PSK modulation codes using BCH codes as component codes are constructed and their coding gains are found for hard-decision multistage decoding.

  19. Lossless Source Coding Using Nested Error Correcting Codes

    Microsoft Academic Search

    Javad Haghighat; Walaa Hamouda; Mohammad Reza Soleymani

    2007-01-01

    We propose a tree-structured variable-length random binning scheme for lossless source coding. The existing source coding schemes based on turbo codes, low-density parity check codes, and repeat accumulate codes can be regarded as practical implementations of this random binning scheme. For sufficiently large data blocks, we show that the proposed scheme asymptotically achieves the entropy limit. We also derive the

  20. Bilinear sparse coding for invariant vision.

    PubMed

    Grimes, David B; Rao, Rajesh P N

    2005-01-01

    Recent algorithms for sparse coding and independent component analysis (ICA) have demonstrated how localized features can be learned from natural images. However, these approaches do not take image transformations into account. We describe an unsupervised algorithm for learning both localized features and their transformations directly from images using a sparse bilinear generative model. We show that from an arbitrary set of natural images, the algorithm produces oriented basis filters that can simultaneously represent features in an image and their transformations. The learned generative model can be used to translate features to different locations, thereby reducing the need to learn the same feature at multiple locations, a limitation of previous approaches to sparse coding and ICA. Our results suggest that by explicitly modeling the interaction between local image features and their transformations, the sparse bilinear approach can provide a basis for achieving transformation-invariant vision. PMID:15563747
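
    A minimal sketch of a bilinear generative model of the kind described above is shown below: a patch is synthesized from a sparse feature vector w and a transformation vector y acting jointly on a basis tensor B. The shapes and the random "learned" basis are purely illustrative, not the trained model from the paper.

      import numpy as np

      rng = np.random.default_rng(0)
      n_features, n_transforms, n_pixels = 8, 4, 64

      B = rng.normal(size=(n_features, n_transforms, n_pixels))   # stand-in basis
      w = np.zeros(n_features)
      w[[1, 5]] = 1.0          # sparse feature code (which features are present)
      y = np.zeros(n_transforms)
      y[2] = 1.0               # transformation state, e.g. a particular shift

      patch = np.einsum("i,j,ijp->p", w, y, B)   # bilinear synthesis
      print(patch.shape)                          # (64,) = a flattened 8x8 patch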

  1. Prioritized LT Codes

    NASA Technical Reports Server (NTRS)

    Woo, Simon S.; Cheng, Michael K.

    2011-01-01

    The original Luby Transform (LT) coding scheme is extended to account for data transmissions where some information symbols in a message block are more important than others. Prioritized LT codes provide unequal error protection (UEP) of data on an erasure channel by modifying the original LT encoder. The prioritized algorithm improves high-priority data protection without penalizing low-priority data recovery. Moreover, low-latency decoding is also obtained for high-priority data due to fast encoding. Prioritized LT codes only require a slight change in the original encoding algorithm, and no changes at all at the decoder. Hence, with a small complexity increase in the LT encoder, an improved UEP and low-decoding latency performance for high-priority data can be achieved. LT encoding partitions a data stream into fixed-sized message blocks each with a constant number of information symbols. To generate a code symbol from the information symbols in a message, the Robust-Soliton probability distribution is first applied in order to determine the number of information symbols to be used to compute the code symbol. Then, the specific information symbols are chosen uniform randomly from the message block. Finally, the selected information symbols are XORed to form the code symbol. The Prioritized LT code construction includes an additional restriction that code symbols formed by a relatively small number of XORed information symbols select some of these information symbols from the pool of high-priority data. Once high-priority data are fully covered, encoding continues with the conventional LT approach where code symbols are generated by selecting information symbols from the entire message block including all different priorities. Therefore, if code symbols derived from high-priority data experience an unusual high number of erasures, Prioritized LT codes can still reliably recover both high- and low-priority data. This hybrid approach decides not only "how to encode" but also "what to encode" to achieve UEP. Another advantage of the priority encoding process is that the majority of high-priority data can be decoded sooner since only a small number of code symbols are required to reconstruct high-priority data. This approach increases the likelihood that high-priority data is decoded first over low-priority data. The Prioritized LT code scheme achieves an improvement in high-priority data decoding performance as well as overall information recovery without penalizing the decoding of low-priority data, assuming high-priority data is no more than half of a message block. The cost is in the additional complexity required in the encoder. If extra computation resource is available at the transmitter, image, voice, and video transmission quality in terrestrial and space communications can benefit from accurate use of redundancy in protecting data with varying priorities.
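
    The encoding rule described above can be sketched compactly: degrees are drawn from a Robust-Soliton distribution, and code symbols of low degree draw their information symbols from the high-priority part of the message block until it is covered, after which selection reverts to the whole block. The parameter values (c, delta, the "low degree" cutoff) and the toy message are illustrative, not the reference implementation.

      import math
      import random

      def robust_soliton(k, c=0.1, delta=0.5):
          """Robust-Soliton degree weights over degrees 0..k (weight 0 at degree 0)."""
          s = c * math.log(k / delta) * math.sqrt(k)
          rho = [0.0, 1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
          tau = [0.0] * (k + 1)
          spike = max(1, min(k, int(round(k / s))))
          for d in range(1, spike):
              tau[d] = s / (k * d)
          tau[spike] = s * math.log(s / delta) / k
          z = sum(rho) + sum(tau)
          return [(rho[d] + tau[d]) / z for d in range(k + 1)]

      def lt_encode_prioritized(msg, n_high, n_code, low_degree=3, seed=0):
          """msg: XOR-able integer symbols; msg[:n_high] are high priority."""
          rng = random.Random(seed)
          k = len(msg)
          weights = robust_soliton(k)
          covered_high = set()
          code_symbols = []
          for _ in range(n_code):
              d = rng.choices(range(k + 1), weights=weights)[0] or 1
              if d <= low_degree and len(covered_high) < n_high:
                  pool = list(range(n_high))   # favor high-priority symbols first
              else:
                  pool = list(range(k))        # conventional LT selection
              idx = rng.sample(pool, min(d, len(pool)))
              covered_high.update(i for i in idx if i < n_high)
              value = 0
              for i in idx:
                  value ^= msg[i]              # XOR of the chosen information symbols
              code_symbols.append((tuple(idx), value))
          return code_symbols

      print(lt_encode_prioritized([1, 2, 3, 4, 5, 6, 7, 8], n_high=3, n_code=12)[:4])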

  2. COLD-SAT Dynamic Model Computer Code

    NASA Technical Reports Server (NTRS)

    Bollenbacher, G.; Adams, N. S.

    1995-01-01

    The COLD-SAT Dynamic Model (CSDM) computer code implements a six-degree-of-freedom, rigid-body mathematical model for simulating a spacecraft in orbit around Earth. It is used to investigate the flow dynamics and thermodynamics of subcritical cryogenic fluids in microgravity. The code consists of three parts: a translation model, a rotation model, and a slosh model. It is written in FORTRAN 77.

  3. Coding theorems for hybrid channels. II

    E-print Network

    A. A. Kuznetsova; A. S. Holevo

    2014-08-14

    The present work continues the investigation of the capacities of measurement (quantum-classical) channels in the most general setting, initiated in [HCT]. The proof of coding theorems is given for the classical capacity and the entanglement-assisted classical capacity of the measurement channel with arbitrary output alphabet, without assuming that the channel is given by a bounded operator-valued density.

  4. Improvements of embedded zerotree wavelet (EZW) coding

    Microsoft Academic Search

    Jin Li; Po-Yuen Cheng; C.-C. J. Kuo

    1995-01-01

    In this research, we investigate several improvements of embedded zerotree wavelet (EZW) coding. Topics addressed include the choice of wavelet transforms and boundary conditions, the use of an arithmetic coder and arithmetic context, and the design of the encoding order for effective embedding. The superior performance of our improvements is demonstrated with extensive experimental results.

  5. Allocentric coding: Spatial range and combination rules.

    PubMed

    Camors, D; Jouffrais, C; Cottereau, B R; Durand, J B

    2015-04-01

    When a visual target is presented with neighboring landmarks, its location can be determined both relative to the self (egocentric coding) and relative to these landmarks (allocentric coding). In the present study, we investigated (1) how allocentric coding depends on the distance between the targets and their surrounding landmarks (i.e. the spatial range) and (2) how allocentric and egocentric coding interact with each other across targets-landmarks distances (i.e. the combination rules). Subjects performed a memory-based pointing task toward previously gazed targets briefly superimposed (200ms) on background images of cluttered city landscapes. A variable portion of the images was occluded in order to control the distance between the targets and the closest potential landmarks within those images. The pointing responses were performed after large saccades and the reappearance of the images at their initial location. However, in some trials, the images' elements were slightly shifted (±3°) in order to introduce a subliminal conflict between the allocentric and egocentric reference frames. The influence of allocentric coding in the pointing responses was found to decrease with increasing target-landmarks distances, although it remained significant even at the largest distances (~10°). Interestingly, both the decreasing influence of allocentric coding and the concomitant increase in pointing responses variability were well captured by a Bayesian model in which the weighted combination of allocentric and egocentric cues is governed by a coupling prior. PMID:25749676
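
    For context, a generic reliability-weighted combination of egocentric and allocentric estimates (standard cue-combination form, shown only as an illustration and not the authors' specific coupling-prior model) is

        \hat{x} = w_a \hat{x}_a + (1 - w_a)\,\hat{x}_e,
        \qquad w_a = \frac{1/\sigma_a^2}{1/\sigma_a^2 + 1/\sigma_e^2},

    where the abstract's finding corresponds to the allocentric weight w_a shrinking as the target-landmark distance (and hence the allocentric uncertainty sigma_a) grows.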

  6. Sex-specific norms code face identity.

    PubMed

    Rhodes, Gillian; Jaquet, Emma; Jeffery, Linda; Evangelista, Emma; Keane, Jill; Calder, Andrew J

    2011-01-01

    Face identity aftereffects suggest that an average face, which is continuously updated by experience, functions as a norm for coding identity. Sex-contingent figural face aftereffects indicate that different norms are maintained for male and female faces but do not directly implicate them in coding identity. Here, we investigated whether sex-specific norms are used to code the identities of male and female faces or whether a generic, androgynous norm is used for all faces. We measured identity aftereffects for adapt-test pairs that were opposite relative to a sex-specific average and pairs that were opposite relative to an androgynous average. Identity aftereffects are generally larger for adapt-test pairs that lie opposite an average face, which functions as a norm for coding identity, than those that do not. Therefore, we reasoned that whichever average gives the larger aftereffect would be closer to the true psychological norm. Aftereffects were substantially and significantly larger for pairs that lie opposite a sex-specific than an androgynous average. This difference remained significant after correcting for differences in test trajectory length. These results indicate that, despite the common structure shared by all faces, identity is coded using sex-specific norms. We suggest that the use of category-specific norms may increase coding efficiency and help us discriminate thousands of faces despite their similarity as patterns. PMID:21199895

  7. Error coding simulations

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1993-01-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal-to-noise ratio needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
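
    As a small illustration of the error-detection element mentioned above, a bitwise CRC-16 computation (the CCITT polynomial 0x1021 and initial value 0xFFFF are shown as a common choice; the exact CCSDS parameters should be taken from the recommendation itself):

        def crc16(data: bytes, poly: int = 0x1021, init: int = 0xFFFF) -> int:
            # Process the message bit by bit, dividing by the generator polynomial.
            crc = init
            for byte in data:
                crc ^= byte << 8
                for _ in range(8):
                    crc = ((crc << 1) ^ poly) & 0xFFFF if crc & 0x8000 else (crc << 1) & 0xFFFF
            return crc

        frame = b"telemetry frame payload"
        check = crc16(frame)   # appended to the frame and recomputed at the receiver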

  8. Bit-interleaved coded modulation in linear dispersion coded MIMO system over spatially correlated Rician fading channel

    Microsoft Academic Search

    Yuan Li; P. H. W. Fung; Yan Wu; Sumei Sun

    2004-01-01

    A multiple-input multiple-output (MIMO) system with serially concatenated bit-interleaved coded modulation (BICM) and linear dispersion code (LDC) is investigated. LDC is a member of the family of linear space-time block codes (STBC). A tight upper bound for this system over a spatially correlated Rician fading channel is derived, based on the moment generation function (MGF) approach. We utilize this bound

  9. Linear block codes for block fading channels based on Hadamard matrices

    E-print Network

    Spyrou, Spyros

    2006-04-12

    We investigate the creation of linear block codes using Hadamard matrices for block fading channels. The aforementioned codes are very easy to find and have bounded cross correlation spectrum. The optimality is with respect to the metric...

  10. RAVE—a Detector-independent vertex reconstruction toolkit

    NASA Astrophysics Data System (ADS)

    Waltenberger, Wolfgang; Mitaroff, Winfried; Moser, Fabian

    2007-10-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available. VERTIGO = "vertex reconstruction toolkit and interface to generic objects".

  11. UNITED WAY OF ANCHORAGE CODE WEB ADDRESS United Way of Anchorage 71830 www.liveunitedanchorage.org

    E-print Network

    Pantaleone, Jim

    [Flattened listing (organization, donor code, web address); partially recoverable rows:] UNITED WAY OF ANCHORAGE: United Way of Anchorage, 71830, www.liveunitedanchorage.org; [name missing], 35694, www.hfhanchorage.org; Kids' Corps, Inc., 30881, kcialaska.org; Lutheran Social Services of Alaska Inc., [code and address missing]; [name missing], ywcaak.org. INDEPENDENT: Alaska Humane Society, 36007, www.adopt-a-cat.org; Alaska Marine Conservation

  12. JAERI (Japan Atomic Energy Research Institute)\\/US calculational benchmarks for nuclear data and codes intercomparison

    Microsoft Academic Search

    M. Youssef; J. Jung; M. Sawan; M. Nakagawa; T. Mori; T. Kosako

    1986-01-01

    Prior to analyzing the integral experiments performed at the Fast Neutron Source Facility at the Japan Atomic Energy Research Institute (JAERI), both U.S. and JAERI analysts have agreed on four calculational benchmark problems proposed by JAERI to intercompare results based on various codes and data bases used independently by both countries. To compare codes, the same data base is used

  13. Bit-Interleaved LDPC-Coded Modulation with Iterative Demapping and Decoding

    Microsoft Academic Search

    Qiuliang Xie; Kewu Peng; Jian Song; Zhixing Yang

    2009-01-01

    Bit-interleaved coded modulation (BICM) is a suboptimal scheme from the average mutual information (AMI) point of view due to independent demapping. However, this AMI loss can be neglected for signal constellations with Gray mapping at high coding rates. The AMI of an amplitude-phase shift keying (APSK) constrained additive white Gaussian noise (AWGN) channel is provided in this paper, which shows that

  14. Code for America

    NSDL National Science Digital Library

    How might we bring local governments together to make better cities? Why not try Code for America? This compelling organization works to "help residents and governments harness technology to solve community problems." Its work is supported by a range of organizations, including Google, the Kauffman Foundation, and ESRI. The site contains a number of topical sections, including Governments, Citizens, and Apps. The Governments area contains links to the ten cities that are utilizing the services of Code for America to create ambitious projects designed to connect citizens to their government. Projects have included work in Oakland and Honolulu, where citizen coders rewrote these cities' websites in one day each. Both Free Apps and Paid Apps can be found here and highlight a variety of compelling new projects, such as Adopt-A-Hydrant and the Jail Population Management Dashboard.

  15. A Comparative Study on Seismic Analysis of Bangladesh National Building Code (BNBC) with Other Building Codes

    NASA Astrophysics Data System (ADS)

    Bari, Md. S.; Das, T.

    2013-09-01

    The tectonic framework of Bangladesh and adjoining areas indicates that Bangladesh lies well within an active seismic zone. The after-effects of an earthquake are more severe in an underdeveloped and densely populated country like ours than in developed countries. The Bangladesh National Building Code (BNBC) was first established in 1993 to provide guidelines for the design and construction of new structures subject to earthquake ground motions in order to minimize the risk to life for all structures. A revision of BNBC 1993 is under way to bring it up to date with other international building codes. This paper aims at a comparison of the various provisions for seismic analysis given in the building codes of different countries. This comparison will give an idea of where our country stands when it comes to safety against earthquakes. Primarily, various seismic parameters in BNBC 2010 (draft) have been studied and compared with those of BNBC 1993. Later, both the 1993 and 2010 editions of BNBC have been compared graphically with building codes of other countries, such as the National Building Code of India 2005 (NBC-India 2005) and American Society of Civil Engineers 7-05 (ASCE 7-05). The base shear/weight ratios have been plotted against the height of the building. The investigation in this paper reveals that BNBC 1993 has the least base shear among all the codes. Factored base shear values of BNBC 2010 are found to have increased significantly compared with those of BNBC 1993 for low-rise buildings (≤20 m) around the country. Despite the revision of the code, BNBC 2010 (draft) still suggests lower base shear values than the Indian and American codes. Therefore, the increase in the factor of safety against earthquakes imposed by the proposed BNBC 2010 code, through its higher base shear values, is appreciable.

  16. Distributed Turbo Product Codes with Multiple Vertical Parities

    NASA Astrophysics Data System (ADS)

    Obiedat, Esam A.; Chen, Guotai; Cao, Lei

    2009-12-01

    We propose a Multiple Vertical Parities Distributed Turbo Product Code (MVP-DTPC) over a cooperative network using block Bose-Chaudhuri-Hocquenghem (BCH) codes as component codes. The source broadcasts extended BCH coded frames to the destination and nearby relays. After decoding the received sequences, each relay constructs a product code by arranging the corrected bit sequences in rows and re-encoding them vertically using BCH as component codes to obtain an Incremental Redundancy (IR) for the source's data. To obtain independent vertical parities from each relay in the same code space, we propose a new Circular Interleaver for the source's data; different circular interleavers are used to interleave the BCH rows before re-encoding vertically. Maximum A Posteriori Probability (MAP) decoding is achieved by applying maximum transfer of extrinsic information between the multiple decoding stages. This is employed in the modified turbo product decoder, which is proposed to cope with multiple parities. The a posteriori output from a vertical decoding stage is used to derive the soft extrinsic information, which is used as a priori input for the next horizontal decoding stage. Simulation results in an Additive White Gaussian Noise (AWGN) channel using network scenarios show a 0.3-0.5 dB gain improvement in Bit Error Rate (BER) performance over non-cooperative Turbo Product Codes (TPC).
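
    A minimal sketch of one possible circular (cyclic-shift) interleaver of the kind a relay could apply before re-encoding vertically; the row-level shift and all parameters here are illustrative assumptions rather than the construction defined in the paper:

        def circular_interleave(rows, offset):
            # Rotate the row order by `offset` positions so that different relays
            # re-encode the same data in different orders, giving distinct parities.
            n = len(rows)
            return [rows[(i + offset) % n] for i in range(n)]

        rows = [[1, 0, 1], [0, 1, 1], [1, 1, 0], [0, 0, 1]]   # decoded BCH rows at a relay
        relay_1_order = circular_interleave(rows, offset=1)
        relay_2_order = circular_interleave(rows, offset=2)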

  17. Preliminary Results from the Application of Automated Adjoint Code Generation to CFL3D

    NASA Technical Reports Server (NTRS)

    Carle, Alan; Fagan, Mike; Green, Lawrence L.

    1998-01-01

    This report describes preliminary results obtained using an automated adjoint code generator for Fortran to augment a widely-used computational fluid dynamics flow solver to compute derivatives. These preliminary results with this augmented code suggest that, even in its infancy, the automated adjoint code generator can accurately and efficiently deliver derivatives for use in transonic Euler-based aerodynamic shape optimization problems with hundreds to thousands of independent design variables.

  18. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option will produce the most effective results.

  19. Coding Theory and Projective Spaces

    NASA Astrophysics Data System (ADS)

    Silberstein, Natalia

    2008-05-01

    The projective space of order n over a finite field F_q is the set of all subspaces of the vector space F_q^{n}. In this work, we consider error-correcting codes in the projective space, focusing mainly on constant dimension codes. We start with the different representations of subspaces in the projective space. These representations involve matrices in reduced row echelon form, associated binary vectors, and Ferrers diagrams. Based on these representations, we provide a new formula for the computation of the distance between any two subspaces in the projective space. We examine lifted maximum rank distance (MRD) codes, which are nearly optimal constant dimension codes. We prove that a lifted MRD code can be represented in such a way that it forms a block design known as a transversal design. The incidence matrix of the transversal design derived from a lifted MRD code can be viewed as a parity-check matrix of a linear code in the Hamming space. We find the properties of these codes, which can also be viewed as LDPC codes. We present new bounds and constructions for constant dimension codes. First, we present a multilevel construction for constant dimension codes, which can be viewed as a generalization of the lifted MRD code construction. This construction is based on a new type of rank-metric codes, called Ferrers diagram rank-metric codes. Then we derive upper bounds on the size of constant dimension codes which contain the lifted MRD code, and provide a construction of two families of codes that attain these upper bounds. We generalize the well-known concept of a punctured code to codes in the projective space to obtain large codes which are not of constant dimension. We present efficient enumerative encoding and decoding techniques for the Grassmannian. Finally, we describe a search method for constant dimension lexicodes.
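
    For reference, the usual subspace metric on the projective space (the standard distance measure, not necessarily the new formula referred to above) is

        d(U, V) = \dim(U) + \dim(V) - 2\,\dim(U \cap V), \qquad U, V \subseteq \mathbb{F}_q^{\,n}.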

  20. Pulse code modulation telemetry - Properties of various binary modulation types

    Microsoft Academic Search

    E. L. Law

    1982-01-01

    The present investigation is concerned with a comparison of the performance of methods for the transmission of digital data, taking into account aspects of performance under simulated range conditions. Attention is given to radio frequency spectra, bit error rate performance, peak carrier deviation, premodulation filtering, receiver IF bandpass filtering, receiver\\/demodulator video bandwidth, pulse code modulation (PCM) codes, phase shift keying,

  1. Code-Mixing as a Bilingual Instructional Strategy

    ERIC Educational Resources Information Center

    Jiang, Yih-Lin Belinda; García, Georgia Earnest; Willis, Arlette Ingram

    2014-01-01

    This study investigated code-mixing practices, specifically the use of L2 (English) in an L1 (Chinese) class in a U.S. bilingual program. Our findings indicate that the code-mixing practices made and prompted by the teacher served five pedagogical functions: (a) to enhance students' bilingualism and bilingual learning, (b) to review and…

  2. Programming & storytelling: opportunities for learning about coding & composition

    Microsoft Academic Search

    Quinn Burke; Yasmin B. Kafai

    2010-01-01

    The focus of this paper is to investigate how writing computer programs can help children develop their storytelling and creative writing abilities. The process of writing a program---coding---has long been considered only in terms of computer science, but such coding is also reflective of the imaginative and narrative elements of fiction writing workshops. Writing to program can also serve as

  3. Sensitivity and uncertainty studies of the CRAC2 computer code

    Microsoft Academic Search

    D. C. Kocher; R. C. Ward; G. G. Killough; D. E. Dunning, Jr.; B. B. Hicks; R. P. Hosker, Jr.; J. Y. Ku; K. S. Rao

    1985-01-01

    This report presents a study of the sensitivity of early fatalities, early injuries, latent cancer fatalities, and economic costs for hypothetical nuclear reactor accidents as predicted by the CRAC2 computer code (CRAC = Calculation of Reactor Accident Consequences) to uncertainties in selected models and parameters used in the code. The sources of uncertainty that were investigated in the CRAC2 sensitivity

  4. Automatic Coding of Dialogue Acts in Collaboration Protocols

    ERIC Educational Resources Information Center

    Erkens, Gijsbert; Janssen, Jeroen

    2008-01-01

    Although protocol analysis can be an important tool for researchers to investigate the process of collaboration and communication, the use of this method of analysis can be time consuming. Hence, an automatic coding procedure for coding dialogue acts was developed. This procedure helps to determine the communicative function of messages in online…

  5. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720AoI 13 as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical/analytical and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.

  6. Extended quantum color coding

    SciTech Connect

    Hayashi, A.; Hashimoto, T.; Horibe, M. [Department of Applied Physics, Fukui University, Fukui 910-8507 (Japan)

    2005-01-01

    The quantum color coding scheme proposed by Korff and Kempe [e-print quant-ph/0405086] is easily extended so that the color coding quantum system is allowed to be entangled with an extra auxiliary quantum system. It is shown that in the extended scheme we need only ≈2√N quantum colors to order N objects in large N limit, whereas ≈N/e quantum colors are required in the original nonextended version. The maximum success probability has asymptotics expressed by the Tracy-Widom distribution of the largest eigenvalue of a random Gaussian unitary ensemble (GUE) matrix.

  7. Extended quantum color coding

    NASA Astrophysics Data System (ADS)

    Hayashi, A.; Hashimoto, T.; Horibe, M.

    2005-01-01

    The quantum color coding scheme proposed by Korff and Kempe [e-print quant-ph/0405086] is easily extended so that the color coding quantum system is allowed to be entangled with an extra auxiliary quantum system. It is shown that in the extended scheme we need only ~2√N quantum colors to order N objects in large N limit, whereas ~N/e quantum colors are required in the original nonextended version. The maximum success probability has asymptotics expressed by the Tracy-Widom distribution of the largest eigenvalue of a random Gaussian unitary ensemble (GUE) matrix.

  8. Securing mobile code.

    SciTech Connect

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in depth analysis of a technique called 'white-boxing'. We put forth some new attacks and improvements on this method as well as demonstrating its implementation for various algorithms. We also examine cryptographic techniques to achieve obfuscation including encrypted functions and offer a new application to digital signature algorithms. To better understand the lack of security proofs for obfuscation techniques, we examine in detail general theoretical models of obfuscation. We explain the need for formal models in order to obtain provable security and the progress made in this direction thus far. Finally we tackle the problem of verifying remote execution. We introduce some methods of verifying remote exponentiation computations and some insight into generic computation checking.

  9. CONCEPT computer code

    SciTech Connect

    Delene, J.

    1984-01-01

    CONCEPT is a computer code that will provide conceptual capital investment cost estimates for nuclear and coal-fired power plants. The code can develop an estimate for construction at any point in time. Any unit size within the range of about 400 to 1300 MW electric may be selected. Any of 23 reference site locations across the United States and Canada may be selected. PWR, BWR, and coal-fired plants burning high-sulfur and low-sulfur coal can be estimated. Multiple-unit plants can be estimated. Costs due to escalation/inflation and interest during construction are calculated.

  10. The Code Project

    NSDL National Science Digital Library

    The Code Project is an online repository of free tutorials, source code, and articles about a wide variety of programming languages. Sections devoted to C++, HTML, DirectX, and .NET are among the resources available on the site. Discussion forums and message boards are excellent places for developers to get quick answers to their questions from other members of the community (this requires a short registration). Featured articles and industry news keep the site up-to-date. There are some advertisements on the site, but they do not detract from the content.

  11. High Performance “Reach” Codes

    E-print Network

    Edelson, J.

    2011-01-01

    Partner programs and references listed: Energy Trust of Oregon; National Grid; NSTAR; NYSERDA; USGBC-LEED Prescriptive Path; Western Massachusetts Electric; MA + New England Collaborative for High Performance Schools; Mass. modified and adopted as Stretch Code on May 12, 2009; codes to require compliance; incentives for leading programs. [Remaining slide residue, partially recoverable: 2009 IECC and pre-2009 IECC, 2012 IECC, net zero by 2030, Mass. Stretch Code supplement, Core Perfo...]

  12. Image coding by differential pulse code modulation and transform coding techniques, a comparative survey

    Microsoft Academic Search

    H.-J. Grallert; W. Tengler

    1984-01-01

    Differential pulse code modulation (DPCM) and transform coding techniques for digital image transmission are described, and their relative advantages are compared. For high-quality video transmission, both DPCM and transform coding were found to need sufficient data protection. Initial experiments showed that in transform coding the decomposition of the image into blocks results in residual errors that do not impair picture

  13. Space Shuttle Independent Assessment Team

    NASA Technical Reports Server (NTRS)

    2000-01-01

    The Shuttle program is one of the most complex engineering activities undertaken anywhere in the world at the present time. The Space Shuttle Independent Assessment Team (SIAT) was chartered in September 1999 by NASA to provide an independent review of the Space Shuttle sub-systems and maintenance practices. During the period from October through December 1999, the team led by Dr. McDonald and comprised of NASA, contractor, and DOD experts reviewed NASA practices, Space Shuttle anomalies, as well as civilian and military aerospace experience. In performing the review, much of a very positive nature was observed by the SIAT, not the least of which was the skill and dedication of the workforce. It is in the unfortunate nature of this type of review that the very positive elements are either not mentioned or dwelt upon. This very complex program has undergone a massive change in structure in the last few years with the transition to a slimmed down, contractor-run operation, the Shuttle Flight Operations Contract (SFOC). This has been accomplished with significant cost savings and without a major incident. This report has identified significant problems that must be addressed to maintain an effective program. These problems are described in each of the Issues, Findings or Observations summarized, and unless noted, appear to be systemic in nature and not confined to any one Shuttle sub-system or element. Specifics are given in the body of the report, along with recommendations to improve the present systems.

  14. Community care: the independent sector.

    PubMed Central

    Barodawala, S.

    1996-01-01

    The independent sector, which consists of the voluntary and private sectors, is a vital element in supporting older people in the community. The voluntary sector, coordinated by the Council for Voluntary Service and the National Council for Voluntary Organisations, provides a variety of services, including practical help, reassurance and companionship, and advice, information, campaigning, and advocacy. The private sector owns all of the nursing homes and most of the residential homes and is gradually becoming more involved with the provision of services to help support older people in their own homes. With this increase in size and importance of the independent sector over recent years, there is now a real need for greater communication between the private, voluntary, and statutory agencies in any one region. In some areas, forums made up of representatives of these various sectors meet to discuss relevant issues and construct local policies, thus allowing a more coordinated approach to the delivery of services. PMID:8819449

  15. Data compression preserving statistical independence

    NASA Technical Reports Server (NTRS)

    Morduch, G. E.; Rice, W. M.

    1973-01-01

    The purpose of this study was to determine the optimum points of evaluation of data compressed by means of polynomial smoothing. It is shown that a set Y of m statistically independent observations Y(t_1), Y(t_2), ..., Y(t_m) of a quantity X(t), which can be described by an (n-1)th-degree polynomial in time, may be represented by a set Z of n statistically independent compressed observations Z(tau_1), Z(tau_2), ..., Z(tau_n), such that the compressed set Z has the same information content as the observed set Y. The times tau_1, tau_2, ..., tau_n are the zeros of an nth-degree polynomial P_n, to whose definition and properties the bulk of this report is devoted. The polynomials P_n are defined as functions of the observation times t_1, t_2, ..., t_m, and it is interesting to note that if the observation times are continuously distributed, the polynomials P_n degenerate to Legendre polynomials. The proposed data compression scheme is a little more complex than those usually employed, but has the advantage of preserving all the information content of the original observations.
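
    A minimal numerical sketch of the idea (illustrative values only; the continuous-observation limit mentioned above is assumed, so the evaluation points are taken at the zeros of a Legendre polynomial mapped onto the observation interval):

        import numpy as np

        rng = np.random.default_rng(1)
        m, n = 50, 4                                  # m observations, degree n-1 polynomial
        t = np.linspace(0.0, 10.0, m)
        signal = 0.01 * t**3 + 0.2 * t**2 - 1.0 * t + 0.5
        y = signal + rng.normal(scale=0.1, size=m)    # independent observation noise

        coeffs = np.polyfit(t, y, deg=n - 1)          # polynomial smoothing (least squares)

        # Evaluate the smoothed polynomial at n points: here the zeros of P_n,
        # rescaled from [-1, 1] to the observation interval.
        zeros = np.polynomial.legendre.legroots([0] * n + [1])
        tau = 0.5 * (t[-1] - t[0]) * (zeros + 1.0) + t[0]
        z = np.polyval(coeffs, tau)                   # n compressed observations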

  16. Independent EEG sources are dipolar.

    PubMed

    Delorme, Arnaud; Palmer, Jason; Onton, Julie; Oostenveld, Robert; Makeig, Scott

    2012-01-01

    Independent component analysis (ICA) and blind source separation (BSS) methods are increasingly used to separate individual brain and non-brain source signals mixed by volume conduction in electroencephalographic (EEG) and other electrophysiological recordings. We compared results of decomposing thirteen 71-channel human scalp EEG datasets by 22 ICA and BSS algorithms, assessing the pairwise mutual information (PMI) in scalp channel pairs, the remaining PMI in component pairs, the overall mutual information reduction (MIR) effected by each decomposition, and decomposition 'dipolarity' defined as the number of component scalp maps matching the projection of a single equivalent dipole with less than a given residual variance. The least well-performing algorithm was principal component analysis (PCA); best performing were AMICA and other likelihood/mutual information based ICA methods. Though these and other commonly-used decomposition methods returned many similar components, across 18 ICA/BSS algorithms mean dipolarity varied linearly with both MIR and with PMI remaining between the resulting component time courses, a result compatible with an interpretation of many maximally independent EEG components as being volume-conducted projections of partially-synchronous local cortical field activity within single compact cortical domains. To encourage further method comparisons, the data and software used to prepare the results have been made available (http://sccn.ucsd.edu/wiki/BSSComparison). PMID:22355308

  17. Trellis Coded Spatial Modulation

    Microsoft Academic Search

    Raed Mesleh; Marco Di Renzo; Harald Haas; Peter M. Grant

    2010-01-01

    Trellis coded modulation (TCM) is a well known scheme that reduces power requirements without any bandwidth expansion. In TCM, only certain sequences of successive constellation points are allowed (mapping by set partitioning). The novel idea in this paper is to apply the TCM concept to the antenna constellation points of spatial modulation (SM). The aim is to enhance SM performance

  18. Breast Surgery Codes

    Cancer.gov

    Breast C500–C509 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 19 Local tumor destruction, NOS No specimen was sent to pathology for

  19. Differential pulse code modulation

    NASA Technical Reports Server (NTRS)

    Herman, C. F. (inventor)

    1976-01-01

    A differential pulse code modulation (DPCM) encoding and decoding method is described along with an apparatus which is capable of transmission with minimum bandwidth. The apparatus is not affected by data transition density, requires no direct current (DC) response of the transmission link, and suffers from minimal ambiguity in resolution of the digital data.

  20. The revised genetic code

    NASA Astrophysics Data System (ADS)

    Ninio, Jacques

    1990-03-01

    Recent findings on the genetic code are reviewed, including selenocysteine usage, deviations in the assignments of sense and nonsense codons, RNA editing, natural ribosomal frameshifts and non-orthodox codon-anticodon pairings. A multi-stage codon reading process is presented.

  1. Pharynx Surgery Codes

    Cancer.gov

    Pharynx Tonsil C090–C099, Oropharynx C100–C109, Nasopharynx C110–C119 Pyriform Sinus C129, Hypopharynx C130–C139, Pharynx C140 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery

  2. Larynx Surgery Codes

    Cancer.gov

    Larynx C320–C329 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor destruction, NOS 11 Photodynamic therapy (PDT) 12 Electrocautery;

  3. Code Games for Kids

    NSDL National Science Digital Library

    2013-01-01

    This interactive game for children promotes pattern recognition by identifying the missing numbers in a pattern displayed on a safe lock. Players drag number tiles into the gaps and if successful unlock the safe and move onto a different challenge. There are two practice rounds and then ten codes to crack.

  4. Kidney Surgery Codes

    Cancer.gov

    Kidney, Renal Pelvis, and Ureter Kidney C649, Renal Pelvis C659, Ureter C669 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor

  5. Liver Surgery Codes

    Cancer.gov

    Liver and Intrahepatic Bile Ducts C220–C221 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor destruction, NOS 11 Photodynamic

  6. Rectum Surgery Codes

    Cancer.gov

    Rectum C209 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Code removal/surgical ablation of single or multiple liver metastases under the data item Surgical Procedure/Other Site (NAACCR Item #1294)

  7. Colon Surgery Codes

    Cancer.gov

    Colon C180–C189 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Code removal/surgical ablation of single or multiple liver metastases under the data item Surgical Procedure/Other Site (NAACCR Item

  8. Esophagus Surgery Codes

    Cancer.gov

    Esophagus C150–C159 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor destruction, NOS 11 Photodynamic therapy (PDT) 12

  9. Parotid Surgery Codes

    Cancer.gov

    Parotid and Other Unspecified Glands Parotid Gland C079, Major Salivary Glands C080–C089 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY

  10. Colon Surgery Codes

    Cancer.gov

    Colon C180–C189 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Code removal/surgical ablation of single or multiple liver metastases under the data item Surgical Procedure/Other Site (NAACCR Item

  11. Stomach Surgery Codes

    Cancer.gov

    Stomach C160–C169 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor destruction, NOS 11 Photodynamic therapy (PDT) 12 Electrocautery;

  12. Bladder Surgery Codes

    Cancer.gov

    Bladder C670–C679 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor destruction, NOS 11 Photodynamic therapy (PDT) 12

  13. CODE OF FEDERAL REGULATIONS

    EPA Science Inventory

    The Code of Federal Regulations (CFR) is an annually revised codification of the general and permanent rules published in the Federal Register by the executive departments and agencies of the Federal Government. The CFR is divided into 50 titles which represent broad areas subje...

  14. Managing Code Inspection Information

    Microsoft Academic Search

    H. Jack Barnard; Arthur L. Price

    1994-01-01

    Inspection data is difficult to gather and interpret. At AT&T Bell Laboratories, the authors have defined nine key metrics that software project managers can use to plan, monitor, and improve inspections. Graphs of these metrics expose problems early and can help managers evaluate the inspection process itself. The nine metrics are: total noncomment lines of source code inspected in thousands

  15. Spleen Surgery Codes

    Cancer.gov

    Spleen C42.2 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 19 Local tumor destruction, NOS Unknown whether a specimen was sent to pathology

  16. Odor Coding Sensor

    NASA Astrophysics Data System (ADS)

    Hayashi, Kenshi

    Odor is one of the important sensing parameters for human life. However, odor has not been quantified by measuring instruments because of its vagueness. In this paper, the measurement of odor by odor coding, i.e., vector quantities of plural odor molecular information, and its applications are described.

  17. EuroCODE

    Microsoft Academic Search

    Patrick Firket; Karine Vermeylen

    1990-01-01

    EuroCODE is a free-of-charge computer network open to clinicians involved in cancer research. This network allows communication using an electronic mail facility, consultation of databases, and registration or randomization of patients in EORTC protocols via the online randomization system.

  18. Decode de Code

    NSDL National Science Digital Library

    In this activity, users must decode a scientific quote that has been encoded by the computer. The computer will generate an "alphabet" (either random or rotated) and then substitute every letter of the real quote with the computer generated alphabet's letter. To decode the code, you must look for patterns of letters and then substitute guesses for the real letters.

  19. Code Optimization Techniques

    SciTech Connect

    MAGEE,GLEN I.

    2000-08-03

    Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data will be lost. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
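
    One classic optimization of this kind (shown only as an illustration; it is not claimed to be one of the specific techniques used in the project) replaces each Galois-field multiplication inside a Reed-Solomon encoder with two table lookups; the primitive polynomial 0x11D below is a common but assumed choice:

        PRIM = 0x11D                      # primitive polynomial for GF(2^8) (assumption)
        EXP, LOG = [0] * 512, [0] * 256
        x = 1
        for i in range(255):              # build log/antilog tables once
            EXP[i] = x
            LOG[x] = i
            x <<= 1
            if x & 0x100:
                x ^= PRIM
        for i in range(255, 512):         # duplicate so exponent sums need no modulo
            EXP[i] = EXP[i - 255]

        def gf_mul(a: int, b: int) -> int:
            # Table-driven GF(256) multiply: two lookups and one add
            # instead of an eight-iteration shift-and-XOR loop.
            if a == 0 or b == 0:
                return 0
            return EXP[LOG[a] + LOG[b]]

        assert all(gf_mul(a, 1) == a for a in range(256))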

  20. Building Codes and Regulations.

    ERIC Educational Resources Information Center

    Fisher, John L.

    The hazard of fire is of great concern to libraries due to combustible books and new plastics used in construction and interiors. Building codes and standards can offer architects and planners guidelines to follow but these standards should be closely monitored, updated, and researched for fire prevention. (DS)

  1. Noncoherent multisource network coding

    Microsoft Academic Search

    Mahdi Jafari Siavoshani; Christina Fragouli; Suhas Diggavi

    2008-01-01

    We examine the problem of multiple sources transmitting information to one or more receivers that require the information from all the sources, over a network where the network nodes perform randomized network coding. We consider the noncoherent case, where neither the sources nor the receivers have any knowledge of the intermediate nodes operations. We formulate a model for this problem,

  2. Environmental Fluid Dynamics Code

    EPA Science Inventory

    The Environmental Fluid Dynamics Code (EFDC)is a state-of-the-art hydrodynamic model that can be used to simulate aquatic systems in one, two, and three dimensions. It has evolved over the past two decades to become one of the most widely used and technically defensible hydrodyn...

  3. Lung Surgery Codes

    Cancer.gov

    Lung C340–C349 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 19 Local tumor destruction or excision, NOS Unknown whether a specimen was

  4. Novel Business Uses of Independently Created Hyperlinks in the World Wide Web: Basic Mechanism and Examples

    Microsoft Academic Search

    Robert J. Schloss

    1996-01-01

    Advisory architecture permits on-the-fly delivery of hyperlinks (and rating codes and annotations) to World Wide Web content, where the links and the target content are created by one or more organizations independent of the organization that created the Web content the user is accessing. The user's Web browser or proxy server accesses these advisories through a protocol with a

  5. A benchmark study for glacial isostatic adjustment codes

    NASA Astrophysics Data System (ADS)

    Spada, G.; Barletta, V. R.; Klemann, V.; Riva, R. E. M.; Martinec, Z.; Gasperini, P.; Lund, B.; Wolf, D.; Vermeersen, L. L. A.; King, M. A.

    2011-04-01

    The study of glacial isostatic adjustment (GIA) is gaining an increasingly important role within the geophysical community. Understanding the response of the Earth to loading is crucial in various contexts, ranging from the interpretation of modern satellite geodetic measurements (e.g. GRACE and GOCE) to the projections of future sea level trends in response to climate change. Modern modelling approaches to GIA are based on various techniques that range from purely analytical formulations to fully numerical methods. Despite various teams independently investigating GIA, we do not have a suitably large set of agreed numerical results through which the methods may be validated; a community benchmark data set would clearly be valuable. Following the example of the mantle convection community, here we present, for the first time, the results of a benchmark study of codes designed to model GIA. This has taken place within a collaboration facilitated through European Cooperation in Science and Technology (COST) Action ES0701. The approaches benchmarked are based on significantly different codes and different techniques. The test computations are based on models with spherical symmetry and Maxwell rheology and include inputs from different methods and solution techniques: viscoelastic normal modes, spectral-finite elements and finite elements. The tests involve the loading and tidal Love numbers and their relaxation spectra, the deformation and gravity variations driven by surface loads characterized by simple geometry and time history and the rotational fluctuations in response to glacial unloading. In spite of the significant differences in the numerical methods employed, the test computations show a satisfactory agreement between the results provided by the participants.

  6. Development of the Code RITRACKS

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Cucinotta, Francis A.

    2013-01-01

    A document discusses the code RITRACKS (Relativistic Ion Tracks), which was developed to simulate heavy ion track structure at the microscopic and nanoscopic scales. It is a Monte-Carlo code that simulates the production of radiolytic species in water, event-by-event, and which may be used to simulate tracks and also to calculate dose in targets and voxels of different sizes. The dose deposited by the radiation can be calculated in nanovolumes (voxels). RITRACKS allows simulation of radiation tracks without the need of extensive knowledge of computer programming or Monte-Carlo simulations. It is installed as a regular application on Windows systems. The main input parameters entered by the user are the type and energy of the ion, the length and size of the irradiated volume, the number of ions impacting the volume, and the number of histories. The simulation can be started after the input parameters are entered in the GUI. The number of each kind of interactions for each track is shown in the result details window. The tracks can be visualized in 3D after the simulation is complete. It is also possible to see the time evolution of the tracks and zoom on specific parts of the tracks. The software RITRACKS can be very useful for radiation scientists to investigate various problems in the fields of radiation physics, radiation chemistry, and radiation biology. For example, it can be used to simulate electron ejection experiments (radiation physics).

  7. Neural Coding for Effective Rehabilitation

    PubMed Central

    2014-01-01

    Successful neurological rehabilitation depends on accurate diagnosis, effective treatment, and quantitative evaluation. Neural coding, a technology for interpretation of functional and structural information of the nervous system, has contributed to the advancements in neuroimaging, brain-machine interface (BMI), and design of training devices for rehabilitation purposes. In this review, we summarized the latest breakthroughs in neuroimaging from microscale to macroscale levels with potential diagnostic applications for rehabilitation. We also reviewed the achievements in electrocorticography (ECoG) coding with both animal models and human beings for BMI design, electromyography (EMG) interpretation for interaction with external robotic systems, and robot-assisted quantitative evaluation on the progress of rehabilitation programs. Future rehabilitation would be more home-based, automatic, and self-served by patients. Further investigations and breakthroughs are mainly needed in aspects of improving the computational efficiency in neuroimaging and multichannel ECoG by selection of localized neuroinformatics, validation of the effectiveness in BMI guided rehabilitation programs, and simplification of the system operation in training devices. PMID:25258708

  8. GYOTO: a new general relativistic ray-tracing code

    E-print Network

    Frederic H. Vincent; Thibaut Paumard; Eric Gourgoulhon; Guy Perrin

    2011-09-22

    GYOTO, a general relativistic ray-tracing code, is presented. It aims at computing images of astronomical bodies in the vicinity of compact objects, as well as trajectories of massive bodies in relativistic environments. This code is capable of integrating the null and timelike geodesic equations not only in the Kerr metric, but also in any metric computed numerically within the 3+1 formalism of general relativity. Simulated images and spectra have been computed for a variety of astronomical targets, such as a moving star or a toroidal accretion structure. The underlying code is open source and freely available. It is user-friendly, quickly handled and very modular so that extensions are easy to integrate. Custom analytical metrics and astronomical targets can be implemented in C++ plug-in extensions independent from the main code.

  9. Bayes Formula Tree Diagrams Weighing the Odds Independence Conditional Probability and Independence

    E-print Network

    Watkins, Joseph C.

    Lecture slides (16 pages) for Topic 6, Conditional Probability and Independence; the outline covers Bayes formula, tree diagrams, weighing the odds, and independence.
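
    For reference, the two standard results named in the outline are

        P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)},
        \qquad A \text{ and } B \text{ are independent iff } P(A \cap B) = P(A)\,P(B).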

  10. Constrained coding for the deep-space optical channel

    NASA Technical Reports Server (NTRS)

    Moision, B. E.; Hamkins, J.

    2002-01-01

    We investigate methods of coding for a channel subject to a large dead-time constraint, i.e. a constraint on the minimum spacing between transmitted pulses, with the deep-space optical channel as the motivating example.
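
    A generic illustration of how the noiseless capacity of such a minimum-spacing constraint can be computed (textbook constrained-coding machinery, not the authors' method; the dead-time is assumed to be expressed as d empty slots required between pulses):

        import numpy as np

        def dead_time_capacity(d: int) -> float:
            # Capacity in bits/slot of binary sequences whose pulses (1s) are
            # separated by at least d zeros: log2 of the largest eigenvalue of
            # the constraint-graph adjacency matrix.
            n = d + 1                         # state = slots since last pulse, capped at d
            A = np.zeros((n, n))
            for s in range(n):
                A[s, min(s + 1, d)] = 1.0     # emit a 0
            A[d, 0] = 1.0                     # emit a 1, allowed only after d empty slots
            return float(np.log2(max(abs(np.linalg.eigvals(A)))))

        for d in (1, 2, 5, 10):
            print(d, round(dead_time_capacity(d), 4))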

  11. 32 CFR 634.25 - Installation traffic codes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...DEPARTMENT OF THE ARMY (CONTINUED) LAW ENFORCEMENT AND CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC...code. (e) In States where traffic law violations are State criminal offenses, such laws are made applicable under the...

  12. 32 CFR 634.25 - Installation traffic codes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...DEPARTMENT OF THE ARMY (CONTINUED) LAW ENFORCEMENT AND CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC...code. (e) In States where traffic law violations are State criminal offenses, such laws are made applicable under the...

  13. 32 CFR 634.25 - Installation traffic codes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...DEPARTMENT OF THE ARMY (CONTINUED) LAW ENFORCEMENT AND CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC...code. (e) In States where traffic law violations are State criminal offenses, such laws are made applicable under the...

  14. Coding and scheduling optimization over packet erasure broadcast channels

    E-print Network

    Zeng, Weifei

    2012-01-01

    Throughput and per-packet delay can present strong trade-offs that are important in the cases of delay sensitive applications. In this thesis, we investigate such trade-offs using a random linear network coding scheme for ...

  15. Multi-stage decoding of multi-level modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao; Costello, Daniel J., Jr.

    1991-01-01

    Various types of multi-stage decoding for multi-level modulation codes are investigated. It is shown that if the component codes of a multi-level modulation code and the types of decoding at the various stages are chosen properly, high spectral efficiency and large coding gain can be achieved with reduced decoding complexity. In particular, it is shown that the difference in performance between the suboptimum multi-stage soft-decision maximum likelihood decoding of a modulation code and the single-stage optimum soft-decision decoding of the code is very small, only a fraction of a dB in signal-to-noise ratio at a bit error rate (BER) of 10^-6.

  16. Error Control Coding Techniques for Space and Satellite Communications

    NASA Technical Reports Server (NTRS)

    Costello, Daniel J., Jr.; Takeshita, Oscar Y.; Cabral, Hermano A.

    1998-01-01

    It is well known that the BER performance of a parallel concatenated turbo-code improves roughly as 1/N, where N is the information block length. However, it has been observed by Benedetto and Montorsi that for most parallel concatenated turbo-codes, the FER performance does not improve monotonically with N. In this report, we study the FER of turbo-codes, and the effects of their concatenation with an outer code. Two methods of concatenation are investigated: across several frames and within each frame. Some asymmetric codes are shown to have excellent FER performance with an information block length of 16384. We also show that the proposed outer coding schemes can improve the BER performance as well by eliminating pathological frames generated by the iterative MAP decoding process.

  17. Patterns of behavior of professionally managed and independent investors

    Microsoft Academic Search

    Zur Shapira; Itzhak Venezia

    2001-01-01

    In this paper, we analyze the investment patterns of a large number of clients of a major Israeli brokerage house during 1994. We compare the behavior of clients making independent investment decisions to that of investors whose accounts were managed by brokerage professionals. Our main objective is to investigate whether the disposition effect (i.e., the tendency to sell winners quicker

  18. A passive dynamic walking quadruped with independently movable legs

    Microsoft Academic Search

    Yasutake Yamada; Toshihiro Kawakatsu; Akio Ishiguro

    2004-01-01

    This study is intended to deal with a passive dynamic walking quadruped with independently movable legs. Since no current investigation exists about whether such robots can be created or not, we attempt to develop one in a synthetic manner. More specifically, we employed a genetic algorithm to optimize body parameters. Through this synthetic design process, we found that the robots
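
    The synthetic design loop described here, encoding candidate body parameters, evaluating them, and letting a genetic algorithm search the space, can be sketched as follows (the parameter vector, fitness function, and GA settings below are illustrative placeholders, not the ones used in the study):

      import random

      random.seed(1)
      N_PARAMS, POP, GENS = 6, 30, 40           # e.g. leg lengths, masses (placeholder)

      def fitness(params):
          """Placeholder: in the study this would be the simulated walking
          performance of a passive quadruped with these body parameters."""
          return -sum((p - 0.5) ** 2 for p in params)

      def mutate(params, sigma=0.05):
          return [min(1.0, max(0.0, p + random.gauss(0, sigma))) for p in params]

      def crossover(a, b):
          return [random.choice(pair) for pair in zip(a, b)]

      population = [[random.random() for _ in range(N_PARAMS)] for _ in range(POP)]
      for _ in range(GENS):
          population.sort(key=fitness, reverse=True)
          elite = population[: POP // 3]        # truncation selection
          children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                      for _ in range(POP - len(elite))]
          population = elite + children

      best = max(population, key=fitness)
      print("best body parameters:", [round(p, 3) for p in best])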

  19. The Role of the Independent Counsel. Web Lesson.

    ERIC Educational Resources Information Center

    Constitutional Rights Foundation, Los Angeles, CA.

    This lesson presents an overview of the origin and procedure of the Independent Counsel Statute enacted in 1978 by the U.S. Congress. The lesson explains that the statute was drafted to eliminate the conflict of interest that might arise when the Department of Justice is ordered to investigate important government figures and that it gives the…

  20. The Field Dependence-Independence Construct: Some, One or None.

    ERIC Educational Resources Information Center

    Linn, Marcia C.; Kyllonen, Patrick

    1981-01-01

    The relationship between cognitive restructuring and perception of the upright (tests of which may be used to measure field dependence-independence [FDI]) was investigated. Data analysis of 34 tests administered to high school seniors, including 12 measures of FDI, resulted in five dimensions, including two associated with FDI. (Author/AEF)

  1. The Field Dependence-Independence Construct: Some, One, or None.

    ERIC Educational Resources Information Center

    Linn, Marcia C.; Kyllonen, Patrick

    The field dependency/independency construct (FDI) was measured using tests of perception of the upright such as the Rod and Frame Test (RFT) and tests of cognitive restructuring such as the Hidden Figures Test (HFT); relationships between cognitive restructuring and perception of the upright were investigated. High school seniors received 34 tests…

  2. PDD Symptoms in ADHD, an Independent Familial Trait?

    ERIC Educational Resources Information Center

    Nijmeijer, J. S.; Hoekstra, P. J.; Minderaa, R. B.; Buitelaar, J. K.; Altink, M. E.; Buschgens, C. J. M.; Fliers, E. A.; Rommelse, N. N. J.; Sergeant, J. A.; Hartman, C. A.

    2009-01-01

    The aims of this study were to investigate whether subtle PDD symptoms in the context of ADHD are transmitted in families independent of ADHD, and whether PDD symptom familiality is influenced by gender and age. The sample consisted of 256 sibling pairs with at least one child with ADHD and 147 healthy controls, aged 5-19 years. Children who…

  3. Effective Teaching and Student Independence at Grade 12.

    ERIC Educational Resources Information Center

    Ayres, Paul; Sawyer, Wayne; Dinham, Steve

    This study investigated how five Australian teachers, who were considered to be exemplary in helping students develop independence, influenced and guided their students to extremely high grades in 12th grade. Teachers were observed teaching a lesson and then interviewed. The interviews asked them to identify successful outcomes of the lesson and…

  4. Consequences of Charge Independence for Nuclear Reactions Involving Photons

    Microsoft Academic Search

    Murray Gell-Mann; Valentine L. Telegdi

    1953-01-01

    Some effects of the charge independence of nuclear forces on the emission and absorption of photons by light nuclei are investigated. It is found that the selection rules governing the change of isotopic spin T in such transitions are of practical importance in nuclei with Tz=0, particularly the rule that E1 transitions without change of isotopic spin are forbidden. Two
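
    Written as a formula in the standard notation (the symbol ΔT for the change in isotopic spin is conventional, not taken from the paper), the selection rule of practical importance quoted above reads:

      \text{E1:}\quad \Delta T = 0, \pm 1, \qquad \text{with } \Delta T = 0 \text{ forbidden in nuclei having } T_z = 0 .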

  5. Glasgow Head Injury Outcome Prediction Program: an independent assessment

    Microsoft Academic Search

    Justin J Nissen; Patricia A Jones; David F Signorini; Lilian S Murray; Graham M Teasdale; J Douglas Miller

    1999-01-01

    Using an independent data set, the utility of the Glasgow Head Injury Outcome Prediction Program was investigated in terms of possible frequency of use and reliability of outcome prediction in patients with severe head injury, or haematoma requiring evacuation, or coma lasting 6 hours or more, in whom outcome had been reliably assessed at 6 to 24 months after injury.

  6. Independent Senior Women Who Travel Internationally: A Collective Case Study

    ERIC Educational Resources Information Center

    Jarrett, Barbara

    2010-01-01

    Nine independent women over age 55 who traveled internationally were investigated through a qualitative case study. The purpose of the study was to explore the women's attitudes, actions, and motivations during and after their international travel experiences. The adult, aging, experiential, and transformational theories of researchers such as…

  7. Distance Learning Enrollments in Independent Institutions. Feasibility Study.

    ERIC Educational Resources Information Center

    Washington State Higher Education Coordinating Board, Olympia.

    This study investigated the feasibility of collecting enrollment data on distance learning programs sponsored by private institutions within and outside of Washington State. E-commerce developments have allowed in-state independent providers and out-of-state public institutions to serve residents of Washington State, and many nontraditional…

  8. Obituary: Arthur Dodd Code (1923-2009)

    NASA Astrophysics Data System (ADS)

    Marché, Jordan D., II

    2009-12-01

    Former AAS president Arthur Dodd Code, age 85, passed away at Meriter Hospital in Madison, Wisconsin on 11 March 2009, from complications involving a long-standing pulmonary condition. Code was born in Brooklyn, New York on 13 August 1923, as the only child of former Canadian businessman Lorne Arthur Code and Jesse (Dodd) Code. An experienced ham radio operator, he entered the University of Chicago in 1940, but then enlisted in the U.S. Navy (1943-45) and was later stationed as an instructor at the Naval Research Laboratory, Washington, D.C. During the war, he gained extensive practical experience with the design and construction of technical equipment that served him well in years ahead. Concurrently, he took physics courses at George Washington University (some under the tutelage of George Gamow). In 1945, he was admitted to the graduate school of the University of Chicago, without having received his formal bachelor's degree. In 1950, he was awarded his Ph.D. for a theoretical study of radiative transfer in O- and B-type stars, directed by Subrahmanyan Chandrasekhar. He was then hired onto the faculty of the Department of Astronomy at the University of Wisconsin-Madison (1951-56). He then accepted a tenured appointment at the California Institute of Technology and the Mount Wilson and Palomar Observatories (1956-58). But following the launch of Sputnik, Code returned to Wisconsin in 1958 as full professor of astronomy, director of the Washburn Observatory, and department chairman so that he could more readily pursue his interest in space astronomy. That same year, he was chosen a member of the Space Science Board of the National Academy of Sciences (created during the International Geophysical Year) and shortly became one of five principal investigators of the original NASA Space Science Working Group. In a cogent 1960 essay, Code argued that astrophysical investigations, when conducted from beyond the Earth's atmosphere, "cannot fail to have a tremendous impact on the future course of stellar astronomy," a prediction strongly borne out in the decades that followed. In 1959, Code founded the Space Astronomy Laboratory (SAL) within the UW Department of Astronomy. Early photometric and spectrographic equipment was test-flown aboard NASA's X-15 rocket plane and Aerobee sounding rockets. Along with other SAL personnel, including Theodore E. Houck, Robert C. Bless, and John F. McNall, Code (as principal investigator) was responsible for the design of the Wisconsin Experiment Package (WEP) as one of two suites of instruments to be flown aboard the Orbiting Astronomical Observatory (OAO), which represented a milestone in the advent of space astronomy. With its seven reflecting telescopes feeding five filter photometers and two scanning spectrometers, WEP permitted the first extended observations in the UV portion of the spectrum. After the complete failure of the OAO-1 spacecraft (launched in 1966), OAO-2 was successfully launched on 7 December 1968 and gathered data on over a thousand celestial objects during the next 50 months, including stars, nebulae, galaxies, planets, and comets. These results appeared in a series of more than 40 research papers, chiefly in the Ap.J., along with the 1972 monograph, The Scientific Results from the Orbiting Astronomical Observatory (OAO-2), edited by Code. Between the OAO launches, other SAL colleagues of Code developed the Wisconsin Automatic Photoelectric Telescope (or APT), the first computer-controlled (or "robotic") telescope. Driven by a PDP-8 mini-computer, it routinely collected atmospheric extinction data. Code was also chosen principal investigator for the Wisconsin Ultraviolet Photo-Polarimeter Experiment (or WUPPE). This used a UV-sensitive polarimeter designed by Kenneth Nordsieck that was flown twice aboard the space shuttles in 1990 and 1995. Among other findings, WUPPE observations demonstrated that interstellar dust does not appreciably change the direction of polarization of starlight, thereby supporting its possible composition as graphite. Code was the recipie

  9. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

    This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. The following codes are considered: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code is described separately in the following sections, with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes; the codes have since been further developed to extend their capabilities.

  10. Statistics Investigations

    NSDL National Science Digital Library

    2013-01-01

    This webpage contains statistics investigations in the form of word problems. The investigations are located on the left hand side of the page on the navigation bar: the links are "Recommended Investigations" and "Additional Investigations". Within each investigation there are additional links to external resources that can be used to solve or illustrate the problem.

  11. Falcon Codes: Fast, Authenticated LT Codes Cornell Tech

    E-print Network

    Abstract fragments (extraction-damaged first page): "... while maintaining very fast encoding/decoding times. One variant Falcon code works well with small ... layered encoding ... Tornado codes [4] achieve encoding/decoding speeds that are 100 to 10,000 times faster ..."

  12. Code Patterns for Automatically Validating Requirements-to-Code Traces

    E-print Network

    Egyed, Alexander

    Code Patterns for Automatically Validating Requirements-to-Code Traces Achraf Ghabi Johannes Kepler University 4040 Linz, Austria achraf.ghabi@jku.at Alexander Egyed Johannes Kepler University 4040 Linz

  13. Computer-Based Coding of Occupation Codes for Epidemiological Analyses.

    PubMed

    Russ, Daniel E; Ho, Kwan-Yuet; Johnson, Calvin A; Friesen, Melissa C

    2014-05-01

    Mapping job titles to standardized occupation classification (SOC) codes is an important step in evaluating changes in health risks over time as measured in inspection databases. However, manual SOC coding is cost prohibitive for very large studies. Computer based SOC coding systems can improve the efficiency of incorporating occupational risk factors into large-scale epidemiological studies. We present a novel method of mapping verbatim job titles to SOC codes using a large table of prior knowledge available in the public domain that included detailed description of the tasks and activities and their synonyms relevant to each SOC code. Job titles are compared to our knowledge base to find the closest matching SOC code. A soft Jaccard index is used to measure the similarity between a previously unseen job title and the knowledge base. Additional information such as standardized industrial codes can be incorporated to improve the SOC code determination by providing additional context to break ties in matches. PMID:25221787
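
    The matching step described above, scoring a verbatim job title against knowledge-base phrases with a soft Jaccard index, can be sketched as follows (the difflib-based token similarity, the 0.85 threshold, and the miniature knowledge base are illustrative stand-ins, not the authors' actual components):

      from difflib import SequenceMatcher

      def token_sim(a, b):
          """Fuzzy similarity between two tokens, in [0, 1]."""
          return SequenceMatcher(None, a, b).ratio()

      def soft_jaccard(title, reference, threshold=0.85):
          """Soft Jaccard: tokens count as intersecting when their fuzzy
          similarity exceeds a threshold, so misspellings still match."""
          t1, t2 = set(title.lower().split()), set(reference.lower().split())
          if not t1 or not t2:
              return 0.0
          matched = sum(1 for a in t1 if any(token_sim(a, b) >= threshold for b in t2))
          return matched / (len(t1) + len(t2) - matched)

      knowledge_base = {                        # hypothetical miniature knowledge base
          "47-2111": "electrician electrical wiring installer",
          "29-1141": "registered nurse rn patient care",
          "53-3032": "truck driver heavy tractor trailer",
      }
      title = "electrcian apprentice"           # note the misspelling
      best = max(knowledge_base, key=lambda soc: soft_jaccard(title, knowledge_base[soc]))
      print(best)                               # 47-2111

    Additional context such as a standardized industrial code would enter as an extra term in the score, breaking ties between otherwise equally similar SOC candidates.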

  14. Program Codes as of 6/22/12 Code Description

    E-print Network

    Qiu, Weigang

    Program Codes as of 6/22/12 (excerpt of a flattened table with columns: Program Code, Description, ON TRANS?, AFFECT BILL?): @@ = DELETE A POWER OR ACSK 030; DM = REQUIRES MATH; DP = REQUIRES PHY SCI; DR = REQUIRES READING LOWER; DW = NEED ENGL 005 OR 015 OR ACSK ...

  15. Independent Lens Online Shorts Festival

    NSDL National Science Digital Library

    Since its creation a few years ago, the Independent Lens series has worked with various filmmakers and producers to create thoughtful portraits. These portraits have included subjects such as the life of Billy Strayhorn, people living with dystonia, and the world of Ethiopian coffee growers. Recently, they also embarked on yet another ambitious project: an online shorts festival. Visitors to this site can partake of all ten of these films at their leisure. Included are a film that explores a Parisian secret from 1951, a meditation on growing old, and an artist who created a monument out of mud, old paint, and adobe. After viewing the films, visitors are also welcome to leave their comments in the "Talkback" section, submit a film, or find out more about the members of the jury for this online film festival.

  16. Independent Lens: A Lion's Trail

    NSDL National Science Digital Library

    The road to creating a popular song can take decades and often includes a number of incarnations before the listening public finally becomes interested. Such is the complex and at times painful story of the song "Mbube" (which is perhaps best known in the United States by the version titled "The Lion Sleeps Tonight"), which was first recorded by Solomon Linda and the Evening Birds in 1939 in South Africa. This compelling website, designed to complement an Independent Lens/PBS documentary, provides substantive background into the stories of the people associated with this song, and its rather nuanced history during the past seven decades. On the site, visitors can learn about the filmmakers, the song itself, and also provide their own feedback on the controversy surrounding the song and the documentary itself.

  17. ESA Intermediate Experimental Vehicle. Independent Aerothermodynamic Characterization and Aerodatabase Development

    NASA Astrophysics Data System (ADS)

    Rufolo, G.; Di Benedetto, S.; Walpot, L.; Roncioni, P.; Marini, M.

    2011-08-01

    In the framework of the Intermediate eXperimental Vehicle (IXV) project, the European Space Agency (ESA) is coordinating a series of technical assistance activities aimed at verifying and supporting the IXV industrial design and development process. The technical assistance is operated with the support of the Italian Space Agency (ASI), by means of the Italian Aerospace Research Center (CIRA), and the European Space Research and Technology Centre (ESTEC), under the supervision and coordination of the ESA IXV team. One of the purposes of the activity is to develop an independent capability for the assessment and verification of the industrial results with respect to the aerothermodynamic characterization of the IXV vehicle. To this aim, CIRA is developing an independent AeroThermodynamics DataBase (ATDB), intended as a tool that outputs the time histories of local quantities (heat flux, pressure, skin friction) for each point of the IXV vehicle and for each trajectory (in a pre-defined envelope), together with an uncertainties model. The reference Computational Fluid Dynamics (CFD) solutions needed for the development of the tool have been provided by ESA-ESTEC (with the CFD code LORE) and CIRA (with the CFD code H3NS).
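
    In spirit, such a database maps a body point plus an instantaneous flight condition to local surface quantities by interpolating among precomputed CFD anchor solutions. A deliberately simplified sketch of that lookup (the anchor Mach numbers, heat-flux values, body points, and interpolation in Mach only are all invented for illustration and are not the actual ATDB construction):

      import numpy as np

      # Hypothetical anchor CFD solutions: heat flux (kW/m^2) at two body points,
      # precomputed at a few Mach numbers along the reentry corridor.
      anchor_mach = np.array([10.0, 15.0, 20.0, 25.0])
      heat_flux = {
          "nose": np.array([150.0, 320.0, 560.0, 850.0]),
          "flap": np.array([60.0, 140.0, 260.0, 410.0]),
      }

      def local_heat_flux(point, mach):
          """Interpolate the anchor solutions at the requested Mach number."""
          return np.interp(mach, anchor_mach, heat_flux[point])

      # Time history along a made-up trajectory: Mach decreasing during reentry.
      mach_history = np.linspace(25.0, 10.0, 7)
      for point in heat_flux:
          q_history = [local_heat_flux(point, m) for m in mach_history]
          print(point, [round(q) for q in q_history])

    The real tool adds pressure and skin friction, a full flight-condition parameterization, and an uncertainty model on top of this basic point-plus-trajectory lookup.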

  18. Speeding Up Cosmological Boltzmann Codes

    E-print Network

    Michael Doran

    2005-09-05

    We introduce a novel strategy for cosmological Boltzmann codes leading to an increase in speed by a factor of ~30 for small-scale Fourier modes. We (re-)investigate the tight coupling approximation and obtain analytic formulae reaching up to the octupoles of photon intensity and polarization. This leads to accurate results reaching optimal precision, while still being simple. Damping rapid oscillations of small-scale modes at later times, we simplify the integration of cosmological perturbations. We obtain analytic expressions for the photon density contrast and velocity as well as an estimate of the quadrupole from after last scattering until today. These analytic formulae hold well during re-ionization, and the corresponding corrections are in fact negligible for realistic cosmological scenarios; however, they do extend the validity of our approach to models with very large optical depth to the last scattering surface.

  19. Two-dimensional aperture coding for magnetic sector mass spectrometry.

    PubMed

    Russell, Zachary E; Chen, Evan X; Amsden, Jason J; Wolter, Scott D; Danell, Ryan M; Parker, Charles B; Stoner, Brian R; Gehm, Michael E; Brady, David J; Glass, Jeffrey T

    2015-02-01

    In mass spectrometer design, there has been a historic belief that there exists a fundamental trade-off between instrument size, throughput, and resolution. When miniaturizing a traditional system, performance loss in either resolution or throughput would be expected. However, in optical spectroscopy, both one-dimensional (1D) and two-dimensional (2D) aperture coding have been used for many years to break a similar trade-off. To provide a viable path to miniaturization for harsh environment field applications, we are investigating similar concepts in sector mass spectrometry. Recently, we demonstrated the viability of 1D aperture coding and here we provide a first investigation of 2D coding. In coded optical spectroscopy, 2D coding is preferred because of increased measurement diversity for improved conditioning and robustness of the result. To investigate its viability in mass spectrometry, analytes of argon, acetone, and ethanol were detected using a custom 90-degree magnetic sector mass spectrometer incorporating 2D coded apertures. We developed a mathematical forward model and reconstruction algorithm to successfully reconstruct the mass spectra from the 2D spatially coded ion positions. This 2D coding enabled a 3.5× throughput increase with minimal decrease in resolution. Several challenges were overcome in the mass spectrometer design to enable this coding, including the need for large uniform ion flux, a wide gap magnetic sector that maintains field uniformity, and a high resolution 2D detection system for ion imaging. Furthermore, micro-fabricated 2D coded apertures incorporating support structures were developed to provide a viable design that allowed ion transmission through the open elements of the code. PMID:25510933
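
    The forward model and reconstruction have the same structure as any coded-aperture measurement: the detector image is, to first order, a linear operator applied to the spectrum, and the spectrum is recovered by inverting that operator. A one-dimensional analogue (the code pattern, toy spectrum, and plain least-squares inversion below are illustrative, not the authors' 2D forward model):

      import numpy as np

      rng = np.random.default_rng(3)
      n_mass, n_det = 40, 80

      # Pseudo-random open/closed aperture pattern (1 = open element).
      code = rng.integers(0, 2, 12)
      code[0] = 1                               # keep the pattern non-degenerate

      # Forward operator A: each mass channel projects a shifted copy of the
      # code onto the detector, so the measurement is y = A @ x (plus noise).
      A = np.zeros((n_det, n_mass))
      for j in range(n_mass):
          A[j:j + len(code), j] = code

      x_true = np.zeros(n_mass)
      x_true[[8, 19, 27]] = [1.0, 0.6, 0.3]     # three analyte peaks
      y = A @ x_true + 0.01 * rng.standard_normal(n_det)

      # Recover the spectrum by least squares (the forward model is linear).
      x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
      print(np.round(x_hat[[8, 19, 27]], 2))    # approximately [1.0, 0.6, 0.3]

    The throughput gain comes from the many open elements collecting ions simultaneously, while the reconstruction step undoes the deliberate mixing introduced by the code.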

  20. Two-Dimensional Aperture Coding for Magnetic Sector Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Russell, Zachary E.; Chen, Evan X.; Amsden, Jason J.; Wolter, Scott D.; Danell, Ryan M.; Parker, Charles B.; Stoner, Brian R.; Gehm, Michael E.; Brady, David J.; Glass, Jeffrey T.

    2015-02-01

    In mass spectrometer design, there has been a historic belief that there exists a fundamental trade-off between instrument size, throughput, and resolution. When miniaturizing a traditional system, performance loss in either resolution or throughput would be expected. However, in optical spectroscopy, both one-dimensional (1D) and two-dimensional (2D) aperture coding have been used for many years to break a similar trade-off. To provide a viable path to miniaturization for harsh environment field applications, we are investigating similar concepts in sector mass spectrometry. Recently, we demonstrated the viability of 1D aperture coding and here we provide a first investigation of 2D coding. In coded optical spectroscopy, 2D coding is preferred because of increased measurement diversity for improved conditioning and robustness of the result. To investigate its viability in mass spectrometry, analytes of argon, acetone, and ethanol were detected using a custom 90-degree magnetic sector mass spectrometer incorporating 2D coded apertures. We developed a mathematical forward model and reconstruction algorithm to successfully reconstruct the mass spectra from the 2D spatially coded ion positions. This 2D coding enabled a 3.5× throughput increase with minimal decrease in resolution. Several challenges were overcome in the mass spectrometer design to enable this coding, including the need for large uniform ion flux, a wide gap magnetic sector that maintains field uniformity, and a high resolution 2D detection system for ion imaging. Furthermore, micro-fabricated 2D coded apertures incorporating support structures were developed to provide a viable design that allowed ion transmission through the open elements of the code.