Science.gov

Sample records for investigators independently coded

  1. The design of relatively machine-independent code generators

    NASA Technical Reports Server (NTRS)

    Noonan, R. E.

    1979-01-01

    Two complementary approaches were investigated. In the first approach, software design techniques were used to design the structure of a code generator for Halmat. The major result was the development of an intermediate code form known as 7UP. The second approach viewed the problem as one of providing a tool to the code generator programmer. The major result was the development of a non-procedural, problem-oriented language known as CGGL (Code Generator Generator Language).

  2. An Investigation of Different String Coding Methods.

    ERIC Educational Resources Information Center

    Goyal, Pankaj

    1984-01-01

    Investigates techniques for automatic coding of English language strings which involve titles drawn from bibliographic files, but do not require prior knowledge of the source. Coding methods (basic, maximum entropy principle), results of a test using 6,260 titles from the British National Bibliography, and variations in code element ordering are…

  3. Independent rate and temporal coding in hippocampal pyramidal cells

    PubMed Central

    Huxter, John; Burgess, Neil; O’Keefe, John

    2009-01-01

    Hippocampal pyramidal cells use temporal [1] as well as rate coding [2] to signal spatial aspects of the animal’s environment or behaviour. The temporal code takes the form of a phase relationship to the concurrent cycle of the hippocampal EEG theta rhythm (Figure 1; [1]). These two codes could each represent a different variable [3,4]. However, this requires that rate and phase can vary independently, in contrast to recent suggestions [5,6] that they are tightly coupled, both reflecting the amplitude of the cell’s input. Here we show that the time of firing and firing rate are dissociable and can represent two independent variables, viz. the animal’s location within the place field and its speed of movement through the field, respectively. Independent encoding of location together with actions and stimuli occurring there may help to explain the dual roles of the hippocampus in spatial and episodic memory [7,8], or a more general role in relational/declarative memory [9,10]. PMID:14574410
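
    The dissociation argument can be made concrete with a toy simulation: if firing phase tracks position and firing rate tracks speed, each measure correlates with its own variable but not with the other. The sketch below is purely illustrative; all relations, noise levels, and numbers are invented, not taken from the recordings.

      import numpy as np

      rng = np.random.default_rng(0)
      n_runs = 200
      pos = rng.uniform(0.0, 1.0, n_runs)    # normalized position in the place field
      speed = rng.uniform(0.1, 1.0, n_runs)  # normalized running speed, independent of pos

      # Toy encoding: phase advances with position (phase precession), rate grows
      # with speed; the two encoded variables are independent by construction.
      phase = 2 * np.pi * (1.0 - pos) + rng.normal(0, 0.3, n_runs)   # radians
      rate = 20.0 * speed + rng.normal(0, 1.0, n_runs)               # spikes/s

      print(np.corrcoef(phase, pos)[0, 1])   # strongly negative: phase codes position
      print(np.corrcoef(rate, speed)[0, 1])  # strongly positive: rate codes speed
      print(np.corrcoef(phase, rate)[0, 1])  # near zero: the two codes dissociate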

  4. Benchmark testing and independent verification of the VS2DT computer code

    SciTech Connect

    McCord, J.T.; Goodrich, M.T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation.

  5. Investigation of Near Shannon Limit Coding Schemes

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Kim, J.; Mo, Fan

    1999-01-01

    Turbo codes can deliver performance that is very close to the Shannon limit. This report investigates algorithms for convolutional turbo codes and block turbo codes; both coding schemes can achieve performance near the Shannon limit. The performance of the schemes is obtained using computer simulations. There are three sections in this report. The first section is the introduction, which covers the fundamentals of coding, block coding, and convolutional coding. In the second section, the basic concepts of convolutional turbo codes are introduced and the performance of turbo codes, especially high-rate turbo codes, is obtained from simulation results. After introducing the parameters that help turbo codes achieve such good performance, it is concluded that the output weight distribution should be the main consideration in designing turbo codes. Based on the output weight distribution, performance bounds for turbo codes are given. Then, the relationships between the output weight distribution and factors such as the generator polynomial, the interleaver, and the puncturing pattern are examined, and a criterion for the best selection of system components is provided. The puncturing pattern algorithm is discussed in detail, and different puncturing patterns are compared for each high rate. For most of the high-rate codes, the puncturing pattern does not show any significant effect on code performance if a pseudo-random interleaver is used in the system. For some special-rate codes with poor performance, an alternative puncturing algorithm is designed which restores their performance close to the Shannon limit. Finally, in section three, for iterative decoding of block codes, the method of building a trellis for block codes, the structure of the iterative decoding system, and the calculation of extrinsic values are discussed.
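
    As a concrete illustration of the puncturing step discussed above, the following sketch deletes parity bits from a generic rate-1/2 mother code according to a tiled pattern. The pattern and bit values are textbook placeholders, not the report's designs.

      import numpy as np

      def puncture(coded_bits, pattern):
          """Keep only positions where the puncturing pattern (tiled) is 1."""
          mask = np.resize(np.asarray(pattern, dtype=bool), coded_bits.size)
          return coded_bits[mask]

      # Rate-1/2 mother code output: one (systematic, parity) pair per input bit.
      coded = np.array([1, 0, 0, 1, 1, 1, 0, 0, 1, 0, 0, 1])  # 6 input bits -> 12 bits

      # Pattern [1, 1, 1, 0] spans two input bits and deletes every second parity
      # bit, so 2 input bits map to 3 transmitted bits, i.e. rate 2/3.
      sent = puncture(coded, [1, 1, 1, 0])
      print(sent, "rate =", 6 / sent.size)   # 9 bits kept -> rate 0.667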

  6. Independent accident investigation: a modern safety tool.

    PubMed

    Stoop, John A

    2004-07-26

    Historically, safety has been subject to a fragmented approach. In the past, every department had its own responsibility towards safety, focusing on working conditions, internal safety, external safety, rescue and emergency, public order, or security. Each issued policy documents, which in their time were leading statements for elaboration and regulation, and each addressed safety issues with tools of various kinds, often developed specifically within its own domain. Due to a series of major accidents and disasters, the focus of attention is shifting from complying with quantitative risk standards towards intervention in primary operational processes, coping with systemic deficiencies, and a more integrated assessment of safety in its societal context. In The Netherlands, recognition of the importance of independent investigations has led to an expansion of this philosophy from the transport sector to other sectors. The philosophy now covers transport, industry, defense, natural disaster, environment and health, and other major occurrences such as explosions, fires, and collapse of buildings or structures. In 2003 a law covering multiple sectors will establish an independent safety board in The Netherlands. At a European level, mandatory investigation agencies are recognized as indispensable safety instruments for aviation, railways and the maritime sector, for which EU Directives are in place or in progress [Transport accident and incident investigation in the European Union, European Transport Safety Council, ISBN 90-76024-10-3, Brussel, 2001]. Due to a series of major events, attention has been drawn to the consequences of disasters, highlighting the involvement of rescue and emergency services. These services have also become subject to investigative efforts, which, in turn, places demands on investigation methodology. This paper comments on an evolutionary development in safety thinking and of safety boards, highlighting some consequences for strategic perspectives in the further development of independent accident investigation. PMID:15231346

  7. Error minimization and coding triplet/binding site associations are independent features of the canonical genetic code.

    PubMed

    Caporaso, J Gregory; Yarus, Michael; Knight, Rob

    2005-11-01

    The canonical genetic code has been reported both to be error minimizing and to show stereochemical associations between coding triplets and binding sites. In order to test whether these two properties are unexpectedly overlapping, we generated 200,000 randomized genetic codes using each of five randomization schemes, with and without randomization of stop codons. Comparison of the code error (difference in polar requirement for single-nucleotide codon interchanges) with the coding triplet concentrations in RNA binding sites for eight amino acids shows that these properties are independent and uncorrelated. Thus, one is not the result of the other, and error minimization and triplet associations probably arose independently during the history of the genetic code. We explicitly show that prior fixation of a stereochemical core is consistent with an effective later minimization of error. PMID:16211428
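
    The randomization logic is straightforward to sketch: permute amino-acid assignments among the canonical code's synonymous codon blocks and score each code by the mean squared change in polar requirement over all single-nucleotide interchanges. The sketch below uses approximate Woese polar-requirement values and one simplified randomization scheme; the paper's five schemes and its stop-codon handling differ in detail.

      import itertools, random

      BASES = "UCAG"
      AA_BY_CODON = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
      codon_table = {"".join(b): aa for b, aa in
                     zip(itertools.product(BASES, repeat=3), AA_BY_CODON)}

      # Approximate Woese polar-requirement values (illustrative; the paper's
      # exact scale may differ slightly).
      polar = {"A": 7.0, "R": 9.1, "N": 10.0, "D": 13.0, "C": 4.8, "Q": 8.6,
               "E": 12.5, "G": 7.9, "H": 8.4, "I": 4.9, "L": 4.9, "K": 10.1,
               "M": 5.3, "F": 5.0, "P": 6.6, "S": 7.5, "T": 6.6, "W": 5.2,
               "Y": 5.4, "V": 5.6}

      def code_error(table):
          """Mean squared polar-requirement change over single-base interchanges."""
          total, n = 0.0, 0
          for codon, aa in table.items():
              if aa == "*":
                  continue
              for pos in range(3):
                  for alt in BASES:
                      if alt == codon[pos]:
                          continue
                      aa2 = table[codon[:pos] + alt + codon[pos + 1:]]
                      if aa2 == "*":
                          continue
                      total += (polar[aa] - polar[aa2]) ** 2
                      n += 1
          return total / n

      def randomized_code(table):
          """One randomization scheme: permute amino acids among codon blocks."""
          aas = sorted(set(table.values()) - {"*"})
          perm = dict(zip(aas, random.sample(aas, len(aas))))
          return {c: aa if aa == "*" else perm[aa] for c, aa in table.items()}

      random.seed(1)
      canonical = code_error(codon_table)
      better = sum(code_error(randomized_code(codon_table)) < canonical
                   for _ in range(2000))
      print(f"canonical error {canonical:.2f}; {better}/2000 random codes beat it")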

  8. Implementation of context independent code on a new array processor: The Super-65

    NASA Technical Reports Server (NTRS)

    Colbert, R. O.; Bowhill, S. A.

    1981-01-01

    The feasibility of rewriting standard uniprocessor programs into code which contains no context-dependent branches is explored. Context independent code (CIC) would contain no branches that might require different processing elements to branch different ways. In order to investigate the possibilities and restrictions of CIC, several programs were recoded into CIC and a four-element array processor was built. This processor (the Super-65) consisted of three 6502 microprocessors and the Apple II microcomputer. The results obtained were somewhat dependent upon the specific architecture of the Super-65, but, within bounds, the throughput of the array processor was found to increase linearly with the number of processing elements (PEs). The slope of throughput versus PEs is highly dependent on the program and varied from 0.33 to 1.00 for the sample programs.

  9. Two independent transcription initiation codes overlap on vertebrate core promoters

    NASA Astrophysics Data System (ADS)

    Haberle, Vanja; Li, Nan; Hadzhiev, Yavor; Plessy, Charles; Previti, Christopher; Nepal, Chirag; Gehrig, Jochen; Dong, Xianjun; Akalin, Altuna; Suzuki, Ana Maria; van Ijcken, Wilfred F. J.; Armant, Olivier; Ferg, Marco; Strähle, Uwe; Carninci, Piero; Müller, Ferenc; Lenhard, Boris

    2014-03-01

    A core promoter is a stretch of DNA surrounding the transcription start site (TSS) that integrates regulatory inputs and recruits general transcription factors to initiate transcription. The nature and causative relationship of the DNA sequence and chromatin signals that govern the selection of most TSSs by RNA polymerase II remain unresolved. Maternal to zygotic transition represents the most marked change of the transcriptome repertoire in the vertebrate life cycle. Early embryonic development in zebrafish is characterized by a series of transcriptionally silent cell cycles regulated by inherited maternal gene products: zygotic genome activation commences at the tenth cell cycle, marking the mid-blastula transition. This transition provides a unique opportunity to study the rules of TSS selection and the hierarchy of events linking transcription initiation with key chromatin modifications. We analysed TSS usage during zebrafish early embryonic development at high resolution using cap analysis of gene expression, and determined the positions of H3K4me3-marked promoter-associated nucleosomes. Here we show that the transition from the maternal to zygotic transcriptome is characterized by a switch between two fundamentally different modes of defining transcription initiation, which drive the dynamic change of TSS usage and promoter shape. A maternal-specific TSS selection, which requires an A/T-rich (W-box) motif, is replaced with a zygotic TSS selection grammar characterized by broader patterns of dinucleotide enrichments, precisely aligned with the first downstream (+1) nucleosome. The developmental dynamics of the H3K4me3-marked nucleosomes reveal their DNA-sequence-associated positioning at promoters before zygotic transcription and subsequent transcription-independent adjustment to the final position downstream of the zygotic TSS. The two TSS-defining grammars coexist, often physically overlapping, in core promoters of constitutively expressed genes to enable their expression in the two regulatory environments. The dissection of overlapping core promoter determinants represents a framework for future studies of promoter structure and function across different regulatory contexts.

  10. Two independent transcription initiation codes overlap on vertebrate core promoters.

    PubMed

    Haberle, Vanja; Li, Nan; Hadzhiev, Yavor; Plessy, Charles; Previti, Christopher; Nepal, Chirag; Gehrig, Jochen; Dong, Xianjun; Akalin, Altuna; Suzuki, Ana Maria; van IJcken, Wilfred F J; Armant, Olivier; Ferg, Marco; Strähle, Uwe; Carninci, Piero; Müller, Ferenc; Lenhard, Boris

    2014-03-20

    A core promoter is a stretch of DNA surrounding the transcription start site (TSS) that integrates regulatory inputs and recruits general transcription factors to initiate transcription. The nature and causative relationship of the DNA sequence and chromatin signals that govern the selection of most TSSs by RNA polymerase II remain unresolved. Maternal to zygotic transition represents the most marked change of the transcriptome repertoire in the vertebrate life cycle. Early embryonic development in zebrafish is characterized by a series of transcriptionally silent cell cycles regulated by inherited maternal gene products: zygotic genome activation commences at the tenth cell cycle, marking the mid-blastula transition. This transition provides a unique opportunity to study the rules of TSS selection and the hierarchy of events linking transcription initiation with key chromatin modifications. We analysed TSS usage during zebrafish early embryonic development at high resolution using cap analysis of gene expression, and determined the positions of H3K4me3-marked promoter-associated nucleosomes. Here we show that the transition from the maternal to zygotic transcriptome is characterized by a switch between two fundamentally different modes of defining transcription initiation, which drive the dynamic change of TSS usage and promoter shape. A maternal-specific TSS selection, which requires an A/T-rich (W-box) motif, is replaced with a zygotic TSS selection grammar characterized by broader patterns of dinucleotide enrichments, precisely aligned with the first downstream (+1) nucleosome. The developmental dynamics of the H3K4me3-marked nucleosomes reveal their DNA-sequence-associated positioning at promoters before zygotic transcription and subsequent transcription-independent adjustment to the final position downstream of the zygotic TSS. The two TSS-defining grammars coexist, often physically overlapping, in core promoters of constitutively expressed genes to enable their expression in the two regulatory environments. The dissection of overlapping core promoter determinants represents a framework for future studies of promoter structure and function across different regulatory contexts. PMID:24531765

  11. Independent verification and benchmark testing of the UNSAT-H computer code, Version 2.0

    SciTech Connect

    Baca, R.G.; Magnuson, S.O.

    1990-02-01

    Independent testing of the UNSAT-H computer code, Version 2.0, was conducted to establish confidence that the code is ready for general use in performance assessment applications. Verification and benchmark test problems were used to check the correctness of the FORTRAN coding, the computational efficiency and accuracy of the numerical algorithm, and the code's capability to simulate diverse hydrologic conditions. This testing was performed using a structured and quantitative evaluation protocol consisting of blind testing, independent applications, maintenance of test equivalence, and use of graduated test cases. Graphical comparisons and calculation of relative root mean square (RRMS) values were used as indicators of accuracy and consistency; four specific ranges of RRMS values were chosen for judging the quality of the comparisons. Four verification test problems were used to check the computational accuracy of UNSAT-H in solving the uncoupled fluid flow and heat transport equations. Five benchmark test problems, ranging in complexity, were used to check the code's simulation capability; some of the benchmark test cases include comparisons with laboratory and field data. The primary finding of this independent testing is that UNSAT-H is fully operational. In general, the test results showed that the code produced unsaturated flow simulations with excellent stability, reasonable accuracy, and acceptable speed. This report describes the technical basis, approach, and results of the independent testing. A number of future refinements to the UNSAT-H code are recommended to improve computational speed and accuracy, code usability, and code portability. Aspects of the code that warrant further testing are outlined.
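
    A minimal sketch of the relative root mean square (RRMS) measure used above as the accuracy indicator, assuming the straightforward normalization by the reference values (the protocol's exact definition may differ):

      import numpy as np

      def rrms(simulated, reference):
          """Root mean square of the pointwise relative differences."""
          s = np.asarray(simulated, dtype=float)
          r = np.asarray(reference, dtype=float)
          return np.sqrt(np.mean(((s - r) / r) ** 2))

      # Code output vs. an analytical solution at matching points:
      print(rrms([0.101, 0.208, 0.305], [0.100, 0.200, 0.300]))   # ~0.025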

  12. Independent verification and benchmark testing of the UNSAT-H computer code, Version 2.0

    SciTech Connect

    Baca, R.G.; Magnuson, S.O.

    1990-02-01

    Independent testing of the UNSAT-H computer code, Version 2.0, was conducted to establish confidence that the code is ready for general use in performance assessment applications. Verification and benchmark test problems were used to check the correctness of the FORTRAN coding, the computational efficiency and accuracy of the numerical algorithm, and the code's capability to simulate diverse hydrologic conditions. This testing was performed using a structured and quantitative evaluation protocol consisting of blind testing, independent applications, maintenance of test equivalence, and use of graduated test cases. Graphical comparisons and calculation of relative root mean square (RRMS) values were used as indicators of accuracy and consistency; four specific ranges of RRMS values were chosen for judging the quality of the comparisons. Four verification test problems were used to check the computational accuracy of UNSAT-H in solving the uncoupled fluid flow and heat transport equations. Five benchmark test problems, ranging in complexity, were used to check the code's simulation capability; some of the benchmark test cases include comparisons with laboratory and field data. The primary finding of this independent testing is that UNSAT-H is fully operational. In general, the test results showed that the code produced unsaturated flow simulations with excellent stability, reasonable accuracy, and acceptable speed. This report describes the technical basis, approach, and results of the independent testing. A number of future refinements to the UNSAT-H code are recommended to improve computational speed and accuracy, code usability, and code portability. Aspects of the code that warrant further testing are outlined.

  13. RBMK coupled neutronics/thermal-hydraulics analyses by two independent code systems

    SciTech Connect

    Parisi, C.; D'Auria, F.; Malofeev, V.; Ivanov, B.; Ivanov, K.

    2006-07-01

    This paper presents the coupled neutronics/thermal-hydraulics activities carried out in the framework of part B of the TACIS project R2.03/97, 'Software development for accident analysis of RBMK reactors in Russia'. Two independent code systems were assembled, one from the Russian side and the other from the Western side, for studying RBMK core transients. The Russian code system relies on the UNK code for neutron data library generation and on the three-dimensional coupled neutron-kinetics/thermal-hydraulics codes BARS-KORSAR for plant transient analyses. The Western code system is instead based on the lattice physics code HELIOS and on the RELAP5-3D code. Several activities were performed to test the code systems' capabilities: the neutron data libraries were calculated and verified by precise Monte Carlo calculations, the coupled codes' steady-state results were compared with plant detector data, and calculations of several transients were compared. Finally, both code systems proved to have all the capabilities needed for reliable safety analyses of RBMK reactors. (authors)

  14. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1993-01-01

    The first year's effort on NASA Grant NAG5-2006 was an investigation to characterize typical errors resulting from the EOS downlink. The analysis methods developed for this effort were used on test data from a March 1992 White Sands Terminal Test. The effectiveness of a concatenated coding scheme with a Reed-Solomon outer code and a convolutional inner code versus a Reed-Solomon-only code scheme was investigated, as well as the effectiveness of a Periodic Convolutional Interleaver in dispersing errors of certain types. The work effort consisted of development of software that allows simulation studies with the appropriate coding schemes plus either simulated data with errors or actual data with errors. The software program is entitled Communication Link Error ANalysis (CLEAN) and models downlink errors, forward error correcting schemes, and interleavers.

  15. Signal-independent timescale analysis (SITA) and its application for neural coding during reaching and walking

    PubMed Central

    Zacksenhouse, Miriam; Lebedev, Mikhail A.; Nicolelis, Miguel A. L.

    2014-01-01

    What are the relevant timescales of neural encoding in the brain? This question is commonly investigated with respect to well-defined stimuli or actions. However, neurons often encode multiple signals, including hidden or internal, which are not experimentally controlled, and thus excluded from such analysis. Here we consider all rate modulations as the signal, and define the rate-modulations signal-to-noise ratio (RM-SNR) as the ratio between the variance of the rate and the variance of the neuronal noise. As the bin-width increases, RM-SNR increases while the update rate decreases. This tradeoff is captured by the ratio of RM-SNR to bin-width, and its variations with the bin-width reveal the timescales of neural activity. Theoretical analysis and simulations elucidate how the interactions between the recovery properties of the unit and the spectral content of the encoded signals shape this ratio and determine the timescales of neural coding. The resulting signal-independent timescale analysis (SITA) is applied to investigate timescales of neural activity recorded from the motor cortex of monkeys during: (i) reaching experiments with Brain-Machine Interface (BMI), and (ii) locomotion experiments at different speeds. Interestingly, the timescales during BMI experiments did not change significantly with the control mode or training. During locomotion, the analysis identified units whose timescale varied consistently with the experimentally controlled speed of walking, though the specific timescale reflected also the recovery properties of the unit. Thus, the proposed method, SITA, characterizes the timescales of neural encoding and how they are affected by the motor task, while accounting for all rate modulations. PMID:25191263
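
    The RM-SNR/bin-width tradeoff can be sketched on a synthetic Poisson unit: as the bin widens, the rate-modulation variance stands out against shrinking counting noise, and the ratio of RM-SNR to bin width stays flat until the bin exceeds the signal's timescale. The estimator below is a simplification of SITA with invented parameters, not the paper's method.

      import numpy as np

      rng = np.random.default_rng(0)
      dt, T = 0.001, 600.0                              # 1 ms grid, 600 s of data
      t = np.arange(0.0, T, dt)
      rate = 10.0 + 5.0 * np.sin(2 * np.pi * 0.5 * t)   # 0.5 Hz rate modulation, Hz
      spikes = rng.poisson(rate * dt)                   # Poisson counts per 1 ms

      for width in [0.01, 0.05, 0.1, 0.5, 1.0]:         # candidate bin widths, s
          n = int(width / dt)
          counts = spikes[: spikes.size // n * n].reshape(-1, n).sum(axis=1)
          est_rate = counts / width
          noise_var = counts.mean() / width**2          # Poisson noise in the rate estimate
          rm_var = max(est_rate.var() - noise_var, 0.0) # rate-modulation variance
          snr = rm_var / noise_var
          print(f"width {width:5.2f} s  RM-SNR {snr:7.3f}  RM-SNR/width {snr / width:6.2f}")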

  16. Hundreds of conserved non-coding genomic regions are independently lost in mammals

    PubMed Central

    Hiller, Michael; Schaar, Bruce T.; Bejerano, Gill

    2012-01-01

    Conserved non-protein-coding DNA elements (CNEs) often encode cis-regulatory elements and are rarely lost during evolution. However, CNE losses that do occur can be associated with phenotypic changes, exemplified by pelvic spine loss in sticklebacks. Using a computational strategy to detect complete loss of CNEs in mammalian genomes while strictly controlling for artifacts, we find >600 CNEs that are independently lost in at least two mammalian lineages, including a spinal cord enhancer near GDF11. We observed several genomic regions where multiple independent CNE loss events happened; the most extreme is the DIAPH2 locus. We show that CNE losses often involve deletions and that CNE loss frequencies are non-uniform. Similar to less pleiotropic enhancers, we find that independently lost CNEs are shorter, slightly less constrained and evolutionarily younger than CNEs without detected losses. This suggests that independently lost CNEs are less pleiotropic and that pleiotropic constraints contribute to non-uniform CNE loss frequencies. We also detected 35 CNEs that are independently lost in the human lineage and in other mammals. Our study uncovers an interesting aspect of the evolution of functional DNA in mammalian genomes. Experiments are necessary to test if these independently lost CNEs are associated with parallel phenotype changes in mammals. PMID:23042682

  17. Investigations into resting-state connectivity using independent component analysis

    PubMed Central

    Beckmann, Christian F; DeLuca, Marilena; Devlin, Joseph T; Smith, Stephen M

    2005-01-01

    Inferring resting-state connectivity patterns from functional magnetic resonance imaging (fMRI) data is a challenging task for any analytical technique. In this paper, we review a probabilistic independent component analysis (PICA) approach, optimized for the analysis of fMRI data, and discuss the role which this exploratory technique can take in scientific investigations into the structure of these effects. We apply PICA to fMRI data acquired at rest, in order to characterize the spatio-temporal structure of such data, and demonstrate that this is an effective and robust tool for the identification of low-frequency resting-state patterns from data acquired at various different spatial and temporal resolutions. We show that these networks exhibit high spatial consistency across subjects and closely resemble discrete cortical functional networks such as visual cortical areas or sensory–motor cortex. PMID:16087444

  18. Proof-of-principle experiment of reference-frame-independent quantum key distribution with phase coding

    PubMed Central

    Liang, Wen-Ye; Wang, Shuang; Li, Hong-Wei; Yin, Zhen-Qiang; Chen, Wei; Yao, Yao; Huang, Jing-Zheng; Guo, Guang-Can; Han, Zheng-Fu

    2014-01-01

    We have demonstrated a proof-of-principle experiment of reference-frame-independent phase coding quantum key distribution (RFI-QKD) over an 80-km optical fiber. After considering the finite-key bound, we still achieve a distance of 50 km. In this scenario, the phases of the basis states are related by a slowly time-varying transformation. Furthermore, we developed and realized a new decoy state method for RFI-QKD systems with weak coherent sources to counteract the photon-number-splitting attack. With the help of a reference-frame-independent protocol and a Michelson interferometer with Faraday rotator mirrors, our system is rendered immune to the slow phase changes of the interferometer and the polarization disturbances of the channel, making the procedure very robust. PMID:24402550

  19. Board Governance of Independent Schools: A Framework for Investigation

    ERIC Educational Resources Information Center

    McCormick, John; Barnett, Kerry; Alavi, Seyyed Babak; Newcombe, Geoffrey

    2006-01-01

    Purpose: This paper develops a theoretical framework to guide future inquiry into board governance of independent schools. Design/methodology/approach: The authors' approach is to integrate literatures related to corporate and educational boards, motivation, leadership and group processes that are appropriate for conceptualizing independent school…

  20. RELAP5/MOD3 code manual: Summaries and reviews of independent code assessment reports. Volume 7, Revision 1

    SciTech Connect

    Moore, R.L.; Sloan, S.M.; Schultz, R.R.; Wilson, G.E.

    1996-10-01

    Summaries of RELAP5/MOD3 code assessments, a listing of the assessment matrix, and a chronology of the various versions of the code are given. Results from these code assessments have been used to formulate a compilation of some of the strengths and weaknesses of the code. These results are documented in the report. Volume 7 was designed to be updated periodically and to include the results of the latest code assessments as they become available. Consequently, users of Volume 7 should ensure that they have the latest revision available.

  1. The investigation of bandwidth efficient coding and modulation techniques

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The New Mexico State University Center for Space Telemetering and Telecommunications Systems has been, and is currently, engaged in the investigation of trellis-coded modulation (TCM) communication systems. In particular, TCM utilizing M-ary phase shift keying (MPSK) is being studied. The study of carrier synchronization in a TCM environment, or in MPSK systems in general, has been one of the two main thrusts of this grant. This study has involved both theoretical modelling and software simulation of the carrier synchronization problem.

  2. Independent assessment of TRAC and RELAP5 codes through separate effects tests

    SciTech Connect

    Saha, P.; Rohatgi, U.S.; Jo, J.H.; Neymotin, L.; Slovik, G.; Yuelys-Miksis, C.; Pu, J.

    1983-01-01

    Independent assessment of TRAC-PF1 (Version 7.0), TRAC-BD1 (Version 12.0) and RELAP5/MOD1 (Cycle 14) that was initiated at BNL in FY 1982, has been completed in FY 1983. As in the previous years, emphasis at Brookhaven has been in simulating various separate-effects tests with these advanced codes and identifying the areas where further thermal-hydraulic modeling improvements are needed. The following six categories of tests were simulated with the above codes: (1) critical flow tests (Moby-Dick nitrogen-water, BNL flashing flow, Marviken Test 24); (2) Counter-Current Flow Limiting (CCFL) tests (University of Houston, Dartmouth College single and parallel tube test); (3) level swell tests (G.E. large vessel test); (4) steam generator tests (B and W 19-tube model S.G. tests, FLECHT-SEASET U-tube S.G. tests); (5) natural circulation tests (FRIGG loop tests); and (6) post-CHF tests (Oak Ridge steady-state test).

  3. A 2.9 ps equivalent resolution interpolating time counter based on multiple independent coding lines

    NASA Astrophysics Data System (ADS)

    Szplet, R.; Jachna, Z.; Kwiatkowski, P.; Rozyc, K.

    2013-03-01

    We present the design, operation and test results of a time counter that has an equivalent resolution of 2.9 ps, a measurement uncertainty at the level of 6 ps, and a measurement range of 10 s. The time counter has been implemented in a general-purpose reprogrammable device Spartan-6 (Xilinx). To obtain both high precision and wide measurement range the counting of periods of a reference clock is combined with a two-stage interpolation within a single period of the clock signal. The interpolation involves a four-phase clock in the first interpolation stage (FIS) and an equivalent coding line (ECL) in the second interpolation stage (SIS). The ECL is created as a compound of independent discrete time coding lines (TCL). The number of TCLs used to create the virtual ECL has an effect on its resolution. We tested ECLs made from up to 16 TCLs, but the idea may be extended to a larger number of lines. In the presented time counter the coarse resolution of the counting method, equal to 2 ns (the period of the 500 MHz reference clock), is first improved fourfold in the FIS and then even more than 400 times in the SIS. The proposed solution allows us to overcome the technological limitation in achievable resolution and improve the precision of conversion of integrated interpolators based on tapped delay lines.
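
    The arithmetic of the two-stage interpolation can be reconstructed as follows; the second-stage tap count here is an assumption chosen to reproduce the quoted 2.9 ps equivalent resolution, not a figure from the paper.

      T_clk = 2e-9                # reference clock period: 2 ns (500 MHz)
      fis_res = T_clk / 4         # four-phase clock -> 500 ps first-stage bins
      n_taps = 172                # assumed ECL tap count within one FIS bin
      sis_res = fis_res / n_taps  # ~2.9e-12 s, the quoted equivalent resolution
      print(sis_res)

      # A measured interval is assembled from the coarse clock count plus the
      # start/stop interpolator readings (standard interpolating-counter form).
      def interval(n_clk, start_fine, stop_fine):
          return n_clk * T_clk + start_fine - stop_fine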

  4. Independent code assessment at BNL in FY 1982. [TRAC-PF1; RELAP5/MOD1; TRAC-BD1]

    SciTech Connect

    Saha, P.; Rohatgi, U.S.; Jo, J.H.; Neymotin, L.; Slovik, G.; Yuelys-Miksis, C.

    1982-01-01

    Independent assessment of the advanced codes such as TRAC and RELAP5 has continued at BNL through the Fiscal Year 1982. The simulation tests can be grouped into the following five categories: critical flow, counter-current flow limiting (CCFL) or flooding, level swell, steam generator thermal performance, and natural circulation. TRAC-PF1 (Version 7.0) and RELAP5/MOD1 (Cycle 14) codes were assessed by simulating all of the above experiments, whereas the TRAC-BD1 (Version 12.0) code was applied only to the CCFL tests. Results and conclusions of the BNL code assessment activity of FY 1982 are summarized below.

  5. Enabling Handicapped Nonreaders to Independently Obtain Information: Initial Development of an Inexpensive Bar Code Reader System.

    ERIC Educational Resources Information Center

    VanBiervliet, Alan

    A project to develop and evaluate a bar code reader system as a self-directed information and instructional aid for handicapped nonreaders is described. The bar code technology involves passing a light-sensitive pen or laser over a printed code with bars which correspond to coded numbers. A system would consist of a portable device which could…

  6. Norepinephrine Modulates Coding of Complex Vocalizations in the Songbird Auditory Cortex Independent of Local Neuroestrogen Synthesis

    PubMed Central

    Ikeda, Maaya Z.; Jeon, Sung David; Cowell, Rosemary A.

    2015-01-01

    The catecholamine norepinephrine plays a significant role in auditory processing. Most studies to date have examined the effects of norepinephrine on the neuronal response to relatively simple stimuli, such as tones and calls. It is less clear how norepinephrine shapes the detection of complex syntactical sounds, as well as the coding properties of sensory neurons. Songbirds provide an opportunity to understand how auditory neurons encode complex, learned vocalizations, and the potential role of norepinephrine in modulating the neuronal computations for acoustic communication. Here, we infused norepinephrine into the zebra finch auditory cortex and performed extracellular recordings to study the modulation of song representations in single neurons. Consistent with its proposed role in enhancing signal detection, norepinephrine decreased spontaneous activity and firing during stimuli, yet it significantly enhanced the auditory signal-to-noise ratio. These effects were all mimicked by clonidine, an α-2 receptor agonist. Moreover, a pattern classifier analysis indicated that norepinephrine enhanced the ability of single neurons to accurately encode complex auditory stimuli. Because neuroestrogens are also known to enhance auditory processing in the songbird brain, we tested the hypothesis that norepinephrine actions depend on local estrogen synthesis. Neither norepinephrine nor adrenergic receptor antagonist infusion into the auditory cortex had detectable effects on local estradiol levels. Moreover, pretreatment with fadrozole, a specific aromatase inhibitor, did not block norepinephrine's neuromodulatory effects. Together, these findings indicate that norepinephrine enhances signal detection and information encoding for complex auditory stimuli by suppressing spontaneous “noise” activity and that these actions are independent of local neuroestrogen synthesis. PMID:26109659

  7. A coding-independent function of an alternative Ube3a transcript during neuronal development.

    PubMed

    Valluy, Jeremy; Bicker, Silvia; Aksoy-Aksel, Ayla; Lackinger, Martin; Sumer, Simon; Fiore, Roberto; Wüst, Tatjana; Seffer, Dominik; Metge, Franziska; Dieterich, Christoph; Wöhr, Markus; Schwarting, Rainer; Schratt, Gerhard

    2015-05-01

    The E3 ubiquitin ligase Ube3a is an important regulator of activity-dependent synapse development and plasticity. Ube3a mutations cause Angelman syndrome and have been associated with autism spectrum disorders (ASD). However, the biological significance of alternative Ube3a transcripts generated in mammalian neurons remains unknown. We report here that Ube3a1 RNA, a transcript that encodes a truncated Ube3a protein lacking catalytic activity, prevents exuberant dendrite growth and promotes spine maturation in rat hippocampal neurons. Surprisingly, Ube3a1 RNA function was independent of its coding sequence but instead required a unique 3' untranslated region and an intact microRNA pathway. Ube3a1 RNA knockdown increased activity of the plasticity-regulating miR-134, suggesting that Ube3a1 RNA acts as a dendritic competing endogenous RNA. Accordingly, the dendrite-growth-promoting effect of Ube3a1 RNA knockdown in vivo is abolished in mice lacking miR-134. Taken together, our results define a noncoding function of an alternative Ube3a transcript in dendritic protein synthesis, with potential implications for Angelman syndrome and ASD. PMID:25867122

  8. Investigation of combined unfolding of neutron spectra using the UMG unfolding codes.

    PubMed

    Roberts, N J

    2007-01-01

    An investigation of the simultaneous unfolding of data from neutron spectrometers using the UMG codes MAXED and GRAVEL has been performed. This approach involves combining the data from the spectrometers before unfolding, thereby performing a single combined unfolding of all the data to yield a final combined spectrum. The study used measured data from three proton recoil counters and also Bonner sphere and proton recoil counter responses calculated from their response functions. In each case, the spectrum derived from combined unfolding is compared with either the spectrum obtained from merging the independently unfolded spectra or the spectrum used to calculate the responses. The advantages and disadvantages of this technique are discussed. PMID:17502320
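
    The "combine then unfold" idea amounts to stacking the response matrices and readings of all instruments so that a single unfolding sees every constraint at once. The sketch below pairs that with a GRAVEL-like multiplicative update; the UMG codes' actual weighting and uncertainty treatment are more careful, so this is only a schematic.

      import numpy as np

      def gravel_like(R, counts, guess, n_iter=200):
          """R: (readings x energy bins) response matrix; counts: measured readings.
          Assumes strictly positive counts, responses and predictions."""
          phi = guess.astype(float).copy()
          for _ in range(n_iter):
              pred = R @ phi
              w = R * phi                    # sensitivity of each reading to each bin
              corr = (w * np.log(counts / pred)[:, None]).sum(0) / w.sum(0)
              phi *= np.exp(corr)
          return phi

      # Combining instruments = stacking responses and readings so one unfolding
      # sees all constraints at once (names below are hypothetical):
      # R_all = np.vstack([R_bonner, R_proton])
      # c_all = np.concatenate([c_bonner, c_proton])
      # spectrum = gravel_like(R_all, c_all, prior_guess)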

  9. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1992-01-01

    The performance of forward error correcting coding schemes on errors anticipated for the Earth Observation System (EOS) Ku-band downlink are studied. The EOS transmits picture frame data to the ground via the Tracking and Data Relay Satellite System (TDRSS) to a ground-based receiver at White Sands. Due to unintentional RF interference from other systems operating in the Ku band, the noise at the receiver is non-Gaussian, which may result in non-random errors output by the demodulator. That is, the downlink channel cannot be modeled by a simple memoryless Gaussian-noise channel. From previous experience, it is believed that those errors are bursty. The research proceeded by developing a computer-based simulation, called Communication Link Error ANalysis (CLEAN), to model the downlink errors, forward error correcting schemes, and interleavers used with TDRSS. To date, the bulk of CLEAN has been written, documented, debugged, and verified. The procedures for utilizing CLEAN to investigate code performance were established and are discussed.
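
    Two ingredients of such a simulation are easy to sketch: a two-state (Gilbert-Elliott) channel that produces bursty errors, and a block interleaver whose deinterleaving spreads a burst into isolated errors that a Reed-Solomon decoder can handle. The parameters below are illustrative, not the grant's measured statistics.

      import numpy as np

      rng = np.random.default_rng(0)

      def gilbert_elliott(n, p_gb=0.001, p_bg=0.1, pe_good=1e-5, pe_bad=0.3):
          """Two-state Markov error channel: long good runs, short error bursts."""
          errors = np.zeros(n, dtype=int)
          bad = False
          for i in range(n):
              bad = rng.random() < (1 - p_bg if bad else p_gb)
              errors[i] = rng.random() < (pe_bad if bad else pe_good)
          return errors

      def deinterleave(errors, rows=8, cols=32):
          """Block interleaving: a burst of adjacent errors ends up spaced
          `rows` symbols apart after the column-wise read-out."""
          return errors[:rows * cols].reshape(rows, cols).T.reshape(-1)

      e = gilbert_elliott(8 * 32)
      print("channel error positions:", np.flatnonzero(e))
      print("after deinterleaving:   ", np.flatnonzero(deinterleave(e)))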

  10. Characterization and validation of a Monte Carlo code for independent dose calculation in proton therapy treatments with pencil beam scanning

    NASA Astrophysics Data System (ADS)

    Fracchiolla, F.; Lorentini, S.; Widesott, L.; Schwarz, M.

    2015-11-01

    We propose a method of creating and validating a Monte Carlo (MC) model of a proton Pencil Beam Scanning (PBS) machine using only commissioning measurements and avoiding the nozzle modeling. Measurements with a scintillating screen coupled with a CCD camera, ionization chamber and a Faraday Cup were used to model the beam in TOPAS without using any machine parameter information but the virtual source distance from the isocenter. Then the model was validated on simple Spread Out Bragg Peaks (SOBP) delivered in water phantom and with six realistic clinical plans (many involving 3 or more fields) on an anthropomorphic phantom. In particular the behavior of the moveable Range Shifter (RS) feature was investigated and its modeling has been proposed. The gamma analysis (3%, 3 mm) was used to compare MC, TPS (XiO-ELEKTA) and measured 2D dose distributions (using radiochromic film). The MC modeling proposed here shows good results in the validation phase, both for simple irradiation geometry (SOBP in water) and for modulated treatment fields (on anthropomorphic phantoms). In particular head lesions were investigated and both MC and TPS data were compared with measurements. Treatment plans with no RS always showed a very good agreement with both of them (γ-Passing Rate (PR) > 95%). Treatment plans in which the RS was needed were also tested and validated. For these treatment plans MC results showed better agreement with measurements (γ-PR > 93%) than the one coming from TPS (γ-PR < 88%). This work shows how to simplify the MC modeling of a PBS machine for proton therapy treatments without accounting for any hardware components and proposes a more reliable RS modeling than the one implemented in our TPS. The validation process has shown how this code is a valid candidate for a completely independent treatment plan dose calculation algorithm. This makes the code an important future tool for the patient specific QA verification process.
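
    The gamma test referenced above combines a dose-difference criterion and a distance-to-agreement criterion. A brute-force global-gamma sketch for coarse 2D grids follows; it illustrates the test's logic, not a clinical implementation.

      import numpy as np

      def gamma_pass_rate(ref, eva, spacing_mm, dd=0.03, dta_mm=3.0):
          """Global gamma: for each reference point, search the evaluated plane."""
          ny, nx = ref.shape
          ys, xs = np.mgrid[0:ny, 0:nx] * spacing_mm
          dmax = ref.max()
          passed = 0
          for i in range(ny):
              for j in range(nx):
                  dist2 = (ys - i * spacing_mm) ** 2 + (xs - j * spacing_mm) ** 2
                  dose2 = ((eva - ref[i, j]) / (dd * dmax)) ** 2
                  passed += np.sqrt(dist2 / dta_mm**2 + dose2).min() <= 1.0
          return passed / (ny * nx)

      # Toy check: a uniform +1% dose offset passes a 3%/3 mm criterion everywhere.
      ref = np.ones((20, 20))
      print(gamma_pass_rate(ref, ref * 1.01, spacing_mm=2.0))   # -> 1.0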

  11. Field Dependence/Independence Cognitive Style and Problem Posing: An Investigation with Sixth Grade Students

    ERIC Educational Resources Information Center

    Nicolaou, Aristoklis Andreas; Xistouri, Xenia

    2011-01-01

    Field dependence/independence cognitive style was found to relate to general academic achievement and specific areas of mathematics; in the majority of studies, field-independent students were found to be superior to field-dependent students. The present study investigated the relationship between field dependence/independence cognitive style and…

  12. Investigation of Navier-Stokes Code Verification and Design Optimization

    NASA Technical Reports Server (NTRS)

    Vaidyanathan, Rajkumar

    2004-01-01

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between these concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single-element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite rate chemistry, and the k-ε turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi-objective optimization study is carried out using a geometric mean approach. Following this, sensitivity analyses with the aid of a variance-based non-parametric approach and partial correlation coefficients are conducted using data available from surrogate models of the objectives and the multi-objective optima to identify the contribution of the design variables to the objective variability and to analyze the variability of the design variables and the objectives. In summary the present dissertation offers insight into an improved coarse to fine grid extrapolation technique for Navier-Stokes computations and also suggests tools for a designer to conduct design optimization study and related sensitivity analyses for a given design problem.

  13. Characterization and validation of a Monte Carlo code for independent dose calculation in proton therapy treatments with pencil beam scanning.

    PubMed

    Fracchiolla, F; Lorentini, S; Widesott, L; Schwarz, M

    2015-11-01

    We propose a method of creating and validating a Monte Carlo (MC) model of a proton Pencil Beam Scanning (PBS) machine using only commissioning measurements and avoiding the nozzle modeling. Measurements with a scintillating screen coupled with a CCD camera, ionization chamber and a Faraday Cup were used to model the beam in TOPAS without using any machine parameter information but the virtual source distance from the isocenter. Then the model was validated on simple Spread Out Bragg Peaks (SOBP) delivered in water phantom and with six realistic clinical plans (many involving 3 or more fields) on an anthropomorphic phantom. In particular the behavior of the moveable Range Shifter (RS) feature was investigated and its modeling has been proposed. The gamma analysis (3%, 3 mm) was used to compare MC, TPS (XiO-ELEKTA) and measured 2D dose distributions (using radiochromic film). The MC modeling proposed here shows good results in the validation phase, both for simple irradiation geometry (SOBP in water) and for modulated treatment fields (on anthropomorphic phantoms). In particular head lesions were investigated and both MC and TPS data were compared with measurements. Treatment plans with no RS always showed a very good agreement with both of them (γ-Passing Rate (PR) > 95%). Treatment plans in which the RS was needed were also tested and validated. For these treatment plans MC results showed better agreement with measurements (γ-PR > 93%) than the one coming from TPS (γ-PR < 88%). This work shows how to simplify the MC modeling of a PBS machine for proton therapy treatments without accounting for any hardware components and proposes a more reliable RS modeling than the one implemented in our TPS. The validation process has shown how this code is a valid candidate for a completely independent treatment plan dose calculation algorithm. This makes the code an important future tool for the patient specific QA verification process. PMID:26501569

  14. tRNA acceptor stem and anticodon bases form independent codes related to protein folding.

    PubMed

    Carter, Charles W; Wolfenden, Richard

    2015-06-16

    Aminoacyl-tRNA synthetases recognize tRNA anticodon and 3' acceptor stem bases. Synthetase Urzymes acylate cognate tRNAs even without anticodon-binding domains, in keeping with the possibility that acceptor stem recognition preceded anticodon recognition. Representing tRNA identity elements with two bits per base, we show that the anticodon encodes the hydrophobicity of each amino acid side-chain as represented by its water-to-cyclohexane distribution coefficient, and this relationship holds true over the entire temperature range of liquid water. The acceptor stem codes preferentially for the surface area or size of each side-chain, as represented by its vapor-to-cyclohexane distribution coefficient. These orthogonal experimental properties are both necessary to account satisfactorily for the exposed surface area of amino acids in folded proteins. Moreover, the acceptor stem codes correctly for β-branched and carboxylic acid side-chains, whereas the anticodon codes for a wider range of such properties, but not for size or β-branching. These and other results suggest that genetic coding of 3D protein structures evolved in distinct stages, based initially on the size of the amino acid and later on its compatibility with globular folding in water. PMID:26034281
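
    The two-bits-per-base representation and the regression against a transfer free energy scale can be sketched as below. The bit convention, anticodon list, and property values are stand-ins chosen to show the mechanics, not the paper's measured data.

      import numpy as np

      BITS = {"A": (0, 0), "C": (0, 1), "G": (1, 0), "U": (1, 1)}  # assumed encoding

      def features(seq):
          """Two bits per base: a 3-base anticodon becomes 6 binary features."""
          return np.array([b for base in seq for b in BITS[base]], dtype=float)

      # Illustrative anticodons and water-to-cyclohexane transfer values (kcal/mol,
      # approximate stand-ins) for ten amino acids.
      anticodons = {"F": "GAA", "L": "UAA", "I": "GAU", "V": "UAC", "A": "UGC",
                    "G": "UCC", "S": "UGA", "T": "UGU", "D": "GUC", "E": "UUC"}
      dG_w_chx = {"F": -2.0, "L": -4.9, "I": -5.0, "V": -4.0, "A": -1.8,
                  "G": 0.9, "S": 3.4, "T": 2.6, "D": 9.7, "E": 9.4}

      X = np.array([features(ac) for ac in anticodons.values()])
      y = np.array([dG_w_chx[aa] for aa in anticodons])
      A = np.c_[X, np.ones(len(y))]                    # add an intercept column
      coef, *_ = np.linalg.lstsq(A, y, rcond=None)     # linear fit: bits -> property
      pred = A @ coef
      print("R^2 =", 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum())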

  15. tRNA acceptor stem and anticodon bases form independent codes related to protein folding

    PubMed Central

    Carter, Charles W.; Wolfenden, Richard

    2015-01-01

    Aminoacyl-tRNA synthetases recognize tRNA anticodon and 3′ acceptor stem bases. Synthetase Urzymes acylate cognate tRNAs even without anticodon-binding domains, in keeping with the possibility that acceptor stem recognition preceded anticodon recognition. Representing tRNA identity elements with two bits per base, we show that the anticodon encodes the hydrophobicity of each amino acid side-chain as represented by its water-to-cyclohexane distribution coefficient, and this relationship holds true over the entire temperature range of liquid water. The acceptor stem codes preferentially for the surface area or size of each side-chain, as represented by its vapor-to-cyclohexane distribution coefficient. These orthogonal experimental properties are both necessary to account satisfactorily for the exposed surface area of amino acids in folded proteins. Moreover, the acceptor stem codes correctly for β-branched and carboxylic acid side-chains, whereas the anticodon codes for a wider range of such properties, but not for size or β-branching. These and other results suggest that genetic coding of 3D protein structures evolved in distinct stages, based initially on the size of the amino acid and later on its compatibility with globular folding in water. PMID:26034281

  16. Coding tools investigation for next generation video coding based on HEVC

    NASA Astrophysics Data System (ADS)

    Chen, Jianle; Chen, Ying; Karczewicz, Marta; Li, Xiang; Liu, Hongbin; Zhang, Li; Zhao, Xin

    2015-09-01

    The new state-of-the-art video coding standard, H.265/HEVC, was finalized in 2013 and achieves roughly 50% bit rate saving compared to its predecessor, H.264/MPEG-4 AVC. This paper provides evidence that there is still potential for further coding efficiency improvements. A brief overview of HEVC is first given in the paper. Then, our improvements on each main module of HEVC are presented. For instance, the recursive quadtree block structure is extended to support larger coding units and transform units. The motion information prediction scheme is improved by advanced temporal motion vector prediction, which inherits the motion information of each small block within a large block from a temporal reference picture. Cross-component prediction with a linear prediction model improves intra prediction, and overlapped block motion compensation improves the efficiency of inter prediction. Furthermore, coding of both intra and inter prediction residuals is improved by an adaptive multiple transform technique. Finally, in addition to the deblocking filter and SAO, an adaptive loop filter is applied to further enhance reconstructed picture quality. This paper describes the above-mentioned techniques in detail and evaluates their coding performance benefits based on the common test conditions used during HEVC development. The simulation results show that significant performance improvement over the HEVC standard can be achieved, especially for high resolution video materials.

  17. Investigation of a panel code for airframe/propeller integration analyses

    NASA Technical Reports Server (NTRS)

    Miley, S. J.

    1982-01-01

    The Hess panel code was investigated as a procedure to predict the aerodynamic loading associated with propeller slipstream interference on the airframe. The slipstream was modeled as a variable onset flow to the lifting and nonlifting bodies treated by the code. Four sets of experimental data were used for comparisons with the code. The results indicate that the Hess code, in its present form, will give valid solutions for nonuniform onset flows which vary in direction only. The code presently gives incorrect solutions for flows with variations in velocity. Modifications to the code to correct this are discussed.

  18. Investigating the Language and Literacy Skills Required for Independent Online Learning

    ERIC Educational Resources Information Center

    Silver-Pacuilla, Heidi

    2008-01-01

    This investigation was undertaken to determine the threshold levels of literacy and language proficiency necessary for adult learners to use the Internet for independent learning. The report is triangulated around learning from large-scale surveys, learning from the literature, and learning from the field. Reported findings include: (1)…

  19. Independent assessment of TRAC-PD2 and RELAP5/MOD1 codes at BNL in FY 1981. [PWR]

    SciTech Connect

    Saha, P; Jo, J H; Neymotin, L; Rohatgi, U S; Slovik, G

    1982-12-01

    This report documents the independent assessment calculations performed with the TRAC-PD2 and RELAP5/MOD1 codes at Brookhaven National Laboratory (BNL) during Fiscal Year 1981. A large variety of separate-effects experiments dealing with (1) steady-state and transient critical flow, (2) level swell, (3) flooding and entrainment, (4) steady-state flow boiling, (5) integral economizer once-through steam generator (IEOTSG) performance, (6) bottom reflood, and (7) two-dimensional phase separation of two-phase mixtures were simulated with TRAC-PD2. In addition, the early part of an overcooling transient which occurred at the Rancho Seco nuclear power plant on March 20, 1978 was also computed with an updated version of TRAC-PD2. Three separate-effects tests dealing with (1) transient critical flow, (2) steady-state flow boiling, and (3) IEOTSG performance were also simulated with the RELAP5/MOD1 code. Comparisons between the code predictions and the test data are presented.

  20. Independent neural coding of reward and movement by pedunculopontine tegmental nucleus neurons in freely navigating rats

    PubMed Central

    Norton, Alix B.W.; Jo, Yong Sang; Clark, Emily W.; Taylor, Cortney A.; Mizumori, Sheri J.Y.

    2011-01-01

    Phasic firing of dopamine (DA) neurons in the ventral tegmental area (VTA) and substantia nigra (SN) is likely crucial for reward processing that guides learning. One of the key structures implicated in the regulation of this DA burst firing is the pedunculopontine tegmental nucleus (PPTg) which projects to both VTA and SN. Different literatures suggest that PPTg serves as a sensory-gating area for DA cells or it regulates voluntary movement. This study recorded PPTg single unit activity as rats perform a spatial navigation task to examine the potential for both reward and movement contributions. PPTg cells showed significant changes in firing relative to reward acquisition, the velocity of movement across the maze, and turning behaviors of the rats. Reward, but not movement, correlates were impacted by changes in context, and neither correlate type was affected by reward manipulations (e.g. changing the expected location of a reward). This suggests that the PPTg conjunctively codes both reward and behavioral information, and that the reward information is processed in a context-dependent manner. The distinct anatomical distribution of reward and movement cells emphasizes different models of synaptic control by PPTg of DA burst firing in VTA and SN. Relevant to both VTA and SN learning systems, however, PPTg appears to serve as a sensory gating mechanism to facilitate reinforcement learning, while at the same time provides reinforcement-based guidance of ongoing goal-directed behaviors. PMID:21395868

  1. Investigations with methanobacteria and with evolution of the genetic code

    NASA Technical Reports Server (NTRS)

    Jukes, T. H.

    1986-01-01

    Mycoplasma capricolum was found by Osawa et al. to use UGA as a codon for tryptophan and to contain 75% A + T in its DNA. This change could have resulted from evolutionary pressure to replace C + G with A + T. Numerous studies have been reported of the evolution of proteins as measured by amino acid replacements that are observed when homologous proteins, such as hemoglobins from various vertebrates, are compared. These replacements result from nucleotide substitutions in amino acid codons in the corresponding genes. Simultaneously, silent nucleotide substitutions take place that can be studied when sequences of the genes are compared. These silent evolutionary changes take place mostly in third positions of codons. Two types of nucleotide substitutions are recognized: pyrimidine-pyrimidine and purine-purine interchanges (transitions), and pyrimidine-purine interchanges (transversions). Silent transitions are favored when a corresponding transversion would produce an amino acid replacement. Conversely, silent transversions are favored by probability when transitions and transversions would both be silent. Extensive examples of these situations have been found in protein genes, and it is evident that transversions in silent positions predominate in family boxes in most of the examples studied. In associated research, a streptomycete from cow manure was found to produce an extracellular enzyme capable of lysing the pseudomurein-containing methanogen Methanobacterium formicicum.
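
    The transition/transversion distinction underlying this analysis reduces to a purine/pyrimidine class test, sketched here:

      PURINES = set("AG")
      PYRIMIDINES = set("CTU")   # T (DNA) and U (RNA) are both pyrimidines

      def substitution_type(b1, b2):
          """Classify a base substitution as a transition or a transversion."""
          if b1 == b2:
              return "identical"
          same_class = {b1, b2} <= PURINES or {b1, b2} <= PYRIMIDINES
          return "transition" if same_class else "transversion"

      # Third-position comparison of two homologous codons:
      print(substitution_type("A", "G"))   # transition (purine <-> purine)
      print(substitution_type("A", "C"))   # transversion (purine <-> pyrimidine)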

  2. Approaches to Learning at Work: Investigating Work Motivation, Perceived Workload, and Choice Independence

    ERIC Educational Resources Information Center

    Kyndt, Eva; Raes, Elisabeth; Dochy, Filip; Janssens, Els

    2013-01-01

    Learning and development are taking up a central role in the human resource policies of organizations because of their crucial contribution to the competitiveness of those organizations. The present study investigates the relationship of work motivation, perceived workload, and choice independence with employees' approaches to learning at…

  3. Modality independence of order coding in working memory: Evidence from cross-modal order interference at recall.

    PubMed

    Vandierendonck, André

    2016-01-01

    Working memory researchers do not agree on whether order in serial recall is encoded by dedicated modality-specific systems or by a more general modality-independent system. Although previous research supports the existence of autonomous modality-specific systems, it has been shown that serial recognition memory is prone to cross-modal order interference from concurrent tasks. The present study used a serial recall task, which was performed in a single-task condition and in a dual-task condition with an embedded memory task in the retention interval. The modality of the serial task was either verbal or visuospatial, and the embedded tasks were in the other modality and required either serial or item recall. Care was taken to avoid modality overlaps during presentation and recall. In Experiment 1, visuospatial but not verbal serial recall was more impaired when the embedded task was an order task than when it was an item task. Using a more difficult verbal serial recall task in Experiment 2, verbal serial recall was likewise more impaired by another order recall task. These findings are consistent with the hypothesis of modality-independent order coding. The implications for views on short-term recall and the multicomponent view of working memory are discussed. PMID:25801664

  4. An Early Underwater Artificial Vision Model in Ocean Investigations via Independent Component Analysis

    PubMed Central

    Nian, Rui; Liu, Fang; He, Bo

    2013-01-01

    Underwater vision is one of the dominant senses and has shown great prospects in ocean investigations. In this paper, a hierarchical Independent Component Analysis (ICA) framework is established to explore and understand the functional roles of the higher-order statistical structures of the visual stimulus in an underwater artificial vision system. The model is inspired by characteristics of the early human vision system such as modality, redundancy reduction, sparseness and independence, and its multiple-layer implementation appears to capture, respectively, Gabor-like basis functions, shape contours and more complicated textures. The simulation results show good performance, in terms of both effectiveness and consistency, of the proposed approach on underwater images collected by autonomous underwater vehicles (AUVs). PMID:23863855
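
    A minimal sketch of the core idea above — ICA applied to image patches to learn localized, Gabor-like basis functions — using scikit-learn's FastICA on synthetic data. The authors' hierarchical, underwater-specific model is not reproduced; all data and sizes below are illustrative.

        # ICA on image patches: rows of components_ play the role of the
        # Gabor-like basis functions discussed above. Heavy-tailed synthetic
        # pixels stand in for real AUV imagery.
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        imgs = rng.laplace(size=(200, 32, 32))   # stand-in for camera frames

        # Cut each frame into 8x8 patches, flatten to a (n_samples, 64) matrix.
        patches = np.array([img[i:i + 8, j:j + 8].ravel()
                            for img in imgs
                            for i in (0, 8, 16, 24)
                            for j in (0, 8, 16, 24)])
        patches -= patches.mean(axis=0)          # center each pixel dimension

        # FastICA whitens internally and estimates independent components.
        ica = FastICA(n_components=16, random_state=0, max_iter=500)
        codes = ica.fit_transform(patches)       # sparse coefficients per patch
        basis = ica.components_.reshape(16, 8, 8)
        print(codes.shape, basis.shape)          # (3200, 16) (16, 8, 8)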

  5. Investigation of the Use of Erasures in a Concatenated Coding Scheme

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Marriott, Philip J.

    1997-01-01

    A new method for declaring erasures in a concatenated coding scheme is investigated. This method is used with the rate-1/2, K = 7 convolutional code and the (255, 223) Reed-Solomon code, with errors-and-erasures Reed-Solomon decoding. The proposed erasure method uses a soft-output Viterbi algorithm together with information provided by decoded Reed-Solomon codewords in a deinterleaving frame. The results show that a gain of 0.3 dB is possible using a minimal number of decoding trials.
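
    The gain from declaring erasures follows from the Reed-Solomon decoding bound: a code of minimum distance d = n - k + 1 corrects any pattern of e errors and f erasures with 2e + f <= d - 1, so an erased symbol costs half an error. A toy check for the (255, 223) code used above:

        # Errors-and-erasures decodability for an (n, k) Reed-Solomon code.
        # For (255, 223), d - 1 = 32: up to 16 errors, or 32 erasures, or any
        # mix with 2*errors + erasures <= 32 -- which is why flagging
        # unreliable symbols from the soft-output Viterbi stage buys gain.
        def rs_decodable(n: int, k: int, errors: int, erasures: int) -> bool:
            d = n - k + 1                  # minimum distance (MDS code)
            return 2 * errors + erasures <= d - 1

        print(rs_decodable(255, 223, errors=16, erasures=0))   # True
        print(rs_decodable(255, 223, errors=10, erasures=12))  # True
        print(rs_decodable(255, 223, errors=10, erasures=13))  # False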

  6. Binary neutron-star mergers with Whisky and SACRA: First quantitative comparison of results from independent general-relativistic hydrodynamics codes

    SciTech Connect

    Baiotti, Luca; Shibata, Masaru; Yamamoto, Tetsuro

    2010-09-15

    We present the first quantitative comparison of two independent general-relativistic hydrodynamics codes, the Whisky code and the SACRA code. We compare the output of simulations starting from the same initial data and carried out with the configuration (numerical methods, grid setup, resolution, gauges) that for each code has been found to give consistent and sufficiently accurate results, in particular in terms of cleanness of gravitational waveforms. We focus on the quantities that should be conserved during the evolution (rest mass, total mass energy, and total angular momentum) and on the gravitational-wave amplitude and frequency. We find that the results produced by the two codes agree at a reasonable level, with differences in the various quantities always better than about 10%.

  7. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Finite Difference Time Domain Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). This manual provides a description of the code and corresponding results for the default scattering problem. In addition to this description, the manual covers the operation, resource requirements, and Version A code capabilities, and includes a description of each subroutine, a brief discussion of the radar cross section computations, and a discussion of the scattering results.
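
    For readers unfamiliar with the method the manual describes, the heart of any FDTD code is the Yee leapfrog update, in which staggered E and H fields advance each other from the curl of the opposite field. A one-dimensional vacuum toy version (not the Penn State code itself):

        # 1D FDTD (Yee) leapfrog in vacuum: E lives at integer grid points,
        # H at half-grid points, and each updates from the curl of the other.
        import numpy as np

        nz, nt = 200, 400
        ez = np.zeros(nz)        # E field at integer grid points
        hy = np.zeros(nz - 1)    # H field at half-grid points
        c = 0.5                  # Courant number c0*dt/dz (<= 1 for stability)

        for n in range(nt):
            hy += c * np.diff(ez)                  # H update from curl E
            ez[1:-1] += c * np.diff(hy)            # E update from curl H
            ez[nz // 4] += np.exp(-((n - 30) / 10.0) ** 2)  # soft Gaussian source

        print(float(np.abs(ez).max()))             # pulse propagating on the grid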

  8. A computational fluid dynamics code for the investigation of ramjet-in-tube concepts

    NASA Astrophysics Data System (ADS)

    Bogdanoff, D. W.; Brackett, D. C.

    1987-06-01

    An inviscid computational fluid dynamics (CFD) code is presented which can handle multiple component species, simple chemical reactions, a completely general equation of state and velocities up to hundreds of km/sec. The code can also handle multiple moving zones containing different media. Radiation effects are not included. The code uses third order spatial extrapolation/interpolation of the primitive variables to determine cell boundary values, applies limiting procedures to these values to maintain code stability and accuracy, and then uses Godunov procedures to calculate the cell boundary fluxes. The code numerical methods are presented in some detail and the results of benchmark test cases used to validate the code are given. The agreement between the CFD and exact analytical calculations is found to be excellent. The code is used to investigate a ramjet-in-tube concept. In this concept, a projectile flies down a tube filled with combustible gas mixtures. The mixtures studied are H2 plus O2 plus excess H2 or N2 or CO2 as diluent. The projectile velocity range is 4 to 10 km/sec. Efficiencies up to 0.26 and ratios of effective projectile thrust pressure to maximum cycle pressure up to 0.12 are obtained. Plots of the pressure field around the projectile are presented.
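
    The reconstruct-limit-flux cycle described above can be shown on scalar linear advection. This sketch uses a second-order minmod-limited reconstruction standing in for the code's third-order extrapolation; the real code applies the same skeleton to the full multi-species reacting-flow equations.

        # Skeleton of the reconstruct/limit/Godunov-flux cycle, reduced to
        # scalar linear advection at speed a > 0.
        import numpy as np

        def minmod(a, b):
            """Limiter: pick the smaller slope, zero at extrema (stability)."""
            return np.where(a * b > 0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

        def step(u, a, dt, dx):
            slopes = minmod(np.diff(u, prepend=u[:1]), np.diff(u, append=u[-1:]))
            u_face = u + 0.5 * slopes        # limited extrapolation to right face
            # Godunov (upwind) flux for a > 0: left state at each interface.
            flux = a * u_face
            return u - dt / dx * np.diff(flux, prepend=flux[:1])

        x = np.linspace(0, 1, 200)
        u = np.exp(-200 * (x - 0.3) ** 2)    # initial pulse
        for _ in range(100):
            u = step(u, a=1.0, dt=0.002, dx=x[1] - x[0])
        print(float(u.max()))                # pulse advected with limited diffusion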

  9. Final report of the independent counsel for Iran/Contra matters. Volume 1: Investigations and prosecutions

    SciTech Connect

    Walsh, L.E.

    1993-08-04

    In October and November 1986, two secret U.S. Government operations were publicly exposed, potentially implicating Reagan Administration officials in illegal activities. These operations were the provision of assistance to the military activities of the Nicaraguan contra rebels during an October 1984 to October 1986 prohibition on such aid, and the sale of U.S. arms to Iran in contravention of stated U.S. policy and in possible violation of arms-export controls. In late November 1986, Reagan Administration officials announced that some of the proceeds from the sale of U.S. arms to Iran had been diverted to the contras. As a result of the exposure of these operations, Attorney General Edwin Meese III sought the appointment of an independent counsel to investigate and, if necessary, prosecute possible crimes arising from them. This is the final report of that investigation.

  10. Registered report: Coding-independent regulation of the tumor suppressor PTEN by competing endogenous mRNAs

    PubMed Central

    Phelps, Mitch; Coss, Chris; Wang, Hongyan; Cook, Matthew

    2016-01-01

    The Reproducibility Project: Cancer Biology seeks to address growing concerns about reproducibility in scientific research by conducting replications of selected experiments from a number of high-profile papers in the field of cancer biology. The papers, which were published between 2010 and 2012, were selected on the basis of citations and Altmetric scores (Errington et al., 2014). This Registered Report describes the proposed replication plan of key experiments from “Coding-Independent Regulation of the Tumor Suppressor PTEN by Competing Endogenous mRNAs” by Tay and colleagues, published in Cell in 2011 (Tay et al., 2011). The experiments to be replicated are those reported in Figures 3C, 3D, 3G, 3H, 5A and 5B, and in Supplemental Figures 3A and B. Tay and colleagues proposed a new regulatory mechanism based on competing endogenous RNAs (ceRNAs), which regulate target genes by competitive binding of shared microRNAs. They test their model by identifying and confirming ceRNAs that target PTEN. In Figures 3C and 3D, they report that perturbing expression of putative PTEN ceRNAs affects expression of PTEN. This effect is dependent on functional microRNA machinery (Figures 3G and 3H), and affects the pathway downstream of PTEN itself (Figures 5A and 5B). The Reproducibility Project: Cancer Biology is a collaboration between the Center for Open Science and Science Exchange, and the results of the replications will be published by eLife. DOI: http://dx.doi.org/10.7554/eLife.12470.001 PMID:26943900

  11. A Coding System with Independent Annotations of Gesture Forms and Functions during Verbal Communication: Development of a Database of Speech and GEsture (DoSaGE)

    PubMed Central

    Kong, Anthony Pak-Hin; Law, Sam-Po; Kwan, Connie Ching-Yin; Lai, Christy; Lam, Vivian

    2014-01-01

    Gestures are commonly used together with spoken language in human communication. One major limitation of gesture investigations in the existing literature lies in the fact that the coding of forms and functions of gestures has not been clearly differentiated. This paper first described a recently developed Database of Speech and GEsture (DoSaGE) based on independent annotation of gesture forms and functions among 119 neurologically unimpaired right-handed native speakers of Cantonese (divided into three age and two education levels), and presented findings of an investigation examining how gesture use was related to age and linguistic performance. Consideration of these two factors, for which normative data are currently very limited or lacking in the literature, is relevant and necessary when one evaluates gesture employment among individuals with and without language impairment. Three speech tasks, including monologue of a personally important event, sequential description, and story-telling, were used for elicitation. The EUDICO Linguistic ANnotator (ELAN) software was used to independently annotate each participant’s linguistic information of the transcript, forms of gestures used, and the function for each gesture. About one-third of the subjects did not use any co-verbal gestures. While the majority of gestures were non-content-carrying, which functioned mainly for reinforcing speech intonation or controlling speech flow, the content-carrying ones were used to enhance speech content. Furthermore, individuals who are younger or linguistically more proficient tended to use fewer gestures, suggesting that normal speakers gesture differently as a function of age and linguistic performance. PMID:25667563

  12. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  13. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three-dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain (FDTD) technique. The supplied version of the code is one version of our current three-dimensional FDTD code set. The manual given here provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing radar cross section computations, a section discussing some scattering results, a new problem checklist, references, and figure titles.

  14. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain (FDTD) Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain technique. The supplied version of the code is one version of our current three dimensional FDTD code set. The manual provides a description of the code and the corresponding results for the default scattering problem. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version A code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONA.FOR), a section briefly discussing radar cross section (RCS) computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  15. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional numerical electromagnetic scattering codes based upon the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied codes are two versions of our current two dimensional FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR, TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  16. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional electromagnetic scattering codes based on the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied codes are two versions of our current FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR, TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  17. Your ticket to independence: a guide to getting your first principal investigator position.

    PubMed

    Káradóttir, Ragnhildur Thóra; Letzkus, Johannes J; Mameli, Manuel; Ribeiro, Carlos

    2015-10-01

    The transition to scientific independence as a principal investigator (PI) can seem like a daunting and mysterious process to postdocs and students - something that many aspire to while at the same time wondering how to achieve this goal and what being a PI really entails. The FENS Kavli Network of Excellence (FKNE) is a group of young faculty who have recently completed this step in various fields of neuroscience across Europe. In a series of opinion pieces from FKNE scholars, we aim to demystify this process and to offer the next generation of up-and-coming PIs some advice and personal perspectives on the transition to independence, starting here with guidance on how to get hired to your first PI position. Rather than providing an exhaustive overview of all facets of the hiring process, we focus on a few key aspects that we have learned to appreciate in the quest for our own labs: What makes a research programme exciting and successful? How can you identify great places to apply to and make sure your application stands out? What are the key objectives for the job talk and the interview? How do you negotiate your position? And finally, how do you decide on a host institute that lets you develop both scientifically and personally in your new role as head of a lab? PMID:26286226

  18. Investigation of coding techniques for memory and delay efficient interleaving in slow Rayleigh fading

    NASA Astrophysics Data System (ADS)

    Strater, Jay W.

    1991-11-01

    High data rate communication links operating under slow fading channel conditions may have interleaving memory requirements which are too large for practical applications. These requirements can be reduced by employing spatial diversity; however, a less costly alternative is to select coding and interleaving techniques that support memory efficient interleaving. The objective of this investigation has been to find coding and interleaving techniques with relatively small interleaving memory requirements and to accurately quantify these requirements. Toward this objective, convolutional and Reed-Solomon coding with single-stage and concatenated code configurations were evaluated with convolutional interleaving and differential phase shift keying (DPSK) modulation to determine their interleaving memory requirements. Code performance for these link selections was computed by high-fidelity link simulations and approximations over a wide range of E_b/N_0 and of ratios of interleaver span to scintillation decorrelation time (T_il/τ_0), and the results of these evaluations were converted to interleaving memory requirements. Interleaving delay requirements were also determined, and code selections with low interleaving memory and delay requirements were identified.
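
    The memory and delay figures such a study converts to follow standard formulas for a Forney convolutional interleaver with B branches and per-branch delay increment M symbols: end-to-end delay B(B-1)M symbols, and M·B(B-1)/2 storage cells on each side. A quick calculator, with illustrative parameters rather than the report's configurations:

        # Memory and delay of a Forney convolutional interleaver with B
        # branches and per-branch delay increment M (standard formulas; the
        # report's exact interleaver parameters are not reproduced here).
        def conv_interleaver_cost(B: int, M: int, symbol_rate_hz: float):
            cells_per_side = M * B * (B - 1) // 2   # registers per (de)interleaver
            total_delay_sym = B * (B - 1) * M       # end-to-end delay in symbols
            return cells_per_side, total_delay_sym, total_delay_sym / symbol_rate_hz

        # Example: the DVB-style choice B = 12, M = 17 at 1 Msym/s.
        cells, delay_sym, delay_s = conv_interleaver_cost(12, 17, 1e6)
        print(cells, delay_sym, f"{delay_s * 1e3:.2f} ms")   # 1122 2244 2.24 ms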

  19. A model to investigate the mechanisms underlying the emergence and development of independent sitting.

    PubMed

    O'Brien, Kathleen M; Zhang, Jing; Walley, Philip R; Rhoads, Jeffrey F; Haddad, Jeffrey M; Claxton, Laura J

    2015-07-01

    When infants first begin to sit independently, they are highly unstable and unable to maintain upright sitting posture for more than a few seconds. Over the course of 3 months, the sitting ability of infants drastically improves. To investigate the mechanisms controlling the development of sitting posture, a single-degree-of-freedom inverted pendulum model was developed. Passive muscle properties were modeled with a stiffness and damping term, while active neurological control was modeled with a time-delayed proportional-integral-derivative (PID) controller. The findings of the simulations suggest that infants primarily utilize passive muscle stiffness to remain upright when they first begin to sit. This passive control mechanism allows the infant to remain upright so that active feedback control mechanisms can develop. The emergence of active control mechanisms allows infants to integrate sensory information into their movements so that they can exhibit more adaptive sitting. PMID:25442426
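
    A minimal sketch of that model class — a single-degree-of-freedom inverted pendulum with passive stiffness and damping plus a time-delayed PID torque — with illustrative parameters rather than the paper's fitted values:

        # Inverted pendulum with passive stiffness/damping (no delay) and an
        # active, time-delayed PID torque. All parameters are illustrative.
        import numpy as np

        g, L, m = 9.81, 0.25, 5.0            # gravity, CoM height, trunk mass
        K, C = 15.0, 1.0                     # passive stiffness, damping
        Kp, Ki, Kd = 10.0, 2.0, 1.0          # active PID gains
        delay_s, dt, T = 0.1, 0.001, 5.0     # neural delay, step, duration

        n, lag = int(T / dt), int(delay_s / dt)
        theta, omega = np.zeros(n), np.zeros(n)
        theta[0] = 0.05                      # initial lean (rad)
        integ = 0.0
        I = m * L * L                        # point-mass moment of inertia

        for k in range(n - 1):
            th_d = theta[max(k - lag, 0)]    # delayed angle seen by controller
            om_d = omega[max(k - lag, 0)]
            integ += th_d * dt
            torque = -(Kp * th_d + Ki * integ + Kd * om_d)   # delayed PID
            torque += -K * theta[k] - C * omega[k]           # passive, no delay
            domega = (m * g * L * np.sin(theta[k]) + torque) / I
            omega[k + 1] = omega[k] + domega * dt            # symplectic Euler
            theta[k + 1] = theta[k] + omega[k + 1] * dt
        print(f"final lean: {theta[-1]:.4f} rad")

    With these numbers the passive stiffness alone exceeds the gravitational toppling torque (K > mgL), mirroring the paper's suggestion that passive properties can keep the infant upright before active feedback matures.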

  1. Code-Switching in Iranian Elementary EFL Classrooms: An Exploratory Investigation

    ERIC Educational Resources Information Center

    Rezvani, Ehsan; Street, Hezar Jerib; Rasekh, Abbass Eslami

    2011-01-01

    This paper presents the results of a small-scale exploratory investigation of code-switching (CS) between English and Farsi by 4 Iranian English foreign language (EFL) teachers in elementary level EFL classrooms in a language school in Isfahan, Iran. Specifically, the present study aimed at exploring the syntactical identification of switches and…

  2. 78 FR 37571 - Certain Opaque Polymers; Institution of Investigation Pursuant to United States Code

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... COMMISSION Certain Opaque Polymers; Institution of Investigation Pursuant to United States Code AGENCY: U.S... importation, and the sale within the United States after importation of certain opaque polymers by reason of... importation, or the sale within the United States after importation of certain opaque polymers that...

  3. Semantic association investigated with fMRI and independent component analysis

    PubMed Central

    Kim, Kwang Ki; Karunanayaka, Prasanna; Privitera, Michael D.; Holland, Scott K.; Szaflarski, Jerzy P.

    2010-01-01

    Semantic association, an essential element of human language, enables discourse and inference. Neuroimaging studies have revealed localization and lateralization of semantic circuitry, making substantial contributions to cognitive neuroscience. However, due to methodological limitations, these investigations have only identified individual functional components rather than capturing the behavior of the entire network. To overcome these limitations, we have implemented group independent component analysis (ICA) to investigate the cognitive modules used by healthy adults performing an fMRI semantic decision task. When compared to the results of a standard GLM analysis, ICA detected several additional brain regions subserving semantic decision. Eight task-related group ICA maps were identified, including left inferior frontal gyrus (BA44/45), middle posterior temporal gyrus (BA39/22), angular gyrus/inferior parietal lobule (BA39/40), posterior cingulate (BA30), bilateral lingual gyrus (BA18/23), inferior frontal gyrus (L>R, BA47), hippocampus with parahippocampal gyrus (L>R, BA35/36) and anterior cingulate (BA32/24). While most of the components were represented bilaterally, we found a single, highly left-lateralized component that included the inferior frontal gyrus and the medial and superior temporal gyri, the angular and supramarginal gyri and the inferior parietal cortex. The presence of these spatially independent ICA components implies functional connectivity and can be equated with their modularity. These results are analyzed and presented in the framework of a biologically plausible theoretical model in preparation for similar analyses in patients with right- or left-hemispheric epilepsies. PMID:21296027

  4. Investigating the Magnetorotational Instability with Dedalus, an Open-Source Hydrodynamics Code

    SciTech Connect

    Burns, Keaton J. (UC Berkeley / SLAC)

    2012-08-31

    The magnetorotational instability is a fluid instability that causes the onset of turbulence in discs with poloidal magnetic fields. It is believed to be an important mechanism in the physics of accretion discs, namely in its ability to transport angular momentum outward. A similar instability arising in systems with a helical magnetic field may be easier to produce in laboratory experiments using liquid sodium, but the applicability of this phenomenon to astrophysical discs is unclear. To explore and compare the properties of these standard and helical magnetorotational instabilities (MRI and HMRI, respectively), magnetohydrodynamic (MHD) capabilities were added to Dedalus, an open-source hydrodynamics simulator. Dedalus is a Python-based pseudospectral code that uses external libraries and parallelization with the goal of achieving speeds competitive with codes implemented in lower-level languages. This paper outlines the MHD equations as implemented in Dedalus, the steps taken to improve the performance of the code, and the status of MRI investigations using Dedalus.

  5. Investigation of Coded Source Neutron Imaging at the North Carolina State University PULSTAR Reactor

    SciTech Connect

    Xiao, Ziyu; Mishra, Kaushal; Hawari, Ayman; Bingham, Philip R; Bilheux, Hassina Z; Tobin Jr, Kenneth William

    2010-01-01

    A neutron imaging facility is located on beam-tube #5 of the 1-MWth PULSTAR reactor at North Carolina State University. An investigation has been initiated to explore the application of coded imaging techniques at the facility. Coded imaging uses a mosaic of pinholes to encode an aperture, thus generating an encoded image of the object at the detector. To reconstruct the image recorded by the detector, corresponding decoding patterns are used. The optimized design of coded masks is critical for the performance of this technique and depends on the characteristics of the imaging beam. In this work, Monte Carlo (MCNP) simulations were utilized to explore the modifications to the PULSTAR thermal neutron beam needed to support coded imaging techniques. In addition, an assessment of coded mask design has been performed. The simulations indicated that a 12 inch single-crystal sapphire filter is suited for such an application at the PULSTAR beam in terms of maximizing flux with a good neutron-to-gamma ratio. Computational simulations demonstrate the feasibility of correlation reconstruction methods for neutron transmission imaging. A gadolinium aperture with a thickness of 500 μm was used to construct the mask using a 38 × 34 URA pattern. A test experiment using such a URA design has been conducted and the point spread function of the system has been measured.
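
    The correlation-reconstruction step is easy to illustrate: encode an object through a binary mask by circular convolution, then correlate the detector image with a decoding pattern. In this sketch a random binary mask stands in for the 38 × 34 URA, and no neutron physics is modeled.

        # Toy coded-aperture imaging: detector = object (*) mask, then
        # reconstruct by circular cross-correlation with a decoding pattern.
        import numpy as np

        rng = np.random.default_rng(1)
        N = 64
        mask = (rng.random((N, N)) < 0.5).astype(float)   # ~50% open fraction
        decode = 2 * mask - 1                             # balanced decoder

        obj = np.zeros((N, N)); obj[30:34, 20:24] = 1.0   # point-like object

        F = np.fft.fft2
        detector = np.real(np.fft.ifft2(F(obj) * F(mask)))          # encoding
        recon = np.real(np.fft.ifft2(F(detector) * np.conj(F(decode))))

        peak = np.unravel_index(np.argmax(recon), recon.shape)
        print("reconstructed peak near:", peak)           # near the object (mod N)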

  6. ALS beamlines for independent investigators: A summary of the capabilities and characteristics of beamlines at the ALS

    SciTech Connect

    Not Available

    1992-08-01

    There are two modes of conducting research at the ALS: to work as a member of a participating research team (PRT), or to work as an independent investigator. PRTs are responsible for building beamlines, end stations, and, in some cases, insertion devices; thus, PRT members have privileged access to the ALS. Independent investigators will use beamline facilities made available by PRTs. The purpose of this handbook is to describe these facilities.

  7. Coding for stable transmission of W-band radio-over-fiber system using direct-beating of two independent lasers.

    PubMed

    Yang, L G; Sung, J Y; Chow, C W; Yeh, C H; Cheng, K T; Shi, J W; Pan, C L

    2014-10-20

    We experimentally demonstrate a Manchester (MC) coding based W-band (75-110 GHz) radio-over-fiber (ROF) system that uses spectral shaping to reduce the low-frequency-component (LFC) signal distortion generated by two independent low-cost lasers. A low-cost, higher-performance W-band ROF system is thereby achieved. In this system, direct beating of two independent low-cost CW lasers without a frequency tracking circuit (FTC) is used to generate the millimeter wave. Approaches such as delayed self-heterodyne interferometry and heterodyne beating are applied to characterize the optical-beating-interference sub-terahertz signal (OBIS). Furthermore, W-band ROF systems using MC coding and NRZ-OOK are compared and discussed. PMID:25401641
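
    The benefit of Manchester coding here is spectral: each bit maps to a transition, pushing signal energy away from DC, where the beating of two free-running lasers distorts the signal. A sketch comparing the low-frequency power of NRZ-OOK and Manchester line coding (illustrative, not the paper's experiment):

        # Compare the fraction of power near DC for NRZ vs Manchester coding
        # of the same random bit stream (2 samples per bit in both cases).
        import numpy as np

        rng = np.random.default_rng(2)
        bits = rng.integers(0, 2, 4096)

        nrz = np.repeat(2 * bits - 1, 2)
        manchester = np.concatenate([(1, -1) if b else (-1, 1) for b in bits])

        def lowfreq_fraction(x, frac=0.05):
            """Fraction of total power in the lowest `frac` of the spectrum."""
            p = np.abs(np.fft.rfft(x)) ** 2
            cut = max(1, int(frac * len(p)))
            return p[:cut].sum() / p.sum()

        print(f"NRZ low-frequency power:        {lowfreq_fraction(nrz):.3f}")
        print(f"Manchester low-frequency power: {lowfreq_fraction(manchester):.3f}")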

  8. RACE, CODE OF THE STREET, AND VIOLENT DELINQUENCY: A MULTILEVEL INVESTIGATION OF NEIGHBORHOOD STREET CULTURE AND INDIVIDUAL NORMS OF VIOLENCE*

    PubMed Central

    Stewart, Eric A.; Simons, Ronald L.

    2011-01-01

    The study outlined in this article drew on Elijah Anderson’s (1999) code of the street perspective to examine the impact of neighborhood street culture on violent delinquency. Using data from more than 700 African American adolescents, we examined 1) whether neighborhood street culture predicts adolescent violence above and beyond an adolescent’s own street code values and 2) whether neighborhood street culture moderates individual-level street code values on adolescent violence. Consistent with Anderson’s hypotheses, neighborhood street culture significantly predicts violent delinquency independent of individual-level street code effects. Additionally, neighborhood street culture moderates individual-level street code values on violence in neighborhoods where the street culture is widespread. In particular, the effect of street code values on violence is enhanced in neighborhoods where the street culture is endorsed widely. PMID:21666759

  9. Flight Investigation of Prescribed Simultaneous Independent Surface Excitations for Real-Time Parameter Identification

    NASA Technical Reports Server (NTRS)

    Moes, Timothy R.; Smith, Mark S.; Morelli, Eugene A.

    2003-01-01

    Near real-time stability and control derivative extraction is required to support flight demonstration of Intelligent Flight Control System (IFCS) concepts being developed by NASA, academia, and industry. Traditionally, flight maneuvers would be designed and flown to obtain stability and control derivative estimates using a postflight analysis technique. The goal of the IFCS concept is to be able to modify the control laws in real time for an aircraft that has been damaged in flight. In some IFCS implementations, real-time parameter identification (PID) of the stability and control derivatives of the damaged aircraft is necessary for successfully reconfiguring the control system. This report investigates the usefulness of Prescribed Simultaneous Independent Surface Excitations (PreSISE) to provide data for rapidly obtaining estimates of the stability and control derivatives. Flight test data were analyzed using both equation-error and output-error PID techniques. The equation-error PID technique is known as Fourier Transform Regression (FTR) and is a frequency-domain real-time implementation. Selected results were compared with a time-domain output-error technique. The real-time equation-error technique combined with the PreSISE maneuvers provided excellent derivative estimation in the longitudinal axis. However, the PreSISE maneuvers as presently defined were not adequate for accurate estimation of the lateral-directional derivatives.
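
    The equation-error idea behind FTR can be sketched in a few lines: transform measured states and inputs to the frequency domain at selected frequencies and solve jωX(ω) = AX(ω) + BU(ω) by least squares. The dynamics and excitation below are invented stand-ins, not the IFCS aircraft's derivatives:

        # Frequency-domain equation-error estimation of [A | B] from
        # simulated state and input histories (illustrative system).
        import numpy as np

        dt, T = 0.01, 20.0
        t = np.arange(0, T, dt)
        u = np.sin(0.8 * t) + 0.5 * np.sin(2.1 * t)     # multisine-like input

        A = np.array([[-0.7, 0.9], [-4.0, -1.2]])       # "true" dynamics
        B = np.array([[0.0], [-6.0]])
        x = np.zeros((len(t), 2))
        for k in range(len(t) - 1):                     # simple Euler simulation
            x[k + 1] = x[k] + dt * (A @ x[k] + B[:, 0] * u[k])

        w = 2 * np.pi * np.arange(0.1, 1.6, 0.1)        # analysis band (rad/s)
        E = np.exp(-1j * np.outer(w, t)) * dt           # finite Fourier transform
        X, U = E @ x, E @ u

        # Regress jw*X(w) on [X(w), U(w)] to recover the rows of [A | B].
        reg = np.hstack([X, U[:, None]])
        lhs = 1j * w[:, None] * X
        theta, *_ = np.linalg.lstsq(reg, lhs, rcond=None)
        print(np.real(theta.T))                         # approx rows of [A | B]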

  10. Investigation of blood mRNA biomarkers for suicidality in an independent sample

    PubMed Central

    Mullins, N; Hodgson, K; Tansey, K E; Perroud, N; Maier, W; Mors, O; Rietschel, M; Hauser, J; Henigsberg, N; Souery, D; Aitchison, K; Farmer, A; McGuffin, P; Breen, G; Uher, R; Lewis, C M

    2014-01-01

    Changes in the blood expression levels of SAT1, PTEN, MAP3K3 and MARCKS genes have been reported as biomarkers of high versus low suicidality state (Le-Niculescu et al.). Here, we investigate these expression biomarkers in the Genome-Based Therapeutic Drugs for Depression (GENDEP) study, of patients with major depressive disorder on a 12-week antidepressant treatment. Blood gene expression levels were available at baseline and week 8 for patients who experienced suicidal ideation during the study (n=20) versus those who did not (n=37). The analysis is well powered to detect the effect sizes reported in the original paper. Within either group, there was no significant change in the expression of these four genes over the course of the study, despite increasing suicidal ideation or initiation of antidepressant treatment. Comparison of the groups showed that the gene expression did not differ between patients with or without treatment-related suicidality. This independent study does not support the validity of the proposed biomarkers. PMID:25350297

  11. Detailed investigation of Long-Period activity at Campi Flegrei by Convolutive Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Capuano, P.; De Lauro, E.; De Martino, S.; Falanga, M.

    2016-04-01

    This work is devoted to the analysis of seismic signals continuously recorded at the Campi Flegrei caldera (Italy) during the entire year 2006. The radiation pattern associated with the Long-Period energy release is investigated. We adopt an innovative Independent Component Analysis algorithm for convolutive seismic series, adapted and improved to give automatic procedures for detecting seismic events often buried in high-level ambient noise. The extracted waveforms, characterized by an improved signal-to-noise ratio, allow the recognition of Long-Period precursors, evidencing that the seismic activity accompanying the mini-uplift crisis (in 2006), which climaxed in the three days of 26-28 October, had already started at the beginning of October and lasted until mid-November. Hence, a more complete seismic catalog is provided which can be used to properly quantify the seismic energy release. To better ground our results, we first check the robustness of the method by comparing it with other blind source separation methods based on higher-order statistics; second, we reconstruct the radiation patterns of the extracted Long-Period events in order to link the individuated signals directly to the sources. We take advantage of Convolutive Independent Component Analysis, which provides basic signals along the three directions of motion so that a direct polarization analysis can be performed with no other filtering procedures. We show that the extracted signals are mainly composed of P waves with radial polarization pointing to the seismic source of the main LP swarm, i.e., a small area in the Solfatara, also in the case of the small events that both precede and follow the main activity. From a dynamical point of view, they can be described by two degrees of freedom, indicating a low level of complexity associated with the vibrations from a superficial hydrothermal system. Our results allow us to move towards a full description of the complexity of the source, which can be used, by means of the small-intensity precursors, for hazard-model development and forecast-model testing, showing an illustrative example of the applicability of the CICA method to regions with low seismicity in high ambient noise.

  12. Development of Electromagnetic Particle Simulation Code in an Open System for Investigation of Magnetic Reconnection

    NASA Astrophysics Data System (ADS)

    Ohtani, H.; Horiuchi, R.; Usami, S.

    2013-10-01

    In order to investigate magnetic reconnection from the microscopic viewpoint, we have developed a three-dimensional electromagnetic particle simulation code in an open system (PASMO). In the previous PASMO code, which runs on a distributed-memory, multi-processor computer system with a distributed parallel algorithm, only the particle information was distributed and the domain was not decomposed. When the memory size on one node is limited, that code therefore cannot perform large-scale simulations, because all field data are duplicated on each parallel process. To overcome this problem, we decompose the domain, so that the field variables defined on the three-dimensional grid are distributed. Each processor performs the field solver in its mapped domain and carries out the particle pusher for the particles that exist in that domain. In this paper, we develop the open boundary condition with the domain decomposition algorithm and perform larger-scale particle simulations. We will discuss the performance of the new PASMO and the simulation results on magnetic reconnection. This work was supported by a Grant-in-Aid for Scientific Research from the Japan Society for the Promotion of Science (Grant No 23340182) and General Coordinated Research at NIFS (NIFS12KNSS027, NIFS13KNXN252).

  13. An investigation on the capabilities of the PENELOPE MC code in nanodosimetry.

    PubMed

    Bernal, M A; Liendo, J A

    2009-02-01

    The Monte Carlo (MC) method has been widely implemented in studies of radiation effects on human genetic material. Most of these works have used specific-purpose MC codes to simulate radiation transport in condensed media. PENELOPE is one of the general-purpose MC codes that has been used in many applications related to radiation dosimetry. Based on the fact that PENELOPE can carry out event-by-event coupled electron-photon transport simulations following these particles down to energies of the order of few tens of eV, we have decided to investigate the capacities of this code in the field of nanodosimetry. Single and double strand break probabilities due to the direct impact of gamma rays originated from Co60 and Cs137 isotopes and characteristic x-rays, from Al and C K-shells, have been determined by use of PENELOPE. Indirect damage has not been accounted for in this study. A human genetic material geometrical model has been developed, taking into account five organizational levels. In an article by Friedland et al. [Radiat. Environ. Biophys. 38, 39-47 (1999)], a specific-purpose MC code and a very sophisticated DNA geometrical model were used. We have chosen that work as a reference to compare our results. Single and double strand-break probabilities obtained here underestimate those reported by Friedland and co-workers by 20%-76% and 50%-60%, respectively. However, we obtain RBE values for Cs137, AlK and CK radiations in agreement with those reported in previous works [Radiat. Environ. Biophys. 38, 39-47 (1999)] and [Phys. Med. Biol. 53, 233-244 (2008)]. Some enhancements can be incorporated into the PENELOPE code to improve its results in the nanodosimetry field. PMID:19292002

  14. An investigation on the capabilities of the PENELOPE MC code in nanodosimetry

    SciTech Connect

    Bernal, M. A.; Liendo, J. A.

    2009-02-15

    The Monte Carlo (MC) method has been widely implemented in studies of radiation effects on human genetic material. Most of these works have used specific-purpose MC codes to simulate radiation transport in condensed media. PENELOPE is one of the general-purpose MC codes that has been used in many applications related to radiation dosimetry. Based on the fact that PENELOPE can carry out event-by-event coupled electron-photon transport simulations following these particles down to energies of the order of a few tens of eV, we have decided to investigate the capacities of this code in the field of nanodosimetry. Single and double strand break probabilities due to the direct impact of gamma rays originated from Co-60 and Cs-137 isotopes and characteristic x-rays, from Al and C K-shells, have been determined by use of PENELOPE. Indirect damage has not been accounted for in this study. A human genetic material geometrical model has been developed, taking into account five organizational levels. In an article by Friedland et al. [Radiat. Environ. Biophys. 38, 39-47 (1999)], a specific-purpose MC code and a very sophisticated DNA geometrical model were used. We have chosen that work as a reference to compare our results. Single and double strand-break probabilities obtained here underestimate those reported by Friedland and co-workers by 20%-76% and 50%-60%, respectively. However, we obtain RBE values for Cs-137, Al-K and C-K radiations in agreement with those reported in previous works [Radiat. Environ. Biophys. 38, 39-47 (1999)] and [Phys. Med. Biol. 53, 233-244 (2008)]. Some enhancements can be incorporated into the PENELOPE code to improve its results in the nanodosimetry field.

  15. Investigation of in-band transmission of both spectral amplitude coding/optical code division multiple-access and wavelength division multiplexing signals

    NASA Astrophysics Data System (ADS)

    Ashour, Isaac A. M.; Shaari, Sahbudin; Shalaby, Hossam M. H.; Menon, P. Susthitha

    2011-06-01

    The transmission of both optical code division multiple-access (OCDMA) and wavelength division multiplexing (WDM) users on the same band is investigated. Code pulses of spectral amplitude coding (SAC)/optical code division multiple-access (CDMA) are overlaid onto a multichannel WDM system. Notch filters are utilized in order to suppress the WDM interference signals for detection of optical broadband CDMA signals. Modified quadratic congruence (MQC) codes are used as the signature codes for the SAC/OCDMA system. The proposed system is simulated and its performance in terms of both the bit-error rate and Q-factor are determined. In addition, eavesdropper probability of error-free code detection is evaluated. Our results are compared to traditional nonhybrid systems. It is concluded that the proposed hybrid scheme still achieves acceptable performance. In addition, it provides enhanced data confidentiality as compared to the scheme with SAC/OCDMA only. It is also shown that the performance of the proposed system is limited by the interference of the WDM signals. Furthermore, the simulation illustrates the tradeoff between the performance and confidentiality for authorized users.

  16. Investigation of Cool and Hot Executive Function in ODD/CD Independently of ADHD

    ERIC Educational Resources Information Center

    Hobson, Christopher W.; Scott, Stephen; Rubia, Katya

    2011-01-01

    Background: Children with oppositional defiant disorder/conduct disorder (ODD/CD) have shown deficits in "cool" abstract-cognitive, and "hot" reward-related executive function (EF) tasks. However, it is currently unclear to what extent ODD/CD is associated with neuropsychological deficits, independently of attention deficit hyperactivity disorder…

  17. An Investigation of Independent Child Behavior in the Open Classroom: The Classroom Attitude Observation Schedule (CAOS).

    ERIC Educational Resources Information Center

    Goldupp, Ocea

    The Classroom Attitude Observation Schedule was developed and field tested for study of independent child behavior in the open classroom. Eight Head Start classrooms were used for field testing, six of which used the Tucson Early Education Model curriculum and two of which, for comparison, used local curricula. Procedures involved observing and…

  18. Investigation of inconsistent ENDF/B-VII.1 independent and cumulative fission product yields with proposed revisions

    SciTech Connect

    Pigni, Marco T; Francis, Matthew W; Gauld, Ian C

    2015-01-01

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that are incompatible with the cumulative fission yields in the library, and also with experimental measurements. A comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron induced fission of 235,238U and 239,241Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. An important outcome of this work is the development of the fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated by the need to improve the performance of the ENDF/B-VII.1 library in the case of stable and long-lived cumulative yields, given the inconsistency of the ENDF/B-VII.1 fission product yield and decay data sub-libraries. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.

  1. Investigation of Inconsistent ENDF/B-VII.1 Independent and Cumulative Fission Product Yields with Proposed Revisions

    SciTech Connect

    Pigni, M.T.; Francis, M.W.; Gauld, I.C.

    2015-01-15

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that do not agree with the cumulative fission yields in the library as well as with experimental measurements. To address these issues, a comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for 235,238U and 239,241Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. Another important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated to improve the performance of the ENDF/B-VII.1 library for stable and long-lived fission products. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.
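
    The consistency requirement at the heart of this work links cumulative yields C to independent yields I through a decay-chain branching matrix Q, i.e. C = Q I. A toy adjustment on an invented three-nuclide chain (the report's sequential Bayesian update with covariances is not reproduced; all numbers are illustrative):

        # Toy consistency adjustment: given slightly inconsistent evaluated
        # cumulative (C) and independent (I) yields, refit I so that C = Q @ I.
        import numpy as np

        # Chain n1 -> n2 -> n3, with a 0.02 branch diverting part of n1's
        # decay away from n2 (the kind of branching-fraction change that
        # created the ENDF/B-VII.1 inconsistencies described above).
        Q = np.array([
            [1.00, 0.00, 0.0],     # C1 = I1
            [0.98, 1.00, 0.0],     # C2 = 0.98*I1 + I2
            [0.98, 1.00, 1.0],     # C3 = chain total
        ])

        I_eval = np.array([0.020, 0.010, 0.005])    # evaluated independent yields
        C_eval = np.array([0.020, 0.031, 0.036])    # evaluated cumulative yields

        print("inconsistency:", C_eval - Q @ I_eval)   # nonzero -> data disagree

        # Adjust I so Q @ I matches C_eval (a weighted fit in practice).
        I_adj, *_ = np.linalg.lstsq(Q, C_eval, rcond=None)
        print("adjusted independent yields:", I_adj)
        print("residual:", C_eval - Q @ I_adj)         # ~0 after adjustment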

  2. Investigation on series of length of coding and non-coding DNA sequences of bacteria using multifractal detrended cross-correlation analysis.

    PubMed

    Stan, Cristina; Cristescu, Monica Teodora; Luiza, Buimaga Iarinca; Cristescu, C P

    2013-03-21

    In the framework of multifractal detrended cross-correlation analysis, we investigate characteristics of series of length of coding and non-coding DNA sequences of some bacteria and archaea. We propose the use of a multifractal cross-correlation series that can be defined for any pair of equal lengths data sequences (or time series) and that can be characterized by the full set of parameters that are attributed to any time series. Comparison between characteristics of series of length of coding and non-coding DNA sequences and of their associated multifractal cross-correlation series for selected groups is used for the identification of class affiliation of certain bacteria and archaea. The analysis is carried out using the dependence of the generalized Hurst exponent on the size of fluctuations, the shape of the singularity spectra, the shape and relative disposition of the curves of the singular measures scaling exponent and the values of the associated parameters. Empirically, we demonstrate that the series of lengths of coding and non-coding sequences as well as the associated multifractal cross-correlation series can be approximated as universal multifractals. PMID:23313335
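
    A minimal MF-DCCA sketch for two equal-length series, following the standard recipe (profiles, windowed polynomial detrending, q-th order fluctuation function, generalized Hurst exponent from the log-log slope). The series and scales below are illustrative, not the bacterial length data:

        import numpy as np

        def mfdcca_h(x, y, scales, qs, order=1):
            """Generalized Hurst exponents h(q) from the scaling of F_q(s)."""
            px, py = np.cumsum(x - x.mean()), np.cumsum(y - y.mean())
            hs = []
            for q in qs:
                logF = []
                for s in scales:
                    nseg, t = len(px) // s, np.arange(s)
                    f2 = []
                    for v in range(nseg):
                        seg = slice(v * s, (v + 1) * s)
                        rx = px[seg] - np.polyval(np.polyfit(t, px[seg], order), t)
                        ry = py[seg] - np.polyval(np.polyfit(t, py[seg], order), t)
                        f2.append(np.mean(rx * ry))      # detrended covariance
                    f2 = np.abs(np.array(f2))
                    if q == 0:
                        Fq = np.exp(0.5 * np.mean(np.log(f2)))
                    else:
                        Fq = np.mean(f2 ** (q / 2.0)) ** (1.0 / q)
                    logF.append(np.log(Fq))
                hs.append(np.polyfit(np.log(scales), logF, 1)[0])  # slope = h(q)
            return np.array(hs)

        rng = np.random.default_rng(3)
        x = rng.standard_normal(4096)
        y = 0.7 * x + 0.3 * rng.standard_normal(4096)    # cross-correlated pair
        scales = np.array([16, 32, 64, 128, 256])
        print(mfdcca_h(x, y, scales, qs=[-2, 2]))        # ~0.5 for white noise

    A spread of h(q) across q values signals multifractal cross-correlation; a flat h(q) near 0.5, as for the white-noise pair above, signals a monofractal, uncorrelated process.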

  3. Establishing evidence of contact transfer in criminal investigation by a novel 'peptide coding' reagent.

    PubMed

    Gooch, James; Koh, Clarissa; Daniel, Barbara; Abbate, Vincenzo; Frascione, Nunzianda

    2015-11-01

    Forensic investigators are often faced with the challenge of forming a logical association between a suspect, object or location and a particular crime. This article documents the development of a novel reagent that may be used to establish evidence of physical contact between items and individuals as a result of criminal activity. Consisting of a fluorescent compound suspended within an oil-based medium, this reagent utilises the addition of short customisable peptide molecules of a specific known sequence as unique owner-registered 'codes'. This product may be applied onto goods or premises of criminal interest and subsequently transferred onto objects that contact target surfaces. Visualisation of the reagent is then achieved via fluorophore excitation, subsequently allowing rapid peptide recovery and analysis. Simple liquid-liquid extraction methods were devised to rapidly isolate the peptide from other reagent components prior to analysis by ESI-MS. PMID:26452928

  4. Nye County nuclear waste repository project office independent scientific investigations program. Summary annual report, May 1996--April 1997

    SciTech Connect

    1997-05-01

    This annual summary report, prepared by Multimedia Environmental Technology, Inc. (MET) on behalf of the Nye County Nuclear Waste Project Office, summarizes the activities that were performed during the period from May 1, 1996 to April 30, 1997. These activities were conducted in support of the Independent Scientific Investigation Program (ISIP) of Nye County at the Yucca Mountain Site (YMS). The Nye County NWRPO is responsible for protecting the health and safety of the Nye County residents. NWRPO's on-site representative is responsible for designing and implementing the ISIP. Major objectives of the ISIP include: (1) investigating key issues related to conceptual design and performance of the repository that can have major impact on human health, safety, and the environment; and (2) identifying areas not being addressed adequately by DOE. Nye County has identified several key scientific issues of concern that may affect repository design and performance which were not being adequately addressed by DOE, and has been conducting its own independent study to evaluate the significance of these issues.

  5. Nye County Nuclear Waste Repository Project Office independent scientific investigations program annual report, May 1997--April 1998

    SciTech Connect

    1998-07-01

    This annual summary report, prepared by the Nye County Nuclear Waste Repository Project Office (NWRPO), summarizes the activities that were performed during the period from May 1, 1997 to April 30, 1998. These activities were conducted in support of the Independent Scientific Investigation Program (ISIP) of Nye County at the Yucca Mountain Site (YMS). The Nye County NWRPO is responsible for protecting the health and safety of the Nye County residents. NWRPO's on-site representative is responsible for designing and implementing the ISIP. Major objectives of the ISIP include: investigating key issues related to conceptual design and performance of the repository that can have major impact on human health, safety, and the environment; and identifying areas not being addressed adequately by the Department of Energy (DOE). Nye County has identified several key scientific issues of concern that may affect repository design and performance which were not being adequately addressed by DOE, and has been conducting its own independent study to evaluate the significance of these issues. This report summarizes the results of monitoring from two boreholes and the Exploratory Studies Facility (ESF) tunnel that have been instrumented by Nye County since March and April of 1995. The preliminary data and interpretations presented in this report do not constitute and should not be considered as the official position of Nye County. The ISIP presently includes borehole and tunnel instrumentation, monitoring, data analysis, and numerical modeling activities to address the concerns of Nye County.

  6. A Monte Carlo Investigation of the Analysis of Variance Applied to Non-Independent Bernoulli Variates.

    ERIC Educational Resources Information Center

    Draper, John F., Jr.

    The applicability of the Analysis of Variance, ANOVA, procedures to the analysis of dichotomous repeated measure data is described. The design models for which data were simulated in this investigation were chosen to represent simple cases of two experimental situations: situation one, in which subjects' responses to a single randomly selected set…

  7. A model-independent investigation on quasi-degenerate neutrino mass models and their significance

    NASA Astrophysics Data System (ADS)

    Roy, Subhankar; Singh, N. Nimai

    2013-12-01

    The prediction of possible hierarchy of neutrino masses mostly depends on the model chosen. Dissociating the μ-τ interchange symmetry from discrete flavor symmetry based models makes the neutrino mass matrix less predictive and motivates one to seek the answer from different phenomenological frameworks. This insists on proper parametrization of the neutrino mass matrices concerning individual hierarchies. In this work, an attempt has been made to study the six different cases of quasi-degenerate (QDN) neutrino models with mass matrices m_LL^ν parametrized with two free parameters (α, β), the standard Wolfenstein parameter (λ), and input mass scale m0 ~ 0.08 eV. We start with a μ-τ symmetric neutrino mass matrix followed by a correction from the charged lepton sector. The parametrization emphasizes the existence of four independent texture-zero building blocks common to all the QDN models under the μ-τ symmetric framework and is found to be invariant under any choice of solar angle. In our parametrization, the solar angle is controlled from the neutrino sector whereas the charged lepton sector drives the reactor and atmospheric mixing angles. The individual models are tested in the framework of oscillation experiments, cosmological observation and future experiments involving β-decay and 0νββ experiments, and any reason to discard the QDN mass models with relatively lower mass is unfounded. Although the QDNH-Type IA model shows strong preference for sin²θ12 = 0.32, this is not sufficient to rule out the other models. The present work leaves a scope to extend the search for the most favorable QDN mass model from the observed baryon asymmetry of the Universe.

  8. Investigating the use of quick response codes in the gross anatomy laboratory.

    PubMed

    Traser, Courtney J; Hoffman, Leslie A; Seifert, Mark F; Wilson, Adam B

    2015-01-01

    The use of quick response (QR) codes within undergraduate university courses is on the rise, yet literature concerning their use in medical education is scant. This study examined student perceptions on the usefulness of QR codes as learning aids in a medical gross anatomy course, statistically analyzed whether this learning aid impacted student performance, and evaluated whether performance could be explained by the frequency of QR code usage. Question prompts and QR codes tagged on cadaveric specimens and models were available for four weeks as learning aids to medical (n = 155) and doctor of physical therapy (n = 39) students. Each QR code provided answers to posed questions in the form of embedded text or hyperlinked web pages. Students' perceptions were gathered using a formative questionnaire and practical examination scores were used to assess potential gains in student achievement. Overall, students responded positively to the use of QR codes in the gross anatomy laboratory as 89% (57/64) agreed the codes augmented their learning of anatomy. The users' most noticeable objection to using QR codes was the reluctance to bring their smartphones into the gross anatomy laboratory. A comparison between the performance of QR code users and non-users was found to be nonsignificant (P = 0.113), and no significant gains in performance (P = 0.302) were observed after the intervention. Learners welcomed the implementation of QR code technology in the gross anatomy laboratory, yet this intervention had no apparent effect on practical examination performance. PMID:25288343

  9. Independent forensic autopsies in an armed conflict: investigation of the victims from Racak, Kosovo.

    PubMed

    Rainio, J; Lalu, K; Penttilä, A

    2001-02-15

    In January 1999, a team of Finnish forensic experts under the mandate of the European Union (EU forensic expert team, EU-FET) performed forensic investigations in a sovereign state, in Kosovo, the Federal Republic of Yugoslavia (FRY). The team served as a neutral participant in the forensic investigation of victims of an incident at Racak, which was receiving considerable international attention. The Finnish team performed forensic autopsies, monitored forensic autopsies performed by local experts and verified findings of earlier executed autopsies. The victims had sustained varying numbers of gunshot wounds, which were established to be the cause of death. The manner of death remained undetermined by the EU-FET, because the scene investigation and the chain of custody for the bodies from the site of the incident to the autopsy were impossible to verify by the team. The events at Racak were the first of those leading to charges by the International Criminal Tribunal for the former Yugoslavia (ICTY) against the highest authorities in power in the FRY for crimes against humanity and violations of the laws or customs of war. PMID:11182269

  10. Further Investigation of Acoustic Propagation Codes for Three-Dimensional Geometries

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Watson, Willie R.; Jones, Michael G.

    2006-01-01

    The ability to predict fan noise within complex three-dimensional aircraft engine nacelle geometries is a valuable tool in designing and assessing low-noise concepts. This work begins a systematic study to identify the areas of the design space in which propagation codes of varying fidelity may be used effectively to provide efficient design and assessment. An efficient lower-fidelity code is used in conjunction with two higher-fidelity, more computationally intensive methods to solve benchmark problems of increasing complexity. The codes represent a small sampling of the current propagation codes available or under development. Results of this initial study indicate that the lower-fidelity code provides satisfactory results for cases involving low to moderate attenuation rates, whereas the two higher-fidelity codes perform well across the range of problems.

  11. Towards investigation of evolution of dynamical systems with independence of time accuracy: more classes of systems

    NASA Astrophysics Data System (ADS)

    Gurzadyan, V. G.; Kocharyan, A. A.

    2015-07-01

    The recently developed method (Paper 1) enabling one to investigate the evolution of dynamical systems with an accuracy not dependent on time is developed further. The classes of dynamical systems which can be studied by that method are much extended, now including systems that are: (1) non-Hamiltonian, conservative; (2) Hamiltonian with time-dependent perturbation; (3) non-conservative (with dissipation). These systems cover various types of N-body gravitating systems of astrophysical and cosmological interest, such as the orbital evolution of planets, minor planets, artificial satellites due to tidal, non-tidal perturbations and thermal thrust, evolving close binary stellar systems, and the dynamics of accretion disks.

  12. Investigation of low temperature solid oxide fuel cells for air-independent UUV applications

    NASA Astrophysics Data System (ADS)

    Moton, Jennie Mariko

    Unmanned underwater vehicles (UUVs) will benefit greatly from high energy density (> 500 Wh/L) power systems utilizing high-energy-density fuels and air-independent oxidizers. Current battery-based systems have limited energy densities (< 400 Wh/L), which motivate development of alternative power systems such as solid oxide fuel cells (SOFCs). SOFC-based power systems have the potential to achieve the required UUV energy densities, and the current study explores how SOFCs based on gadolinia-doped ceria (GDC) electrolytes with operating temperatures of 650°C and lower may operate in the unique environments of a promising UUV power plant. The plant would contain a H2O2 decomposition reactor to supply humidified O2 to the SOFC cathode and an exothermic aluminum/H2O combustor to provide heated humidified H2 fuel to the anode. To characterize low-temperature SOFC performance with these unique O2 and H2 sources, SOFC button cells based on nickel/GDC (Gd0.1Ce0.9O1.95) anodes, GDC electrolytes, and lanthanum strontium cobalt ferrite (La0.6Sr0.4Co0.2Fe0.8O3-δ, or LSCF)/GDC cathodes were fabricated and tested for performance and stability with humidity on both the anode and the cathode. Cells were also tested with various reactant concentrations of H2 and O2 to simulate gas depletion down the channel of an SOFC stack. Results showed that anode performance depended primarily on fuel concentration and less on the concentration of the associated increase in product H2O. O2 depletion with humidified cathode flows also caused significant loss in cell current density at a given voltage. With humidified flows in either the anode or cathode, stability tests of the button cells at 650°C showed that stable voltage is maintained at low operating current (0.17 A/cm2) at up to 50% H2O by mole, but at higher current densities (0.34 A/cm2), irreversible voltage degradation occurred at rates of 0.8-3.7 mV/hour depending on exposure time. From these button cell results, average current densities over the length of a low-temperature SOFC stack were estimated and used to size a UUV power system based on Al/H2O oxidation for fuel and H2O2 decomposition for O2. The resulting system design suggested that energy densities above 300 Wh/L may be achieved at neutral buoyancy with seawater if the cell is operated at high reactant utilizations in the SOFC stack for missions longer than 20 hours.

  13. Investigating the Use of Quick Response Codes in the Gross Anatomy Laboratory

    ERIC Educational Resources Information Center

    Traser, Courtney J.; Hoffman, Leslie A.; Seifert, Mark F.; Wilson, Adam B.

    2015-01-01

    The use of quick response (QR) codes within undergraduate university courses is on the rise, yet literature concerning their use in medical education is scant. This study examined student perceptions on the usefulness of QR codes as learning aids in a medical gross anatomy course, statistically analyzed whether this learning aid impacted student…

  15. Retrospective investigation of gingival invaginations : Part I: Clinical findings and presentation of a coding system.

    PubMed

    Reichert, Christoph; Gölz, Lina; Dirk, Cornelius; Jäger, Andreas

    2012-08-01

    Many orthodontic treatments involve tooth extraction. Gingival invagination is a common side effect after orthodontic extraction space closure, leading to compromised oral hygiene and hampered space closure. Even the long-term stability of the orthodontic treatment result may be jeopardized. The aim of this study was to identify risk factors for the development of gingival invagination and possible implications on oral health and orthodontic treatment results. A total of 30 patients presenting 101 tooth extractions and subsequent orthodontic space closure were investigated to detect the presence of gingival invagination. The time required until active space closure, the thoroughness of space closure, and probing depths mesial and distal to the extraction site, in addition to age, gender and the Periodontal Screening Index, were investigated. A new coding system to describe the extent of gingival invagination is introduced for the first time here. Gingival invagination developed more frequently in the lower jaw (50%) than the upper (30%). Complete penetration occurred in the upper jaw in 6% of the patients and in the lower jaw in 25%. All patients without gingival invagination revealed complete space closure, whereas only 70% in the group with gingival invagination did so. The time until initiation of space closure took significantly longer in patients with gingival invagination (7.5 ± 1.4 months) than in patients without (3.3 ± 0.8 months). Probing depths of the adjacent teeth were significantly greater in regions with invaginations. Thus, the time required until space closure was initiated and the extraction site are important risk factors for the development of gingival invagination. The consequences of gingival invagination are unstable space closure and deeper probing depths mesial and distal to the extractions. However, no statements concerning the mid- to long-term effects on oral health can be made. PMID:22777163

  16. Training camp: The quest to become a new National Institutes of Health (NIH)-funded independent investigator

    NASA Astrophysics Data System (ADS)

    Sklare, Daniel A.

    2003-04-01

    This presentation will provide information on the research training and career development programs of the National Institute on Deafness and Other Communication Disorders (NIDCD). The predoctoral and postdoctoral fellowship (F30, F31, F32) programs and the research career development awards for clinically trained individuals (K08/K23) and for individuals trained in the quantitative sciences and in engineering (K25) will be highlighted. In addition, the role of the NIDCD Small Grant (R03) in transitioning postdoctoral-level investigators to research independence will be underscored.

  17. Agreement in participant-coded and investigator-coded food-record analysis in overweight research participants: an examination of interpretation bias.

    PubMed

    Bjorge-Schohl, Brooke; Johnston, Carol S; Trier, Catherine M; Fleming, Katie R

    2014-05-01

    Validation studies support the use of self-administered computerized methods for reporting energy intake; however, the degree of interpretation bias with these methods is unknown. This research compared nutrient intake for food records that were both participant coded (using the National Cancer Institute's Automated Self-Administered 24-hour recall [ASA24] online program) and investigator-coded (a single investigator coded all food records using the ESHA Food Processor diet analysis program). Participants (n=28; mean age=41±11 years; mean body mass index=31±6) were participants in an 8-week trial (conducted between March 2011 and June 2011 in Phoenix, AZ) investigating the impact of meal preloads on satiety. Food records were collected on four occasions during the trial and, of the food records available for this investigation (n=161), 88% were completed on a weekday. Intra-class correlation coefficients were computed for selected nutrients and ranged from 0.65 to 0.81 for the macronutrients and from 0.50 to 0.66 for the micronutrients (overall mean=0.67). Overall mean coefficient improved to 0.77 when the data from three or more food records per participant were averaged, as is commonly done in nutrition research. All intra-class correlation coefficients were significant (P<0.020) and were not impacted by the day of week that food was recorded. For energy, macronutrients, and minerals, the percent median differences between coders were <±17%; however, percent median differences were large for vitamin C (+27%) and beta carotene (+294%). Findings from this study suggest that self-administered dietary assessment has merit as a research tool. Pretrial training for research participants is suggested to reduce interpretation bias. PMID:24210517
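
    As a worked illustration of the agreement statistic reported above, the sketch below computes a two-way, absolute-agreement, single-measures intra-class correlation, ICC(2,1), for totals assigned to the same records by two coders. This is a minimal Python sketch, not the study's actual analysis; the energy values and the choice of the ICC(2,1) form are assumptions made for illustration.

```python
import numpy as np

def icc_2_1(ratings):
    """Two-way random, absolute-agreement, single-measures ICC(2,1).

    ratings: (n_subjects, n_raters) array, e.g. one row per food
    record and one column per coder (participant vs. investigator).
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)
    ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)
    ss_err = np.sum((x - grand) ** 2) - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical energy intakes (kcal) for six records, coded twice.
participant = [1850, 2120, 1600, 2400, 1980, 1710]
investigator = [1900, 2050, 1580, 2500, 2010, 1650]
print(round(icc_2_1(np.column_stack([participant, investigator])), 2))
```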

  18. Investigating the structure preserving encryption of high efficiency video coding (HEVC)

    NASA Astrophysics Data System (ADS)

    Shahid, Zafar; Puech, William

    2013-02-01

    This paper presents a novel method for the real-time protection of the new emerging High Efficiency Video Coding (HEVC) standard. Structure preserving selective encryption is performed in the CABAC entropy coding module of HEVC, which is significantly different from CABAC entropy coding of H.264/AVC. In CABAC of HEVC, exponential Golomb coding is replaced by truncated Rice (TR) up to a specific value for binarization of transform coefficients. Selective encryption is performed using the AES cipher in cipher feedback mode on a plaintext of binstrings in a context aware manner. The encrypted bitstream has exactly the same bit-rate and is format compliant. Experimental evaluation and security analysis of the proposed algorithm are performed on several benchmark video sequences containing different combinations of motion, texture and objects.
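
    A minimal sketch of the encryption primitive named above: AES in cipher feedback (CFB) mode applied to a plaintext of gathered binstrings. It assumes the pycryptodome package and stands in for, rather than reproduces, the paper's CABAC-integrated scheme; the key, IV, and byte values are placeholders. Because CFB is a stream mode, the ciphertext has exactly the same length as the plaintext, which is consistent with the unchanged bit-rate reported above.

```python
from Crypto.Cipher import AES          # pycryptodome
from Crypto.Random import get_random_bytes

key = get_random_bytes(16)
iv = get_random_bytes(16)

# Stand-in for the encryptable binstrings gathered from CABAC
# binarization (the real scheme selects bins in a context-aware way).
bins = bytes([0b10110010, 0b01101110, 0b11110000])

enc = AES.new(key, AES.MODE_CFB, iv=iv, segment_size=8)
cipher_bins = enc.encrypt(bins)

# CFB keeps ciphertext length equal to plaintext length, so the
# bitstream size (and hence the bit-rate) is unchanged.
assert len(cipher_bins) == len(bins)

dec = AES.new(key, AES.MODE_CFB, iv=iv, segment_size=8)
assert dec.decrypt(cipher_bins) == bins
```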

  19. Dimensionality of ICA in resting-state fMRI investigated by feature optimized classification of independent components with SVM.

    PubMed

    Wang, Yanlu; Li, Tie-Qiang

    2015-01-01

    Different machine learning algorithms have recently been used for assisting automated classification of independent component analysis (ICA) results from resting-state fMRI data. The success of this approach relies on identification of artifact components and meaningful functional networks. A limiting factor of ICA is the uncertainty of the number of independent components (NIC). We aim to develop a framework based on support vector machines (SVM) and optimized feature-selection for automated classification of independent components (ICs) and use the framework to investigate the effects of input NIC on the ICA results. Seven different resting-state fMRI datasets were studied. 18 features were devised by mimicking the empirical criteria for manual evaluation. The five most significant (p < 0.01) features were identified by general linear modeling and used to generate a classification model for the framework. This feature-optimized classification of ICs with SVM (FOCIS) framework was used to classify both group and single subject ICA results. The classification results obtained using FOCIS and previously published FSL-FIX were compared against manually evaluated results. On average, the false negative rates in identifying artifact contaminated ICs for FOCIS and FSL-FIX were 98.27% and 92.34%, respectively. The number of artifact and functional network components increased almost linearly with the input NIC. Through tracking, we demonstrate that incrementing NIC affects most ICs when NIC < 33, whereas only a few limited ICs are affected by direct splitting when NIC is incremented beyond 40. For a given IC, its changes with increasing NIC are individually specific, irrespective of whether the component is a potential resting-state functional network or an artifact component. Using FOCIS, we investigated experimentally the ICA dimensionality of resting-state fMRI datasets and found that the input NIC can critically affect the ICA results of resting-state fMRI data. PMID:26005413
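
    The classification step described above can be illustrated with a minimal scikit-learn sketch: an RBF-kernel SVM trained on a feature matrix with one row per independent component. The feature distributions, labels, and hyperparameters here are fabricated stand-ins, not the FOCIS features or settings.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-ins for five selected per-component features.
n = 400
X_network = rng.normal(0.0, 1.0, size=(n // 2, 5))    # functional networks
X_artifact = rng.normal(1.5, 1.2, size=(n // 2, 5))   # artifact components
X = np.vstack([X_network, X_artifact])
y = np.array([0] * (n // 2) + [1] * (n // 2))         # 1 = artifact IC

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```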

  1. Computer models to support investigations of surface subsidence and associated ground motion induced by underground coal gasification. [STEALTH Codes

    SciTech Connect

    Langland, R.T.; Trent, B.C.

    1981-01-01

    Two computer codes were used to compare surface subsidence induced by underground coal gasification at Hoe Creek, Wyoming, and Centralia, Washington. Calculations with the STEALTH explicit finite-difference code are shown to match equivalent, implicit finite-element method solutions for the removal of underground material. Effects of removing roof material, varying elastic constants, investigating thermal shrinkage, and burning multiple coal seams are studied. A coupled, finite-difference continuum rigid-block caving code is used to model underground opening behavior. Numerical techniques agree qualitatively with empirical studies but, so far, underpredict ground surface displacement. The two methods, numerical and empirical, are most effective when used together. It is recommended that the thermal characteristics of coal measure rock be investigated and that additional calculations be carried out to longer times so that cooling influences can be modeled.

  2. Investigation of Different Constituent Encoders in a Turbo-code Scheme for Reduced Decoder Complexity

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.

    1998-01-01

    A large number of papers have been published attempting to give some analytical basis for the performance of Turbo-codes. It has been shown that performance improves with increased interleaver length. Also, procedures have been given to pick the best constituent recursive systematic convolutional codes (RSCC's). However, testing by computer simulation is still required to verify these results. This thesis begins by describing the encoding and decoding schemes used. Next, simulation results on several memory 4 RSCC's are shown. It is found that the best BER performance at low Eb/N0 is not given by the RSCC's that were found using the analytic techniques given so far. Next, the results are given from simulations using a smaller memory RSCC for one of the constituent encoders. Significant reduction in decoding complexity is obtained with minimal loss in performance. Simulation results are then given for a rate 1/3 Turbo-code, with the result that this code performed as well as a rate 1/2 Turbo-code as measured by the distance from their respective Shannon limits. Finally, the results of simulations where an inaccurate noise variance measurement was used are given. From this it was observed that Turbo-decoding is fairly stable with regard to noise variance measurement.
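
    For readers unfamiliar with the constituent encoders discussed above, the sketch below implements a rate-1/2 recursive systematic convolutional (RSC) encoder with the memory-4 (37, 21) octal generator pair widely cited in the turbo-code literature. The bit-ordering convention and the test message are assumptions of this sketch.

```python
def parity(x: int) -> int:
    return bin(x).count("1") & 1

def rsc_encode(bits, g_fb=0o37, g_ff=0o21, m=4):
    """Rate-1/2 RSC encoder; bit i of g_* is the coefficient of D^i,
    so (37, 21) octal means feedback 1+D+D^2+D^3+D^4, output 1+D^4."""
    state = 0                                # bit i-1 holds d_{t-i}
    systematic, parity_out = [], []
    for u in bits:
        d = u ^ parity(state & (g_fb >> 1))  # recursive feedback bit
        reg = (state << 1) | d               # (d_t, d_{t-1}, ..., d_{t-m})
        systematic.append(u)                 # systematic output = input
        parity_out.append(parity(reg & g_ff))
        state = reg & ((1 << m) - 1)         # shift register update
    return systematic, parity_out

msg = [1, 0, 1, 1, 0, 0, 1, 0]
sys_bits, par_bits = rsc_encode(msg)
print(sys_bits, par_bits)
```

    A turbo encoder would run two such encoders in parallel, the second on an interleaved copy of the message, which is where the interleaver-length effect noted above enters.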

  3. Investigation of independence in inter-animal tumor-type occurrences within the NTP rodent-bioassay database

    SciTech Connect

    Bogen, K.T.; Seilkop, S.

    1993-05-01

    Statistically significant elevation in tumor incidence at multiple histologically distinct sites is occasionally observed among rodent bioassays of chemically induced carcinogenesis. If such data are to be relied on (as they have, e.g., by the US EPA) for quantitative cancer potency assessment, their proper analysis requires a knowledge of the extent to which multiple tumor-type occurrences are independent or uncorrelated within individual bioassay animals. Although difficult to assess in a statistically rigorous fashion, a few significant associations among tumor-type occurrences in rodent bioassays have been reported. However, no comprehensive studies of animal-specific tumor-type occurrences at death or sacrifice have been conducted using the extensive set of available NTP rodent-bioassay data, on which most cancer-potency assessment for environmental chemicals is currently based. This report presents the results of such an analysis conducted on behalf of the National Research Council's Committee on Risk Assessment for Hazardous Air Pollutants. Tumor-type associations among individual animals were examined for ~2,500 to 3,000 control and ~200 to 600 treated animals using pathology data from 62 B6C3F1 mouse studies and 61 F/344N rat studies obtained from a readily available subset of the NTP carcinogenesis bioassay database. No evidence was found for any large correlation in either the onset probability or the prevalence-at-death or sacrifice of any tumor-type pair investigated in control and treated rats and mice, although a few of the small correlations present were statistically significant. Tumor-type occurrences were in most cases nearly independent, and departures from independence, where they did occur, were small. This finding is qualified in that tumor-type onset correlations were measured only indirectly, given the limited nature of the data analyzed.
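
    A minimal sketch of the kind of pairwise independence check the report describes: cross-tabulating the presence of two tumor types across animals and testing the 2x2 table with Fisher's exact test. The counts below are invented for illustration and are not from the NTP database.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table for one tumor-type pair in control animals:
# rows = tumor A present/absent, columns = tumor B present/absent.
table = [[12, 88],
         [95, 2305]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```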

  4. Role asymmetry and code transmission in signaling games: an experimental and computational investigation.

    PubMed

    Moreno, Maggie; Baggio, Giosuè

    2015-07-01

    In signaling games, a sender has private access to a state of affairs and uses a signal to inform a receiver about that state. If no common association of signals and states is initially available, sender and receiver must coordinate to develop one. How do players divide coordination labor? We show experimentally that, if players switch roles at each communication round, coordination labor is shared. However, in games with fixed roles, coordination labor is divided: Receivers adjust their mappings more frequently, whereas senders maintain the initial code, which is transmitted to receivers and becomes the common code. In a series of computer simulations, player and role asymmetry as observed experimentally were accounted for by a model in which the receiver in the first signaling round has a higher chance of adjusting its code than its partner. From this basic division of labor among players, certain properties of role asymmetry, in particular correlations with game complexity, are seen to follow. PMID:25352016
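
    A minimal simulation sketch of the asymmetric-adjustment account given above: after a failed round, the receiver revises its mapping with a higher probability than the sender, so the sender's initial code tends to become the common code. The state/signal space size and the adjustment probabilities are assumptions of this sketch, not the authors' fitted values.

```python
import random

random.seed(1)
N = 4  # number of states and of signals

def play(p_receiver_adjust=0.8, p_sender_adjust=0.2, rounds=200):
    # Each player starts with a private, random code.
    sender = {s: random.randrange(N) for s in range(N)}    # state -> signal
    receiver = {g: random.randrange(N) for g in range(N)}  # signal -> state
    wins = []
    for _ in range(rounds):
        state = random.randrange(N)
        guess = receiver[sender[state]]
        wins.append(guess == state)
        if guess != state:
            # Asymmetric division of coordination labor.
            if random.random() < p_receiver_adjust:
                receiver[sender[state]] = state
            if random.random() < p_sender_adjust:
                sender[state] = random.randrange(N)
    return sum(wins[-50:]) / 50  # late-game success rate

print(play())
```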

  5. Analytical Investigation on Papr Reduction in OFDM Systems Using Golay Codes

    NASA Astrophysics Data System (ADS)

    Uppal, Sabhyata; Sharma, Sanjay; Singh, Hardeep

    2014-09-01

    Orthogonal frequency division multiplexing (OFDM) is a common technique in multi carrier communications. One of the major issues in developing OFDM is the high peak-to-average power ratio (PAPR). Golay sequences have been introduced to construct 16-QAM and 256-QAM (quadrature amplitude modulation) codes for OFDM, reducing the peak-to-average power ratio. In this paper we have considered the use of coding to reduce the PAPR of OFDM systems. By using QPSK Golay sequences, 16-QAM and 256-QAM sequences with low PAPR are generated.
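
    The quantity being minimized can be made concrete with a short numpy sketch that computes the PAPR of one OFDM symbol from the IFFT of its subcarrier symbols. Random QPSK symbols are used here where Golay-coded sequences would go; Golay complementary sequences are known to bound the PAPR of the resulting symbol to about 3 dB. The subcarrier count is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sub = 64  # number of subcarriers

# Random QPSK symbols on each subcarrier (a Golay-coded mapping
# would replace this to keep the peak power bounded).
qpsk = (rng.choice([-1, 1], n_sub)
        + 1j * rng.choice([-1, 1], n_sub)) / np.sqrt(2)

x = np.fft.ifft(qpsk) * np.sqrt(n_sub)  # time-domain OFDM symbol
papr_db = 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))
print(f"PAPR = {papr_db:.1f} dB")
```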

  6. THE CODE OF THE STREET AND INMATE VIOLENCE: INVESTIGATING THE SALIENCE OF IMPORTED BELIEF SYSTEMS*

    PubMed Central

    MEARS, DANIEL P.; STEWART, ERIC A.; SIENNICK, SONJA E.; SIMONS, RONALD L.

    2013-01-01

    Scholars have long argued that inmate behaviors stem in part from cultural belief systems that they “import” with them into incarcerative settings. Even so, few empirical assessments have tested this argument directly. Drawing on theoretical accounts of one such set of beliefs—the code of the street—and on importation theory, we hypothesize that individuals who adhere more strongly to the street code will be more likely, once incarcerated, to engage in violent behavior and that this effect will be amplified by such incarceration experiences as disciplinary sanctions and gang involvement, as well as the lack of educational programming, religious programming, and family support. We test these hypotheses using unique data that include measures of the street code belief system and incarceration experiences. The results support the argument that the code of the street belief system affects inmate violence and that the effect is more pronounced among inmates who lack family support, experience disciplinary sanctions, and are gang involved. Implications of these findings are discussed. PMID:24068837

  7. Write to Read: Investigating the Reading-Writing Relationship of Code-Level Early Literacy Skills

    ERIC Educational Resources Information Center

    Jones, Cindy D.; Reutzel, D. Ray

    2015-01-01

    The purpose of this study was to examine whether the code-related features used in current methods of writing instruction in kindergarten classrooms transfer to reading outcomes for kindergarten students. We randomly assigned kindergarten students to 3 instructional groups: a writing workshop group, an interactive writing group, and a control group.…

  8. Investigating the impact of the cielo cray XE6 architecture on scientific application codes.

    SciTech Connect

    Rajan, Mahesh; Barrett, Richard; Pedretti, Kevin Thomas Tauke; Doerfler, Douglas W.; Vaughan, Courtenay Thomas

    2010-12-01

    Cielo, a Cray XE6, is the Department of Energy NNSA Advanced Simulation and Computing (ASC) campaign's newest capability machine. Rated at 1.37 PFLOPS, it consists of 8,944 dual-socket oct-core AMD Magny-Cours compute nodes, linked using Cray's Gemini interconnect. Its primary mission objective is to enable a suite of the ASC applications implemented using MPI to scale to tens of thousands of cores. Cielo is an evolutionary improvement to a successful architecture previously available to many of our codes, thus enabling a basis for understanding the capabilities of this new architecture. Using three codes strategically important to the ASC campaign, and supplemented with some micro-benchmarks that expose the fundamental capabilities of the XE6, we report on the performance characteristics and capabilities of Cielo.

  9. An Investigation of Two Acoustic Propagation Codes for Three-Dimensional Geometries

    NASA Technical Reports Server (NTRS)

    Nark, D. M.; Watson, W. R.; Jones, M. G.

    2005-01-01

    The ability to predict fan noise within complex three-dimensional aircraft engine nacelle geometries is a valuable tool in studying low-noise designs. Recent years have seen the development of aeroacoustic propagation codes using various levels of approximation to obtain such a capability. In light of this, it is beneficial to pursue a design paradigm that incorporates the strengths of the various tools. The development of a quasi-3D methodology (Q3D-FEM) at NASA Langley has brought these ideas to mind in relation to the framework of the CDUCT-LaRC acoustic propagation and radiation tool. As more extensive three dimensional codes become available, it would seem appropriate to incorporate these tools into a framework similar to CDUCT-LaRC and use them in a complementary manner. This work focuses on such an approach in beginning the steps toward a systematic assessment of the errors, and hence the trade-offs, involved in the use of these codes. To illustrate this point, CDUCT-LaRC was used to study benchmark hardwall duct problems to quantify errors caused by wave propagation in directions far removed from that defined by the parabolic approximation. Configurations incorporating acoustic treatment were also studied with CDUCT-LaRC and Q3D-FEM. The cases presented show that acoustic treatment diminishes the effects of CDUCT-LaRC phase error as the solutions are attenuated. The results of the Q3D-FEM were very promising and matched the analytic solution very well. Overall, these tests were meant to serve as a step toward the systematic study of errors inherent in the propagation module of CDUCT-LaRC, as well as an initial test of the higher fidelity Q3D-FEM code.

  10. Investigation of the Fission Product Release From Molten Pools Under Oxidizing Conditions With the Code RELOS

    SciTech Connect

    Kleinhietpass, Ingo D.; Unger, Hermann; Wagner, Hermann-Josef; Koch, Marco K.

    2006-07-01

    With the purpose of modeling and calculating the core behavior during severe accidents in nuclear power plants, system codes are under development worldwide. Modeling of radionuclide release and transport in the case of beyond design basis accidents is an integrated feature of the deterministic safety analysis of nuclear power plants. Following a hypothetical, uncontrolled temperature escalation in the core of light water reactors, significant parts of the core structures may degrade and melt down under formation of molten pools, leading to an accumulation of large amounts of radioactive materials. The possible release of radionuclides from the molten pool provides a potential contribution to the aerosol source term in the late phase of core degradation accidents. The relevance of the amount of oxygen transferred from the gas atmosphere into the molten pool for the speciation of a radionuclide and its release depends strongly on the initial oxygen inventory. Particularly for a low oxygen potential in the melt, as is the case for stratification when a metallic phase forms the upper layer and, respectively, when the oxidation has proceeded so far that zirconium is completely oxidized, a significant influence of atmospheric oxygen on the speciation and the release of some radionuclides has to be anticipated. The code RELOS (Release of Low Volatile Fission Products from Molten Surfaces) is under development at the Department of Energy Systems and Energy Economics (formerly Department of Nuclear and New Energy Systems) of the Ruhr-University Bochum. It is based on a mechanistic model describing the diffusive and convective transport of fission products from the surface of a molten pool into a cooler gas atmosphere. This paper presents the code RELOS: the features and abilities of the latest code version V2.3, the model improvements of V2.4, and calculated results evaluating the implemented models, which deal with the oxygen transfer from the liquid side of the phase boundary to the bulk of the melt by diffusion or by natural convection. Both models help to estimate the amount of oxygen entering the liquid upper pool volume and being available for the oxidation reaction. For both models the metallic, the oxidic and a mixture phase can be taken into account when defining the composition of the upper pool volume. The influence of crust formation, i.e., the decrease of the liquid pool surface area, is taken into account because it determines the amount of fission products released into the atmosphere. The difference in partial density between the gas side of the phase boundary and the bulk of the gas phase is the driving force of mass transport. (authors)

  11. Versatile code DLAYZ for investigating population kinetics and radiative properties of plasmas in non-local thermodynamic equilibrium

    NASA Astrophysics Data System (ADS)

    Gao, Cheng; Zeng, Jiaolong; Li, Yongqiang; Jin, Fengtao; Yuan, Jianmin

    2013-09-01

    A versatile code, DLAYZ, based on the collisional-radiative model is developed for investigating the population kinetics and radiative properties of plasmas in non-local thermodynamic equilibrium. DLAYZ is implemented on the detailed level accounting (DLA) approach and can be extended to detailed configuration accounting (DCA) and hybrid DLA/DCA approaches. The code can treat both steady-state and time-dependent problems. The implementation of the main modules of DLAYZ is discussed in detail, including the atomic data, rates, population distributions and radiative properties modules. The complete set of basic atomic data is obtained using relativistic quantum mechanics. For dense plasmas, the basic atomic data with plasma screening effects can be obtained. The populations are obtained by solving the coupled rate equations, which are then used to calculate the radiative properties. A parallelized version is implemented in the code to treat large-scale rate equations. Two illustrative examples, a steady-state case for carbon plasmas and a time-dependent case for the relaxation of K-shell excited argon, are employed to show the main features of the present code.
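
    The rate-equation core of such a code can be illustrated with a toy two-level collisional-radiative balance integrated with scipy. The rate coefficients below are invented, and a real DLA calculation couples thousands of levels rather than two.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy two-level model (all rates in s^-1): collisional excitation
# C01, collisional de-excitation C10, spontaneous radiative decay A10.
C01, C10, A10 = 1.0e6, 4.0e5, 1.0e8

def rates(t, n):
    n0, n1 = n
    up = C01 * n0
    down = (C10 + A10) * n1
    return [down - up, up - down]

sol = solve_ivp(rates, (0.0, 1e-6), [1.0, 0.0], rtol=1e-8)
n0, n1 = sol.y[:, -1]
print(f"steady-state n1/n0 = {n1 / n0:.3e}")  # ~ C01 / (C10 + A10)
```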

  12. Safety Related Investigations of the VVER-1000 Reactor Type by the Coupled Code System TRACE/PARCS

    NASA Astrophysics Data System (ADS)

    Jaeger, Wadim; Espinoza, Victor Hugo Sánchez; Lischke, Wolfgang

    This study was performed at the Institute of Reactor Safety at the Forschungszentrum Karlsruhe. It is embedded in the ongoing investigations of the international code assessment and maintenance program (CAMP) for qualification and validation of system codes like TRACE(1) and PARCS(2). The reactor type chosen to validate these two codes was the Russian-designed VVER-1000, because the OECD/NEA VVER-1000 Coolant Transient Benchmark Phase 2(3) includes detailed information on the Bulgarian nuclear power plant (NPP) Kozloduy unit 6. The post-test investigations of a coolant mixing experiment have shown that the predicted parameters (coolant temperature, pressure drop, etc.) are in good agreement with the measured data. The coolant mixing pattern, especially in the downcomer, has also been reproduced quite well by TRACE. The coupled code system TRACE/PARCS, which was applied to a postulated main steam line break (MSLB), provided good results compared to reference values and those of other participants in the benchmark. The results show that the developed three-dimensional nodalization of the reactor pressure vessel (RPV) is appropriate to describe the coolant mixing phenomena in the downcomer and the lower plenum of a VVER-1000 reactor. This phenomenon is a key issue for investigations of MSLB transients where the thermal hydraulics and the core neutronics are strongly linked. The simulation of the RPV and core behavior for postulated transients using the validated 3D TRACE RPV model, taking into account boundary conditions at vessel in- and outlet, indicates that the results are physically sound and in good agreement with other participants' results.

  13. Flight investigation of cockpit-displayed traffic information utilizing coded symbology in an advanced operational environment

    NASA Technical Reports Server (NTRS)

    Abbott, T. S.; Moen, G. C.; Person, L. H., Jr.; Keyser, G. L., Jr.; Yenni, K. R.; Garren, J. F., Jr.

    1980-01-01

    Traffic symbology was encoded to provide additional information concerning the traffic, which was displayed on the pilot's electronic horizontal situation indicators (EHSI). A research airplane representing an advanced operational environment was used to assess the benefit of coded traffic symbology in a realistic work-load environment. Traffic scenarios, involving both conflict-free and conflict situations, were employed. Subjective pilot commentary was obtained through the use of a questionnaire and extensive pilot debriefings. These results grouped conveniently into two categories: display factors and task performance. A major item under the display factor category was the problem of display clutter. The primary contributors to clutter were the use of large map-scale factors, the use of traffic data blocks, and the presentation of more than a few airplanes. In terms of task performance, the cockpit-displayed traffic information was found to provide excellent overall situation awareness. Additionally, pilots were able to maintain the separation prescribed during these tests.

  14. Theoretical models and simulation codes to investigate bystander effects and cellular communication at low doses

    NASA Astrophysics Data System (ADS)

    Ballarini, F.; Alloni, D.; Facoetti, A.; Mairani, A.; Nano, R.; Ottolenghi, A.

    Astronauts in space are continuously exposed to low doses of ionizing radiation from Galactic Cosmic Rays. During the last ten years, the effects of low radiation doses have been widely re-discussed following a large number of observations on the so-called non-targeted effects, in particular bystander effects. The latter consist of induction of cytogenetic damage in cells not directly traversed by radiation, most likely as a response to molecular messengers released by directly irradiated cells. Bystander effects, which are observed both for lethal endpoints (e.g. clonogenic inactivation and apoptosis) and for non-lethal ones (e.g. mutations and neoplastic transformation), tend to show non-linear dose responses. This might have significant consequences in terms of low-dose risk, which is generally calculated on the basis of the Linear No Threshold hypothesis. Although the mechanisms underlying bystander effects are still largely unknown, it is now clear that two types of cellular communication, i.e. via gap junctions and/or release of molecular messengers into the extracellular environment, play a fundamental role. Theoretical models and simulation codes can be of help in elucidating such mechanisms. In the present paper we will review different available modelling approaches, including one that is being developed at the University of Pavia. The focus will be on the different assumptions adopted by the various authors and on the implications of such assumptions in terms of non-targeted radiobiological damage and, more generally, low-dose risk.

  15. Investigating protein-coding sequence evolution with probabilistic codon substitution models.

    PubMed

    Anisimova, Maria; Kosiol, Carolin

    2009-02-01

    This review is motivated by the true explosion in the number of recent studies both developing and ameliorating probabilistic models of codon evolution. Traditionally parametric, the first codon models focused on estimating the effects of selective pressure on the protein via an explicit parameter in the maximum likelihood framework. Likelihood ratio tests of nested codon models armed the biologists with powerful tools, which provided unambiguous evidence for positive selection in real data. This, in turn, triggered a new wave of methodological developments. The new generation of models views the codon evolution process in a more sophisticated way, relaxing several mathematical assumptions. These models make a greater use of physicochemical amino acid properties, genetic code machinery, and the large amounts of data from the public domain. The overview of the most recent advances on modeling codon evolution is presented here, and a wide range of their applications to real data is discussed. On the downside, availability of a large variety of models, each accounting for various biological factors, increases the margin for misinterpretation; the biological meaning of certain parameters may vary among models, and model selection procedures also deserve greater attention. Solid understanding of the modeling assumptions and their applicability is essential for successful statistical data analysis. PMID:18922761

  16. Investigating What Undergraduate Students Know About Science: Results from Complementary Strategies to Code Open-Ended Responses

    NASA Astrophysics Data System (ADS)

    Tijerino, K.; Buxner, S.; Impey, C.; CATS

    2013-04-01

    This paper presents new findings from an ongoing study of undergraduate student science literacy. Using data drawn from a 22 year project and over 11,000 student responses, we present how students' word usage in open-ended responses relates to what it means to study something scientifically. Analysis of students' responses shows that they easily use words commonly associated with science, such as hypothesis, study, method, test, and experiment; but do these responses use scientific words knowledgeably? As with many multifaceted disciplines, demonstration of comprehension varies. This paper presents three different ways that student responses have been coded to investigate their understanding of science: 1) differentiating the quality of a response with a coding scheme; 2) using word counting as an indicator of overall response strength; and 3) coding responses for the quality of the students' response. Building on previous research, comparison of science literacy and open-ended responses demonstrates that knowledge of science facts and vocabulary does not indicate a comprehension of the concepts behind those facts and vocabulary. This study employs quantitative and qualitative methods to systematically determine the frequency and meaning of responses to standardized questions, and illustrates how students are able to demonstrate a knowledge of vocabulary. However, this knowledge is not indicative of conceptual understanding and poses important questions about how we assess students' understandings of science.
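
    A minimal sketch of the second coding strategy, word counting: tally how often a response uses terms from a fixed science-vocabulary list. The term list and the sample response are stand-ins for the study's materials.

```python
import re
from collections import Counter

SCIENCE_TERMS = {"hypothesis", "study", "method", "test", "experiment"}

def science_word_profile(response: str) -> Counter:
    tokens = re.findall(r"[a-z]+", response.lower())
    return Counter(t for t in tokens if t in SCIENCE_TERMS)

answer = ("To study something scientifically you form a hypothesis "
          "and then test it with an experiment.")
print(science_word_profile(answer))  # each listed term counted once
```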

  17. Coexistence of two different pseudohypoparathyroidism subtypes (Ia and Ib) in the same kindred with independent Gsα coding mutations and GNAS imprinting defects

    PubMed Central

    Lecumberri, B; Fernández-Rebollo, E; Sentchordi, L; Saavedra, P; Bernal-Chico, A; Pallardo, L F; Jiménez Bustos, J M; Castaño, L; de Santiago, M; Hiort, O; Pérez de Nanclares, G; Bastepe, M

    2011-01-01

    Background Pseudohypoparathyroidism (PHP) defines a rare group of disorders whose common feature is resistance to the parathyroid hormone. Patients with PHP-Ia display additional hormone resistance, Albright hereditary osteodystrophy (AHO) and reduced Gsα activity in easily accessible cells. This form of PHP is associated with heterozygous inactivating mutations in Gsα-coding exons of GNAS, an imprinted gene locus on chromosome 20q13.3. Patients with PHP-Ib typically have isolated parathyroid hormone resistance, lack AHO features and demonstrate normal erythrocyte Gsα activity. Instead of coding Gsα mutations, patients with PHP-Ib display imprinting defects of GNAS, caused, at least in some cases, by genetic mutations within or nearby this gene. Patients Two unrelated PHP families, each of which includes at least one patient with a Gsα coding mutation and another with GNAS loss of imprinting, are reported here. Results One of the patients with GNAS imprinting defects has paternal uniparental isodisomy of chromosome 20q, explaining the observed imprinting abnormalities. The identified Gsα coding mutations include a tetranucleotide deletion in exon 7, which is frequently found in PHP-Ia, and a novel single nucleotide change at the acceptor splice junction of intron 11. Conclusions These molecular data reveal an interesting mixture, in the same family, of both genetic and epigenetic mutations of the same gene. PMID:19858129

  18. Utilization of a Photon Transport Code to Investigate Radiation Therapy Treatment Planning Quantities and Techniques.

    NASA Astrophysics Data System (ADS)

    Palta, Jatinder Raj

    A versatile computer program, MORSE, based on neutron and photon transport theory, has been utilized to investigate radiation therapy treatment planning quantities and techniques. A multi-energy group representation of the transport equation provides a concise approach to utilizing Monte Carlo numerical techniques for multiple radiation therapy treatment planning problems. A general three-dimensional geometry is used to simulate radiation therapy treatment planning problems in configurations of an actual clinical setting. Central axis total and scattered dose distributions for homogeneous and inhomogeneous water phantoms are calculated, and the correction factors for lung and bone inhomogeneities are also evaluated. Results show that Monte Carlo calculations based on multi-energy group transport theory predict depth dose distributions that are in good agreement with available experimental data. Improved correction factors based on the concepts of lung-air-ratio and bone-air-ratio are proposed in lieu of the presently used correction factors that are based on the tissue-air-ratio power law method for inhomogeneity corrections. Central axis depth dose distributions for a bremsstrahlung spectrum from a linear accelerator are also calculated to exhibit the versatility of the computer program in handling multiple radiation therapy problems. A novel approach is undertaken to study the dosimetric properties of brachytherapy sources. Dose rate constants for various radionuclides are calculated from the numerically generated dose rate versus source energy curves. Dose rates can also be generated for any point brachytherapy source with any arbitrary energy spectrum at various radial distances from this family of curves.

  19. SPARTAN: a simple performance assessment code for the Nevada Nuclear Waste Storage Investigations Project

    SciTech Connect

    Lin, Y.T.

    1985-12-01

    SPARTAN is a simple computer model designed for the Nevada Nuclear Waste Storage Investigations Project to calculate radionuclide transport in geologic media. The physical processes considered are limited to Darcy's flow, radionuclide decay, and convective transport with constant retardation of radionuclides relative to water flow. Inputs for the model must be provided for the geometry, repository area, flow path, water flux, effective porosity, initial inventory, waste solubility, canister lifetime, and retardation factors. Results from the model consist of radionuclide release rates from the prospective Yucca Mountain repository for radioactive waste and cumulative curies released across the flow boundaries at the end of the flow path. The rates of release from the repository relative to NRC performance objectives and releases to the accessible environment relative to EPA requirements are also calculated. Two test problems compare the results of simulations from SPARTAN with analytical solutions. The comparisons show that the SPARTAN solution closely matches the analytical solutions across a range of conditions that approximate those that might occur at Yucca Mountain.
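
    The transport bookkeeping the abstract describes reduces, for a single nuclide and no dispersion, to a retarded travel time plus first-order decay. The sketch below is a worked example under those assumptions; the path length, flux, porosity, retardation, and half-life are hypothetical inputs, not Yucca Mountain values.

```python
import math

def release_fraction(path_len_m, flux_m_per_yr, porosity,
                     retardation, half_life_yr):
    """Fraction of a nuclide surviving transport along a flow path,
    assuming plug flow with constant retardation and no dispersion."""
    pore_velocity = flux_m_per_yr / porosity        # Darcy flux -> seepage
    travel_time = retardation * path_len_m / pore_velocity
    decay_const = math.log(2) / half_life_yr
    return math.exp(-decay_const * travel_time)

# Hypothetical inputs: 100 m path, 0.5 mm/yr flux, 10% porosity,
# retardation factor 10, Tc-99 half-life 2.13e5 yr.
print(f"{release_fraction(100, 5e-4, 0.10, 10, 2.13e5):.3f}")  # ~0.52
```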

  20. Performance investigation of the pulse and Campbelling modes of a fission chamber using a Poisson pulse train simulation code

    NASA Astrophysics Data System (ADS)

    Elter, Zs.; Jammes, C.; Pázsit, I.; Pál, L.; Filliatre, P.

    2015-02-01

    The detectors of the neutron flux monitoring system of the foreseen French GEN-IV sodium-cooled fast reactor (SFR) will be high temperature fission chambers placed in the reactor vessel in the vicinity of the core. The operation of a fission chamber over a wide range of neutron flux will be feasible provided that the overlap of the applicability of its pulse and Campbelling operational modes is ensured. This paper addresses the question of the linearity of these two modes and it also presents our recent efforts to develop a specific code for the simulation of fission chamber pulse trains. Our developed simulation code is described and its overall verification is shown. An extensive quantitative investigation was performed to explore the applicability limits of these two standard modes. It was found that for short pulses the overlap between the pulse and Campbelling modes can be guaranteed if the standard deviation of the background noise is not higher than 5% of the pulse amplitude. It was also shown that the Campbelling mode is sensitive to parasitic noise, while the performance of the pulse mode is affected by the stochastic amplitude distributions.
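
    The behavior the two modes exploit can be demonstrated with a minimal Poisson pulse-train sketch: by Campbell's theorem, both the mean and the variance of the summed detector signal scale linearly with the event rate, and the Campbelling mode reads the rate from the variance. The pulse shape, rates, and time base below are assumptions of this sketch, not the developed code.

```python
import numpy as np

rng = np.random.default_rng(2)

def pulse_train(rate_hz, t_end=2e-4, dt=1e-8, amp=1.0, tau=5e-8):
    """Superpose exponential pulses at Poisson arrival times."""
    t = np.arange(0.0, t_end, dt)
    arrivals = rng.uniform(0.0, t_end, rng.poisson(rate_hz * t_end))
    signal = np.zeros_like(t)
    for t0 in arrivals:                 # edge effects ignored
        mask = t >= t0
        signal[mask] += amp * np.exp(-(t[mask] - t0) / tau)
    return signal

for rate in (1e6, 5e6):
    s = pulse_train(rate)
    # Campbell's theorem: mean = rate*amp*tau, variance = rate*amp^2*tau/2.
    print(f"rate {rate:.0e} Hz: mean {s.mean():.4f}, var {s.var():.4f}")
```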

  1. Investigation of Nuclear Data Libraries with TRIPOLI-4 Monte Carlo Code for Sodium-cooled Fast Reactors

    NASA Astrophysics Data System (ADS)

    Lee, Y.-K.; Brun, E.

    2014-04-01

    The Sodium-cooled fast neutron reactor ASTRID is currently under design and development in France. The traditional ECCO/ERANOS fast reactor code system used for ASTRID core design calculations relies on the multi-group JEFF-3.1.1 data library. To gauge the use of the ENDF/B-VII.0 and JEFF-3.1.1 nuclear data libraries in fast reactor applications, two recent OECD/NEA computational benchmarks specified by Argonne National Laboratory were calculated. Using the continuous-energy TRIPOLI-4 Monte Carlo transport code, both the ABR-1000 MWth MOX core and the metallic (U-Pu) core were investigated. Under two different fast neutron spectra and two data libraries, ENDF/B-VII.0 and JEFF-3.1.1, reactivity impact studies were performed. Using the JEFF-3.1.1 library under the BOEC (beginning of equilibrium cycle) condition, high reactivity effects of 808 ± 17 pcm and 1208 ± 17 pcm were observed for the ABR-1000 MOX core and the metallic core, respectively. To analyze the causes of these differences in reactivity, several TRIPOLI-4 runs using the mixed data libraries feature allowed us to identify the nuclides and the nuclear data accounting for the major part of the observed reactivity discrepancies.

  2. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.

  3. Independent assessment of TRAC-PF1 (Version 7. 0), RELAP5/MOD1 (Cycle 14), and TRAC-BD1 (Version 12. 0) codes using separate-effects experiments

    SciTech Connect

    Saha, P; Jo, J H; Neymotin, L; Rohatgi, U S; Slovik, G C; Yuelys-Miksis, C

    1985-08-01

    This report presents the results of independent code assessment conducted at BNL. The TRAC-PF1 (Version 7.0) and RELAP5/MOD1 (Cycle 14) codes were assessed using the critical flow tests, level swell test, countercurrent flow limitation (CCFL) tests, post-CHF test, steam generator thermal performance tests, and natural circulation tests. TRAC-BD1 (Version 12.0) was applied only to the CCFL and post-CHF tests. The TRAC-PWR series of codes, i.e., TRAC-P1A, TRAC-PD2, and TRAC-PF1, have been gradually improved. However, TRAC-PF1 appears to need improvement in almost all categories of tests/phenomena attempted to BNL. Of the two codes, TRAC-PF1 and RELAP5/MOD1, the latter needs more improvement particularly in the areas of: CCFL, Level swell, CHF correlation and post-CHF heat transfer, and Numerical stability. For the CCFL and post-CHF tests, TRAC-BD1 provides the best overall results. However, the TRAC-BD1 interfacial shear package for the countercurrent annular flow regime needs further improvement for better prediction of CCFL phenomenon. 47 refs., 87 figs., 15 tabs.

  4. Sequence heteroplasmy of D-loop and rRNA coding regions in mitochondrial DNA from Holstein cows of independent maternal lineages.

    PubMed

    Wu, J; Smith, R K; Freeman, A E; Beitz, D C; McDaniel, B T; Lindberg, G L

    2000-10-01

    A mitochondrial DNA (mtDNA) fragment containing the D-loop, phenylalanine tRNA, valine tRNA, and 12S and 16S rRNA genes was cloned and sequenced from 36 cows of 18 maternal lineages to identify the polymorphic sites within those regions and to detect the existence of heteroplasmic mtDNA in cows. Seventeen variable sites were observed within the D-loop and rRNA coding regions of bovine mtDNA within a 2.5-kb span. The hypervariable sites in the D-loop and rRNA coding regions were identified at nucleotide positions 169, 216, and 1594. Heteroplasmic mtDNA (variable mtDNA within a tissue) existed extensively in cows and was detected within the above regions in 11 of 36 cows sequenced. The insertion, deletion, and nucleotide transversion polymorphisms were found only in homopolymer regions. Heteroplasmy was observed frequently and seemingly is persistent in cattle. Though heteroplasmy was demonstrated, most lineages and mtDNA sites showed no heteroplasmy. PMID:11129526

  5. Industry and Occupation in the Electronic Health Record: An Investigation of the National Institute for Occupational Safety and Health Industry and Occupation Computerized Coding System

    PubMed Central

    2016-01-01

    Background Inclusion of information about a patient’s work, industry, and occupation in the electronic health record (EHR) could facilitate occupational health surveillance, better health outcomes, prevention activities, and identification of workers’ compensation cases. The US National Institute for Occupational Safety and Health (NIOSH) has developed an autocoding system for “industry” and “occupation” based on 1990 Bureau of Census codes; its effectiveness requires evaluation in conjunction with promoting the mandatory addition of these variables to the EHR. Objective The objective of the study was to evaluate the intercoder reliability of NIOSH’s Industry and Occupation Computerized Coding System (NIOCCS) when applied to data collected in a community survey conducted under the Affordable Care Act, and to determine the proportion of records that are autocoded using NIOCCS. Methods Standard Occupational Classification (SOC) codes are used by several federal agencies in databases that capture demographic, employment, and health information to harmonize variables related to work activities among these data sources. A total of 359 industry and occupation responses were hand coded by 2 investigators, who came to a consensus on every code. The same variables were autocoded using NIOCCS at the high and moderate confidence levels. Results Kappa was .84 both for agreement between the hand coders and for the hand-coder consensus codes versus the NIOCCS high-confidence codes on the first 2 digits of the SOC code. For 4 digits, NIOCCS coding versus investigator coding ranged from kappa=.56 to .70. In this study, NIOCCS was able to autocode (ie, achieve production rates of) 31%-36% of entered variables at the “high confidence” level and 49%-58% at the “medium confidence” level. These autocoding (production) rates are somewhat lower than those reported by NIOSH. Agreement between manually coded and autocoded data is “substantial” at the 2-digit level, but only “fair” to “good” at the 4-digit level. Conclusions This work serves as a baseline for performance of NIOCCS by investigators in the field. Further field testing will clarify NIOCCS effectiveness in terms of ability to assign codes and coding accuracy, and will clarify its value as inclusion of these occupational variables in the EHR is promoted. PMID:26878932
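
    For reference, the kappa statistic used above corrects the observed inter-coder agreement p_o for the agreement p_e expected by chance (the intermediate numbers in the example below are hypothetical, added only to illustrate the arithmetic):

        \kappa \;=\; \frac{p_o - p_e}{1 - p_e}

    For instance, an observed agreement of p_o = 0.90 against a chance agreement of p_e = 0.375 yields kappa = (0.90 - 0.375)/(1 - 0.375) = .84, the 2-digit figure reported above.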

  6. Chromatographic separation and multicollection-ICPMS analysis of iron. Investigating mass-dependent and -independent isotope effects.

    PubMed

    Dauphas, Nicolas; Janney, Philip E; Mendybaev, Ruslan A; Wadhwa, Meenakshi; Richter, Frank M; Davis, Andrew M; van Zuilen, Mark; Hines, Rebekah; Foley, C Nicole

    2004-10-01

    A procedure was developed that allows precise determination of Fe isotopic composition. Purification of Fe was achieved by ion chromatography on AG1-X8 strongly basic anion-exchange resin. No isotopic fractionation is associated with column chemistry within 0.02 per thousand /amu at 2sigma. The isotopic composition was measured with a Micromass IsoProbe multicollection inductively coupled plasma hexapole mass spectrometer. The Fe isotopic composition of the Orgueil CI1 carbonaceous chondrite, which best approximates the solar composition, is indistinguishable from that of IRMM-014 (-0.005 +/- 0.017 per thousand /amu). The IRMM-014 reference material is therefore used for normalization of the isotopic ratios. The protocol for analyzing mass-dependent variations is validated by measuring geostandards (IF-G, DTS-2, BCR-2, AGV-2) and heavily fractionated Fe left after vacuum evaporation of molten wüstite (FeO) and solar (MgO-Al(2)O(3)-SiO(2)-CaO-FeO in chondritic proportions) compositions. It is shown that the isotopic composition of Fe during evaporation of FeO follows a Rayleigh distillation with a fractionation factor alpha equal to (m(1)/m(2))(1/2), where m(1) and m(2) are the masses of the considered isotopes. This agrees with earlier measurements and theoretical expectations. The isotopic composition of Fe left after vacuum evaporation of solar composition also follows a Rayleigh distillation but with a fractionation factor (1.013 22 +/- 0.000 67 for the (56)Fe/(54)Fe ratio) that is lower than the square root of the masses (1.018 35). The protocol for analyzing mass-independent variations is validated by measuring terrestrial rocks that are not expected to show departure from mass-dependent fractionation. After internal normalization of the (57)Fe/(54)Fe ratio, the isotopic composition of Fe can be measured accurately with precisions of 0.2epsilon and 0.5epsilon at 2sigma for (56)Fe/(54)Fe and (58)Fe/(54)Fe ratios, respectively (epsilon refers to relative variations in parts per 10 000). For (58)Fe, this precision is an order of magnitude better than what had been achieved before. The method is applied to rocks that could potentially exhibit mass-independent effects, meteorites and Archaean terrestrial samples. The isotopic composition of a 3.8-Ga-old banded iron formation from Isua (IF-G, Greenland), and quartz-pyroxene rocks from Akilia and Innersuartuut (GR91-26 and SM/GR/171770, Greenland) are normal within uncertainties. Similarly, the Orgueil (CI1), Allende (CV3.2), Eagle Station (ESPAL), Brenham (MGPAL), and Old Woman (IIAB) meteorites do not show any mass-independent effect. PMID:15456307
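
    To spell out the Rayleigh distillation referred to above (a sketch under one common convention; the abstract itself does not give the algebra): if f is the fraction of melt remaining and R its heavy-to-light isotope ratio, then

        \frac{R}{R_0} \;=\; f^{\,\frac{1}{\alpha}-1},

    where alpha is the fractionation factor, the ratio of the evaporation rate coefficients of the light and heavy isotopes. For purely kinetic evaporation alpha = (m(2)/m(1))(1/2), which is (56/54)(1/2) = 1.018 35 for the (56)Fe/(54)Fe ratio; because 1/alpha - 1 < 0, the residue is progressively enriched in the heavy isotope as f decreases. The measured alpha of 1.013 22 for the solar composition therefore implies weaker enrichment than the square-root-of-masses law predicts.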

  7. Experimental investigation of a 10-percent-thick helicopter rotor airfoil section designed with a viscous transonic analysis code

    NASA Technical Reports Server (NTRS)

    Noonan, K. W.

    1981-01-01

    An investigation was conducted in the Langley 6- by 28-Inch Transonic Tunnel to determine the two-dimensional aerodynamic characteristics of a 10-percent-thick helicopter rotor airfoil at Mach numbers from 0.33 to 0.87 and respective Reynolds numbers from 4.9 x 10 to the 6th to 9.8 x 10 to the 6th. This airfoil, designated the RC-10(N)-1, was also investigated at Reynolds numbers from 3.0 x 10 to the 6th to 7.3 x 10 to the 6th at respective Mach numbers of 0.33 to 0.83 for comparison with the SC 1095 (with tab) airfoil. The RC-10(N)-1 airfoil was designed by the use of a viscous transonic analysis code. The results of the investigation indicate that the RC-10(N)-1 airfoil met all the design goals. At a Reynolds number of about 9.4 x 10 to the 6th the drag divergence Mach number at zero normal-force coefficient was 0.815 with a corresponding pitching-moment coefficient of zero. The drag divergence Mach number at a normal-force coefficient of 0.9 and a Reynolds number of about 8.0 x 10 to the 6th was 0.61. The drag divergence Mach number of this new airfoil was higher than that of the SC 1095 airfoil at normal-force coefficients above 0.3. Measurements in the same wind tunnel at comparable Reynolds numbers indicated that the maximum normal-force coefficient of the RC-10(N)-1 airfoil was higher than that of the NACA 0012 airfoil for Mach numbers above about 0.35 and was about the same as that of the SC 1095 airfoil for Mach numbers up to 0.5.

  8. "Sample-Independent" Item Parameters? An Investigation of the Stability of IRT Item Parameters Estimated from Small Data Sets.

    ERIC Educational Resources Information Center

    Sireci, Stephen G.

    Whether item response theory (IRT) is useful to the small-scale testing practitioner is examined. The stability of IRT item parameters is evaluated with respect to the classical item parameters (i.e., p-values, biserials) obtained from the same data set. Previous research investigating the effect of sample size on IRT parameter estimation has…

  9. Independent Technical Investigation of the Puna Geothermal Venture Unplanned Steam Release, June 12 and 13, 1991, Puna, Hawaii

    SciTech Connect

    Thomas, Richard; Whiting, Dick; Moore, James; Milner, Duey

    1991-07-01

    On June 24, 1991, a third-party investigation team consisting of Richard P. Thomas, Duey E. Milner, James L. Moore, and Dick Whiting began an investigation into the blowout of well KS-8, which occurred at the Puna Geothermal Venture (PGV) site on June 12, 1991, and caused the unabated release of steam for a period of 31 hours before PGV succeeded in closing in the well. The scope of the investigation was to: (a) determine the cause(s) of the incident; (b) evaluate the adequacy of PGV's drilling and blowout prevention equipment and procedures; and (c) make recommendations for any appropriate changes in equipment and/or procedures. This report finds that the blowout occurred because of inadequacies in PGV's drilling plan and procedures and not as a result of unusual or unmanageable subsurface geologic or hydrologic conditions. While the geothermal resource in the area being drilled is relatively hot, the temperatures are not excessive for modern technology and methods to control. Fluid pressures encountered are also manageable if proper procedures are followed and the appropriate equipment is utilized. A previous blowout of short duration occurred on February 21, 1991, at the KS-7 injection well being drilled by PGV at a depth of approximately 1600'. This unexpected incident alerted PGV to the possibility of encountering a high temperature, fractured zone at a relatively shallow depth. The experience at KS-7 prompted PGV to refine its hydrological model; however, the drilling plan utilized for KS-8 was not changed. Not only did PGV fail to modify its drilling program following the KS-7 blowout, but they also failed to heed numerous "red flags" (warning signals) in the five days preceding the KS-8 blowout, which included a continuous 1-inch flow of drilling mud out of the wellbore, gains in mud volume while pulling stands, and gas entries while circulating mud bottoms up, in addition to lost circulation that had occurred earlier below the shoe of the 13-3/8-inch casing.

  10. Independence Is.

    ERIC Educational Resources Information Center

    Stickney, Sharon

    This workbook is designed to help participants of the Independence Training Program (ITP) to achieve a definition of "independence." The program was developed for teenage girls. The process for developing the concept of independence consists of four steps. Step one instructs the participant to create an imaginary situation where she is completely…

  11. Evaluation of the rodent Hershberger bioassay on intact juvenile males--testing of coded chemicals and supplementary biochemical investigations.

    PubMed

    Freyberger, A; Schladt, L

    2009-08-01

    Under the auspices of the Organization for Economic Cooperation and Development (OECD), the Hershberger assay on juvenile intact male rats is being validated as a screen for compounds with anti-androgenic potential. We participated in the testing of coded chemicals. Compounds included the positive control flutamide (FLUT, 3 mg/kg), linuron (LIN, 10, 100 mg/kg), p,p'-DDE (16, 160 mg/kg), and two negative substances, 4-nonylphenol (NP, 160 mg/kg) and 2,4-dinitrophenol (DNP, 10 mg/kg). Compounds were administered for 10 consecutive days by gavage to testosterone propionate (TP, 1 mg/kg s.c.)-supplemented rats. Unblinding the codes revealed the following results: compared to vehicle controls, treatment with TP resulted in increased androgen-sensitive tissue (AST) weights of ventral prostate (VP), seminal vesicles (SV), levator ani and bulbocavernosus muscles (LABC), Cowper's glands, and epididymides, and in decreased testes weight. When assessing anti-androgenic potential in TP-supplemented rats, FLUT decreased all AST weights and increased testes weight. p,p'-DDE at the high dose decreased final body weight and all AST weights, whereas the low dose only affected SV weight. LIN slightly decreased final body weight and decreased absolute SV and LABC and relative SV weights only at the high dose. NP decreased final body weight and only absolute SV weights, whereas DNP was ineffective. Investigations not requested by OECD included measurement of liver enzymes and revealed strong induction of testosterone-metabolizing and phase II conjugating enzymes by p,p'-DDE. Our findings suggest that in principle the juvenile intact male rat can be used in the Hershberger assay to screen for anti-androgenic potential, thereby contributing to a refinement of the assay in terms of animal welfare. However, in our hands this animal model was somewhat less sensitive than the peripubertal castrated rat. Final conclusions, however, can only be drawn on the basis of all available validation data. Results obtained with the negative reference compound NP suggest that a treatment-related decrement in body weights may affect AST weights and represent a confounding factor when screening for anti-androgenic properties. Finally, p,p'-DDE may affect AST weights by several mechanisms including enhanced testosterone metabolism. PMID:19467291

  12. Inter-Sentential Patterns of Code-Switching: A Gender-Based Investigation of Male and Female EFL Teachers

    ERIC Educational Resources Information Center

    Gulzar, Malik Ajmal; Farooq, Muhammad Umar; Umer, Muhammad

    2013-01-01

    This article has sought to contribute to discussions concerning the value of inter-sentential patterns of code-switching (henceforth ISPCS) particularly in the context of EFL classrooms. Through a detailed analysis of recorded data produced in that context, distinctive features in the discourse were discerned which were associated with males' and…

  13. Are Independent Probes Truly Independent?

    ERIC Educational Resources Information Center

    Camp, Gino; Pecher, Diane; Schmidt, Henk G.; Zeelenberg, Rene

    2009-01-01

    The independent cue technique has been developed to test traditional interference theories against inhibition theories of forgetting. In the present study, the authors tested the critical criterion for the independence of independent cues: Studied cues not presented during test (and unrelated to test cues) should not contribute to the retrieval…

  15. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings, and since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage that the speech signal was corrupted by noise, cross-talk, and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. Digital transmission, on the other hand, is relatively immune to noise, cross-talk, and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence the end-to-end performance of the digital link essentially becomes independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service-provision point of view as well. Modern requirements have introduced the need for robust, flexible, and secure services that can carry a multitude of signal types (such as voice, data, and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where the speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding; it is more generic in the sense that the coding techniques are equally applicable to any voice signal, whether or not it carries any intelligible information, as the term speech implies. Other commonly used terms are speech compression and voice compression, since the fundamental idea behind speech coding is to reduce (compress) the transmission rate (or, equivalently, the bandwidth) and/or reduce storage requirements. In this document the terms speech and voice are used interchangeably.
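
    As a concrete illustration of the waveform-coding family mentioned above, the sketch below implements mu-law companding followed by 8-bit quantization, in the spirit of ITU-T G.711. It is a minimal example added here for clarity, not a coder described in this document.

        import numpy as np

        MU = 255.0  # companding constant used by the G.711 mu-law standard

        def mu_law_encode(x, mu=MU):
            # Compress a signal with samples in [-1, 1] by log companding.
            return np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)

        def mu_law_decode(y, mu=MU):
            # Invert the companding to recover an approximation of x.
            return np.sign(y) * np.expm1(np.abs(y) * np.log1p(mu)) / mu

        # Toy waveform coder: a 440 Hz tone sampled at 8 kHz, coded at 8 bits/sample.
        x = np.sin(2 * np.pi * 440 * np.arange(0, 0.01, 1 / 8000))
        y = mu_law_encode(x)
        q = np.round((y + 1) / 2 * 255).astype(np.uint8)        # 8-bit code words
        x_hat = mu_law_decode(q.astype(float) / 255 * 2 - 1)    # reconstruction

    Companding spends the 256 quantization levels logarithmically, which matches the amplitude statistics of speech better than uniform quantization at the same bit rate.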

  16. Binary primitive alternant codes

    NASA Technical Reports Server (NTRS)

    Helgert, H. J.

    1975-01-01

    In this note we investigate the properties of two classes of binary primitive alternant codes that are generalizations of the primitive BCH codes. For these codes we establish certain equivalence and invariance relations and obtain values of d and d*, the minimum distances of the prime and dual codes.
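
    To make the quantity d concrete, the brute-force sketch below computes the minimum distance of a small binary linear code. It is an illustration added here, not code from the note; the [7,4] Hamming code is used as the example because, as a BCH code, it belongs to the class that alternant codes generalize.

        from itertools import product
        import numpy as np

        def min_distance(G):
            # Minimum distance of a binary linear code = minimum nonzero codeword weight.
            k, n = G.shape
            best = n
            for msg in product([0, 1], repeat=k):
                weight = int(np.mod(np.array(msg) @ G, 2).sum())
                if 0 < weight < best:
                    best = weight
            return best

        # Generator matrix of the [7,4] Hamming code in systematic form.
        G = np.array([[1, 0, 0, 0, 1, 1, 0],
                      [0, 1, 0, 0, 0, 1, 1],
                      [0, 0, 1, 0, 1, 1, 1],
                      [0, 0, 0, 1, 1, 0, 1]])
        print(min_distance(G))  # -> 3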

  17. Investigating mitochondrial metabolism in contracting HL-1 cardiomyocytes following hypoxia and pharmacological HIF activation identifies HIF-dependent and independent mechanisms of regulation.

    PubMed

    Ambrose, Lucy J A; Abd-Jamil, Amira H; Gomes, Renata S M; Carter, Emma E; Carr, Carolyn A; Clarke, Kieran; Heather, Lisa C

    2014-11-01

    Hypoxia is a consequence of cardiac disease and downregulates mitochondrial metabolism, yet the molecular mechanisms through which this occurs in the heart are incompletely characterized. Therefore, we aimed to use a contracting HL-1 cardiomyocyte model to investigate the effects of hypoxia on mitochondrial metabolism. Cells were exposed to hypoxia (2% O2) for 6, 12, 24, and 48 hours to characterize the metabolic response. Cells were subsequently treated with the hypoxia inducible factor (HIF)-activating compound, dimethyloxalylglycine (DMOG), to determine whether hypoxia-induced mitochondrial changes were HIF dependent or independent, and to assess the suitability of this cultured cardiac cell line for cardiovascular pharmacological studies. Hypoxic cells had increased glycolysis after 24 hours, with glucose transporter 1 and lactate levels increased 5-fold and 15-fold, respectively. After 24 hours of hypoxia, mitochondrial networks were more fragmented but there was no change in citrate synthase activity, indicating that mitochondrial content was unchanged. Cellular oxygen consumption was 30% lower, accompanied by decreases in the enzymatic activities of electron transport chain (ETC) complexes I and IV, and aconitase by 81%, 96%, and 72%, relative to controls. Pharmacological HIF activation with DMOG decreased cellular oxygen consumption by 43%, coincident with decreases in the activities of aconitase and complex I by 26% and 30%, indicating that these adaptations were HIF mediated. In contrast, the hypoxia-mediated decrease in complex IV activity was not replicated by DMOG treatment, suggesting HIF-independent regulation of this complex. In conclusion, 24 hours of hypoxia increased anaerobic glycolysis and decreased mitochondrial respiration, which was associated with changes in ETC and tricarboxylic acid cycle enzyme activities in contracting HL-1 cells. Pharmacological HIF activation in this cardiac cell line allowed both HIF-dependent and independent mitochondrial metabolic changes to be identified. PMID:24607765

  18. An investigation for population maintenance mechanism in a miniature garden: genetic connectivity or independence of small islet populations of the Ryukyu five-lined skink.

    PubMed

    Kurita, Kazuki; Hikida, Tsutomu; Toda, Mamoru

    2014-01-01

    The Ryukyu five-lined skink (Plestiodon marginatus) is an island lizard that is even found in tiny islets with less than half a hectare of habitat area. We hypothesized that the island populations are maintained either under frequent gene flow among the islands or independently of each other. To test our hypotheses, we investigated the genetic structure of 21 populations from 11 land-bridge islands that were connected during the latest glacial age, and 4 isolated islands. Analyses using mitochondrial cytochrome b gene sequence (n = 67) and 10 microsatellite loci (n = 235) revealed moderate to high levels of genetic differentiation, the existence of many private alleles/haplotypes in most islands, little contemporary migration, a positive correlation between genetic variability and island area, and a negative correlation between relatedness and island area. This evidence suggests a strong effect of independent genetic drift as opposed to gene flow, favoring the isolation hypothesis even in tiny islet populations. An isolation-by-distance effect was demonstrated, and it became more prominent when the 4 isolated islands were excluded, suggesting that the pattern is a remnant of the land-bridge age. In a few island populations, however, the possibility of occasional overwater dispersals was partially supported and therefore could not be ruled out. PMID:25189776

  19. Polyphasic study of the spatial distribution of microorganisms in Mexican pozol, a fermented maize dough, demonstrates the need for cultivation-independent methods to investigate traditional fermentations.

    PubMed

    Ampe, F; ben Omar, N; Moizan, C; Wacher, C; Guyot, J P

    1999-12-01

    The distribution of microorganisms in pozol balls, a fermented maize dough, was investigated by a polyphasic approach in which we used both culture-dependent and culture-independent methods, including microbial enumeration, fermentation product analysis, quantification of microbial taxa with 16S rRNA-targeted oligonucleotide probes, determination of microbial fingerprints by denaturing gradient gel electrophoresis (DGGE), and 16S ribosomal DNA gene sequencing. Our results demonstrate that DGGE fingerprinting and rRNA quantification should allow workers to precisely and rapidly characterize the microbial assemblage in a spontaneous lactic acid fermented food. Lactic acid bacteria (LAB) accounted for 90 to 97% of the total active microflora; no streptococci were isolated, although members of the genus Streptococcus accounted for 25 to 50% of the microflora. Lactobacillus plantarum and Lactobacillus fermentum, together with members of the genera Leuconostoc and Weissella, were the other dominant organisms. The overall activity was greater at the periphery of a ball, where eucaryotes, enterobacteria, and bacterial exopolysaccharide producers developed. Our results also showed that the metabolism of heterofermentative LAB was influenced in situ by the distribution of the LAB in the pozol ball, whereas homolactic fermentation was controlled primarily by sugar limitation. We propose that starch is first degraded by amylases from LAB and that the resulting sugars, together with the lactate produced, allow a secondary flora to develop in the presence of oxygen. Our results strongly suggest that cultivation-independent methods should be used to study traditional fermented foods. PMID:10584005

  1. Microbial diversity and dynamics throughout manufacturing and ripening of surface ripened semi-hard Danish Danbo cheeses investigated by culture-independent techniques.

    PubMed

    Ryssel, Mia; Johansen, Pernille; Al-Soud, Waleed Abu; Sørensen, Søren; Arneborg, Nils; Jespersen, Lene

    2015-12-23

    Microbial successions on the surface and in the interior of surface ripened semi-hard Danish Danbo cheeses were investigated by culture-dependent and -independent techniques. Culture-independent detection of microorganisms was obtained by denaturing gradient gel electrophoresis (DGGE) and pyrosequencing, using amplicons of 16S and 26S rRNA genes for prokaryotes and eukaryotes, respectively. With minor exceptions, the results from the culture-independent analyses correlated to the culture-dependent plating results. Even though the predominant microorganisms detected with the two culture-independent techniques correlated, a higher number of genera were detected by pyrosequencing compared to DGGE. Additionally, minor parts of the microbiota, i.e. comprising <10.0% of the operational taxonomic units (OTUs), were detected by pyrosequencing, resulting in more detailed information on the microbial succession. As expected, microbial profiles of the surface and the interior of the cheeses diverged. During cheese production pyrosequencing determined Lactococcus as the dominating genus on cheese surfaces, representing on average 94.7%±2.1% of the OTUs. At day 6 Lactococcus spp. declined to 10.0% of the OTUs, whereas Staphylococcus spp. went from 0.0% during cheese production to 75.5% of the OTUs at smearing. During ripening, i.e. from 4 to 18 weeks, Corynebacterium was the dominant genus on the cheese surface (55.1%±9.8% of the OTUs), with Staphylococcus (17.9%±11.2% of the OTUs) and Brevibacterium (10.4%±8.3% of the OTUs) being the second and third most abundant genera. Other detected bacterial genera included Clostridiisalibacter (5.0%±4.0% of the OTUs), as well as Pseudoclavibacter, Alkalibacterium and Marinilactibacillus, which represented <2% of the OTUs. At smearing, yeast counts were low with Debaryomyces being the dominant genus accounting for 46.5% of the OTUs. During ripening the yeast counts increased significantly with Debaryomyces being the predominant genus, on average accounting for 96.7%±4.1% of the OTUs. The interior of the cheeses was dominated by Lactococcus spp. comprising on average 93.9%±7.8% of the OTUs throughout the cheese processing. The microbial dynamics described at genus level in this study add to a comprehensive understanding of the complex microbiota existing especially on surface ripened semi-hard cheeses. PMID:26432602

  2. Parallelization of the SIR code for the investigation of small-scale features in the solar photosphere

    NASA Astrophysics Data System (ADS)

    Thonhofer, Stefan; Rubio, Luis R. Bellot; Utz, Dominik; Hanslmeier, Arnold; Jurçák, Jan

    2015-10-01

    Magnetic fields are one of the most important drivers of the highly dynamic processes that occur in the lower solar atmosphere. They span a broad range of sizes, from large- and intermediate-scale structures such as sunspots, pores and magnetic knots, down to the smallest magnetic elements observable with current telescopes. On small scales, magnetic flux tubes are often visible as Magnetic Bright Points (MBPs). Apart from simple V/I magnetograms, the most common method to deduce their magnetic properties is the inversion of spectropolarimetric data. Here we employ the SIR code for that purpose. SIR is a well-established tool that can derive not only the magnetic field vector and other atmospheric parameters (e.g., temperature, line-of-sight velocity), but also their stratifications with height, effectively producing 3-dimensional models of the lower solar atmosphere. In order to enhance the runtime performance and the usability of SIR we parallelized the existing code and standardized the input and output formats. This and other improvements make it feasible to invert extensive high-resolution data sets within a reasonable amount of computing time. An evaluation of the speedup of the parallel SIR code shows a substantial improvement in runtime.
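
    The abstract does not detail the parallelization scheme, but pixel-wise inversions of a spectropolarimetric map are independent of one another, so the natural decomposition distributes pixels across workers. The Python sketch below illustrates that pattern with a dummy stand-in for the actual (Fortran) single-pixel SIR inversion; all names and array shapes are hypothetical.

        import numpy as np
        from multiprocessing import Pool

        def invert_pixel(profiles):
            # Hypothetical stand-in: a real implementation would run the SIR code
            # on one pixel's Stokes profiles and return the fitted stratification.
            return float(profiles.mean())  # dummy result so the sketch runs

        if __name__ == "__main__":
            ny, nx = 64, 64                            # hypothetical field of view
            data = np.random.rand(ny * nx, 4, 100)     # (pixel, Stokes, wavelength)
            with Pool() as pool:                       # one worker per CPU core
                results = pool.map(invert_pixel, list(data))
            model_map = np.array(results).reshape(ny, nx)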

  3. Supporting the Cybercrime Investigation Process: Effective Discrimination of Source Code Authors Based on Byte-Level Information

    NASA Astrophysics Data System (ADS)

    Frantzeskou, Georgia; Stamatatos, Efstathios; Gritzalis, Stefanos

    Source code authorship analysis is the particular field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity. This is usually based on other undisputed program samples from the same author. There are several cases where the application of such a method could be of a major benefit, such as tracing the source of code left in the system after a cyber attack, authorship disputes, proof of authorship in court, etc. In this paper, we present our approach which is based on byte-level n-gram profiles and is an extension of a method that has been successfully applied to natural language text authorship attribution. We propose a simplified profile and a new similarity measure which is less complicated than the algorithm followed in text authorship attribution and seems more suitable for source code identification, since it is better able to deal with very small training sets. Experiments were performed on two different data sets, one with programs written in C++ and the second with programs written in Java. Unlike the traditional language-dependent metrics used by previous studies, our approach can be applied to any programming language with no additional cost. The presented accuracy rates are much better than the best reported results for the same data sets.
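
    A minimal sketch of the profile-based attribution described above (the parameter values n and L are illustrative assumptions, not the paper's tuned settings): each author is represented by the set of the L most frequent byte-level n-grams in his or her known code, and a disputed program is assigned to the author whose profile overlaps its own the most.

        from collections import Counter

        def profile(source_bytes, n=6, L=2000):
            # Simplified profile: the L most frequent byte-level n-grams.
            grams = Counter(source_bytes[i:i + n]
                            for i in range(len(source_bytes) - n + 1))
            return {g for g, _ in grams.most_common(L)}

        def attribute(disputed, candidates, n=6, L=2000):
            # candidates: dict mapping author name -> bytes of known programs.
            # Similarity is the size of the profile intersection.
            dp = profile(disputed, n, L)
            return max(candidates,
                       key=lambda a: len(dp & profile(candidates[a], n, L)))

    Because the profiles are built from raw bytes rather than language tokens, the same function applies unchanged to C++, Java, or any other language, which is the language independence claimed above.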

  4. Environmental health and safety independent investigation of the in situ vitrification melt expulsion at the Oak Ridge National Laboratory, Oak Ridge, Tennessee

    SciTech Connect

    1996-07-01

    At about 6:12 p.m. EDT on April 21, 1996, steam and molten material were expelled from the Pit 1 in situ vitrification (ISV) project at the Oak Ridge National Laboratory (ORNL). At the request of the director of the Environmental Restoration (ER) Division, Department of Energy Oak Ridge Operations (DOE ORO), an independent investigation team was established on April 26, 1996. This team was tasked to determine the facts related to the ORNL Pit 1 melt expulsion event (MEE) in the areas of environmental safety and health, such as the adequacy of the ISV safety systems, operational control restrictions, emergency response planning/execution, and readiness review, and to report the investigation team's findings within 45 days of the date of the incident. These requirements were stated in the letter of appointment presented in Appendix A of this report. This investigation did not address the physical causes of the MEE. A separate investigation was conducted by ISV project personnel to determine the causes of the melt expulsion and the extent of the effects of this phenomenon. In response to this event, occurrence report ORO-LMES-X10ENVRES-1996-0006 (Appendix B) was filed. The investigation team did not address the occurrence reporting or event notification process. The project personnel (project team) examined the physical evidence at the Pit 1 ISV site (e.g., the ejected melt material and the ISV hood), reviewed documents such as the site-specific health and safety plan (HASP), and interviewed personnel involved in the event and/or the project. A listing of the personnel interviewed and evidence reviewed is provided in Appendix C.

  5. Order Preserving Sparse Coding.

    PubMed

    Ni, Bingbing; Moulin, Pierre; Yan, Shuicheng

    2015-08-01

    In this paper, we investigate order-preserving sparse coding for classifying structured data whose atomic features possess ordering relationships. Examples include time sequences where individual frame-wise features are temporally ordered, as well as still images (landscape, street view, etc.) where different regions of the image are spatially ordered. Classification of these structured data is often tackled by first decomposing the input data into individual atomic features, then performing sparse coding or other processing for each atomic feature vector independently, and finally aggregating individual responses to classify the input data. However, this heuristic approach ignores the underlying order of the individual atomic features within the input data, and results in suboptimal discriminative capability. In this work, we introduce an order-preserving regularizer which aims to preserve the ordering structure of the reconstruction coefficients within the sparse coding framework. An efficient Nesterov-type smooth approximation method is developed for optimization of the new regularization criterion, with a theoretically guaranteed error bound. We perform extensive experiments for time series classification on a synthetic dataset, several machine learning benchmarks, and an RGB-D human activity dataset. We also report experiments for scene classification on a benchmark image dataset. The encoded representation is discriminative and robust, and our classifier outperforms state-of-the-art methods on these tasks. PMID:26352999
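
    Schematically, the approach augments the usual sparse coding objective with an order-preserving penalty (a sketch added for orientation; the abstract does not give the regularizer's closed form, so Omega below is a placeholder):

        \min_{D,\{\alpha_i\}}\; \sum_{i=1}^{T}\Big(\|x_i - D\alpha_i\|_2^2 + \lambda\|\alpha_i\|_1\Big) \;+\; \gamma\,\Omega(\alpha_1,\dots,\alpha_T),

    where x_1, ..., x_T are the ordered atomic features, D is the dictionary, and Omega penalizes coefficient sequences that violate the known temporal or spatial ordering. The Nesterov-type smooth approximation mentioned above is what makes this non-smooth regularized problem tractable.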

  6. Unfolding the color code

    NASA Astrophysics Data System (ADS)

    Kubica, Aleksander; Yoshida, Beni; Pastawski, Fernando

    2015-08-01

    The topological color code and the toric code are two leading candidates for realizing fault-tolerant quantum computation. Here we show that the color code on a d-dimensional closed manifold is equivalent to multiple decoupled copies of the d-dimensional toric code up to local unitary transformations and adding or removing ancilla qubits. Our result not only generalizes the proven equivalence for d = 2, but also provides an explicit recipe of how to decouple independent components of the color code, highlighting the importance of colorability in the construction of the code. Moreover, for the d-dimensional color code with d+1 boundaries of d+1 distinct colors, we find that the code is equivalent to multiple copies of the d-dimensional toric code which are attached along a (d-1)-dimensional boundary. In particular, for d = 2, we show that the (triangular) color code with boundaries is equivalent to the (folded) toric code with boundaries. We also find that the d-dimensional toric code admits logical non-Pauli gates from the dth level of the Clifford hierarchy, and thus saturates the bound by Bravyi and König. In particular, we show that the logical d-qubit control-Z gate can be fault-tolerantly implemented on the stack of d copies of the toric code by a local unitary transformation.

  7. Evaluation of EPICOR-II Resin/Liner lysimeter investigation data using "MIXBATH," a one-dimensional transport code

    SciTech Connect

    McConnell, J.W.; Rogers, R.D.; Brey, R.R.; Sullivan, T.M.

    1992-08-01

    The computer code MIXBATH has been applied to compare model predictions with six years of leachate collection data from five lysimeters located at Oak Ridge and five located at Argonne National Laboratories. The goal of this study was to critique the applicability of these data for use as a basis for the long-term prediction of release and transport of radionuclides contained in Portland type I-II cement and Dow vinyl ester-styrene waste forms loaded with EPICOR-II prefilter ion exchange resins. MIXBATH was useful in providing insight into information needs for long-term performance assessment. In most cases, the total activity released from the lysimeters over the test period was indistinguishable from background, indicating a need for longer-term data collection. In cases where there was both sufficient information available and activity released, MIXBATH was able to predict releases within an order of magnitude of those measured. Releases are extremely sensitive to the soil partition coefficient and waste form diffusion coefficient, and these were identified as the key data needs for long-term performance assessment.
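
    The strong sensitivity to the soil partition coefficient is easy to rationalize: under linear-equilibrium sorption, the standard treatment in one-dimensional transport modeling (whether MIXBATH uses exactly this form is not stated here), K_d enters through the retardation factor

        R \;=\; 1 + \frac{\rho_b}{\theta}\,K_d, \qquad v_{\mathrm{nuclide}} \;=\; \frac{v_{\mathrm{water}}}{R},

    where \rho_b is the soil bulk density and \theta the volumetric water content, so an order-of-magnitude uncertainty in K_d maps directly onto an order-of-magnitude uncertainty in radionuclide arrival times.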

  9. Stereo sequence coding

    NASA Astrophysics Data System (ADS)

    Jiang, Qin; Hayes, Monson H., III

    1998-01-01

    A stereo sequence coding algorithm is presented and evaluated in this paper. The left image stream is coded independently by an MPEG-type coding scheme. In the right image stream, only reference frames are coded by the subspace projection technique. The remaining frames in the right image stream are neither coded nor transmitted at the encoder; they are reconstructed from reference frames at the decoder. A frame estimation and interpolation technique is developed to exploit the great redundancy within stereo sequences to reconstruct some frames of the right image stream at the decoder. In the reconstructed frames, uncovered occlusion regions are filled by a disparity-based technique. The intra coding and residual coding are based on subband coding techniques. The motion and disparity fields are estimated by block-based matching with a multiresolution structure, and coded by an entropy coding technique. Two stereo sequences are used to test our coding algorithm. Experimental results show that the frame estimation and interpolation technique works well perceptually, and our stereo sequence coding scheme is effective in achieving high compression ratios.
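
    As an illustration of the block-based matching that underlies the disparity estimation (a single-resolution sketch with a sum-of-absolute-differences criterion; the paper itself uses a multiresolution structure and entropy-codes the resulting fields):

        import numpy as np

        def block_match(left, right, block=8, max_disp=32):
            # Disparity map by exhaustive block matching with an SAD criterion.
            h, w = left.shape
            disp = np.zeros((h // block, w // block), dtype=int)
            for by in range(0, h - block + 1, block):
                for bx in range(0, w - block + 1, block):
                    ref = left[by:by + block, bx:bx + block].astype(float)
                    best, best_d = np.inf, 0
                    for d in range(min(max_disp, bx) + 1):
                        cand = right[by:by + block, bx - d:bx - d + block].astype(float)
                        sad = np.abs(ref - cand).sum()
                        if sad < best:
                            best, best_d = sad, d
                    disp[by // block, bx // block] = best_d
            return disp

        left = np.random.randint(0, 256, (64, 64))
        right = np.roll(left, -4, axis=1)      # synthetic 4-pixel horizontal shift
        print(block_match(left, right)[2, 4])  # -> 4 for this interior block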

  10. Independent Living.

    ERIC Educational Resources Information Center

    Nathanson, Jeanne H., Ed.

    1994-01-01

    This issue of "OSERS" addresses the subject of independent living of individuals with disabilities. The issue includes a message from Judith E. Heumann, the Assistant Secretary of the Office of Special Education and Rehabilitative Services (OSERS), and 10 papers. Papers have the following titles and authors: "Changes in the Rehabilitation Act of…

  11. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.
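
    The modularity argument is an interface-stability argument: a component module may be reimplemented freely as long as what it consumes and produces is unchanged. A toy illustration of the idea (all names and numbers are hypothetical, added here for clarity):

        def tf_coil_module(plasma):
            # Hypothetical component module: sizes the toroidal-field coils.
            # Its internals may change at will; only this interface is the contract.
            return {"coil_mass_t": 24.0 * plasma["major_radius_m"],
                    "peak_field_T": 2.0 * plasma["field_on_axis_T"]}

        def cost_module(components):
            # Consumes only the agreed outputs of other modules (toy cost rule).
            return 0.5 * components["coil_mass_t"]

        plasma = {"major_radius_m": 5.0, "field_on_axis_T": 5.5}
        cost = cost_module(tf_coil_module(plasma))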

  12. MORSE Monte Carlo code

    SciTech Connect

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  13. Numerical investigations on pressurized AL-composite vessel response to hypervelocity impacts: Comparison between experimental works and a numerical code

    NASA Astrophysics Data System (ADS)

    Mespoulet, Jérôme; Plassard, Fabien; Hereil, Pierre-Louis

    2015-09-01

    The response of pressurized composite-Al vessels to hypervelocity impacts of aluminum spheres has been numerically investigated to evaluate the influence of initial pressure on the vulnerability of these vessels. The investigated tanks are carbon-fiber-overwrapped prestressed Al vessels. The explored internal air pressures range from 1 bar to 300 bar, and impact velocities are around 4400 m/s. Data obtained from experiments (X-ray radiographs, particle velocity measurements, and post-mortem vessels) have been compared to numerical results from LS-DYNA ALE-Lagrange-SPH fully coupled models. The simulations underestimate the debris cloud evolution and shock wave propagation in pressurized air, but the main modes of damage/rupture of the vessels given by the simulations are consistent with the post-mortem vessels recovered from the experiments. The first results of this numerical work are promising, and further simulation investigations with additional experimental data will be performed to increase the reliability of the simulation model. The final aim of this combined experimental and numerical work is to numerically explore a wide range of impact conditions (impact angle, projectile weight, impact velocity, initial pressure) that cannot be explored experimentally. These results will define rules of thumb for the definition of an analytical vulnerability model for a given pressurized vessel.

  14. Clinical coding. Code breakers.

    PubMed

    Mathieson, Steve

    2005-02-24

    --The advent of payment by results has seen the role of the clinical coder pushed to the fore in England. --Examinations for a clinical coding qualification began in 1999. In 2004, approximately 200 people took the qualification. --Trusts are attracting people to the role by offering training from scratch or through modern apprenticeships. PMID:15768716

  15. Extension of the supercritical carbon dioxide Brayton cycle to low reactor power operation: investigations using the coupled ANL Plant Dynamics Code-SAS4A/SASSYS-1 liquid metal reactor code system.

    SciTech Connect

    Moisseytsev, A.; Sienicki, J. J.

    2012-05-10

    Significant progress has been made on the development of a control strategy for the supercritical carbon dioxide (S-CO{sub 2}) Brayton cycle enabling removal of power from an autonomous load following Sodium-Cooled Fast Reactor (SFR) down to decay heat levels such that the S-CO{sub 2} cycle can be used to cool the reactor until decay heat can be removed by the normal shutdown heat removal system or a passive decay heat removal system such as Direct Reactor Auxiliary Cooling System (DRACS) loops with DRACS in-vessel heat exchangers. This capability of the new control strategy eliminates the need for use of a separate shutdown heat removal system which might also use supercritical CO{sub 2}. It has been found that this capability can be achieved by introducing a new control mechanism involving shaft speed control for the common shaft joining the turbine and two compressors following reduction of the load demand from the electrical grid to zero. Following disconnection of the generator from the electrical grid, heat is removed from the intermediate sodium circuit through the sodium-to-CO{sub 2} heat exchanger, the turbine solely drives the two compressors, and heat is rejected from the cycle through the CO{sub 2}-to-water cooler. To investigate the effectiveness of shaft speed control, calculations are carried out using the coupled Plant Dynamics Code-SAS4A/SASSYS-1 code for a linear load reduction transient for a 1000 MWt metallic-fueled SFR with autonomous load following. No deliberate motion of control rods or adjustment of sodium pump speeds is assumed to take place. It is assumed that the S-CO{sub 2} turbomachinery shaft speed linearly decreases from 100 to 20% nominal following reduction of grid load to zero. The reactor power is calculated to autonomously decrease down to 3% nominal providing a lengthy window in time for the switchover to the normal shutdown heat removal system or for a passive decay heat removal system to become effective. However, the calculations reveal that the compressor conditions are calculated to approach surge such that the need for a surge control system for each compressor is identified. Thus, it is demonstrated that the S-CO{sub 2} cycle can operate in the initial decay heat removal mode even with autonomous reactor control. Because external power is not needed to drive the compressors, the results show that the S-CO{sub 2} cycle can be used for initial decay heat removal for a lengthy interval in time in the absence of any off-site electrical power. The turbine provides sufficient power to drive the compressors. Combined with autonomous reactor control, this represents a significant safety advantage of the S-CO{sub 2} cycle by maintaining removal of the reactor power until the core decay heat falls to levels well below those for which the passive decay heat removal system is designed. The new control strategy is an alternative to a split-shaft layout involving separate power and compressor turbines which had previously been identified as a promising approach enabling heat removal from a SFR at low power levels. The current results indicate that the split-shaft configuration does not provide any significant benefits for the S-CO{sub 2} cycle over the current single-shaft layout with shaft speed control. It has been demonstrated that when connected to the grid the single-shaft cycle can effectively follow the load over the entire range. No compressor speed variation is needed while power is delivered to the grid. 
When the system is disconnected from the grid, the shaft speed can be changed as effectively as it would be with the split-shaft arrangement. In the split-shaft configuration, zero generator power means disconnection of the power turbine, such that the resulting system will be almost identical to the single-shaft arrangement. Without this advantage of the split-shaft configuration, the economic benefits of the single-shaft arrangement, provided by just one turbine and lower losses at the design point, are more important to the overall cycle performance. Therefore, the single-shaft configuration shall be retained as the reference arrangement for S-CO{sub 2} cycle power converter preconceptual designs. Improvements to the ANL Plant Dynamics Code have been carried out. The major code improvement is the introduction of a restart capability, which simplifies investigation of control strategies for very long transients. Another code modification is transfer of the entire code to a new Intel Fortran compiler; the execution of the code using the new compiler was verified by demonstrating that the same results are obtained as when the previous Compaq Visual Fortran compiler was used.

  16. System and method for investigating sub-surface features of a rock formation with acoustic sources generating coded signals

    DOEpatents

    Vu, Cung Khac; Nihei, Kurt; Johnson, Paul A; Guyer, Robert; Ten Cate, James A; Le Bas, Pierre-Yves; Larmat, Carene S

    2014-12-30

    A system and a method for investigating rock formations include generating, by a first acoustic source, a first acoustic signal comprising a first plurality of pulses, each pulse including a first modulated signal at a central frequency; and generating, by a second acoustic source, a second acoustic signal comprising a second plurality of pulses. A receiver arranged within the borehole receives a detected signal including a signal generated by a non-linear mixing process from the first and second acoustic signals in a non-linear mixing zone within the intersection volume. The method also includes processing the received signal to extract the signal generated by the non-linear mixing process over noise or over signals generated by a linear interaction process, or both.

  17. An Investigation into Reliability of Knee Extension Muscle Strength Measurements, and into the Relationship between Muscle Strength and Means of Independent Mobility in the Ward: Examinations of Patients Who Underwent Femoral Neck Fracture Surgery.

    PubMed

    Katoh, Munenori; Kaneko, Yoshihiro

    2014-01-01

    [Purpose] The purpose of the present study was to investigate the reliability of isometric knee extension muscle strength measurement of patients who underwent femoral neck fracture surgery, as well as the relationship between independent mobility in the ward and knee muscle strength. [Subjects] The subjects were 75 patients who underwent femoral neck fracture surgery. [Methods] We used a hand-held dynamometer and a belt to measure isometric knee extension muscle strength three times, and used intraclass correlation coefficients (ICCs) to investigate the reliability of the measurements. We used a receiver operating characteristic curve to investigate the cutoff values for independent walking with walking sticks and non-independent mobility. [Results] ICCs (1, 1) were 0.9 or higher. The cutoff value for independent walking with walking sticks was 0.289 kgf/kg on the non-fractured side, 0.193 kgf/kg on the fractured side, and the average of both limbs was 0.238 kgf/kg. [Conclusion] We consider that the test-retest reliability of isometric knee extension muscle strength measurement of patients who have undergone femoral neck fracture surgery is high. We also consider that isometric knee extension muscle strength is useful for investigating means of independent mobility in the ward. PMID:24567667
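
    For readers unfamiliar with how such cutoff values are obtained, the sketch below runs a receiver operating characteristic analysis and picks the threshold maximizing the Youden index, one common optimality criterion (the paper does not state which criterion it used, and the data here are synthetic, for illustration only):

        import numpy as np
        from sklearn.metrics import roc_curve

        # Knee-extension strength (kgf/kg); 1 = independent walking with sticks.
        strength = np.array([0.15, 0.20, 0.22, 0.25, 0.28, 0.31, 0.33, 0.36])
        independent = np.array([0, 0, 0, 1, 0, 1, 1, 1])

        fpr, tpr, thresholds = roc_curve(independent, strength)
        youden = tpr - fpr                       # sensitivity + specificity - 1
        cutoff = thresholds[np.argmax(youden)]   # strength value used as the cutoff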

  18. Finite-state codes

    NASA Technical Reports Server (NTRS)

    Pollara, Fabrizio; Mceliece, Robert J.; Abdel-Ghaffar, Khaled

    1988-01-01

    A class of codes called finite-state (FS) codes is defined and investigated. The codes, which generalize both block and convolutional codes, are defined by their encoders, which are finite-state machines with parallel inputs and outputs. A family of upper bounds on the free distance of a given FS code is derived. A general construction for FS codes is given, and it is shown that in many cases the FS codes constructed in this way have a free distance that is the largest possible. Catastrophic error propagation (CEP) for FS codes is also discussed. It is found that to avoid CEP one must solve the graph-theoretic problem of finding a uniquely decodable edge labeling of the state diagram.
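
    To fix ideas, the toy encoder below emits an output block that depends on both the input symbol and the current state, which is how FS codes generalize block codes (a single state) and convolutional codes (shift-register states). The transition and output tables are arbitrary illustrative choices, not a code from the paper.

        # Rate-1/2 toy finite-state encoder over two states.
        NEXT_STATE = {("s0", 0): "s0", ("s0", 1): "s1",
                      ("s1", 0): "s0", ("s1", 1): "s1"}
        OUTPUT = {("s0", 0): (0, 0), ("s0", 1): (1, 1),
                  ("s1", 0): (0, 1), ("s1", 1): (1, 0)}

        def fs_encode(bits, state="s0"):
            out = []
            for b in bits:
                out.extend(OUTPUT[(state, b)])   # output depends on state and input
                state = NEXT_STATE[(state, b)]   # advance the finite-state machine
            return out

        print(fs_encode([1, 0, 1, 1]))  # -> [1, 1, 0, 1, 1, 1, 1, 0]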

  19. Investigating the role of rare coding variability in Mendelian dementia genes (APP, PSEN1, PSEN2, GRN, MAPT, and PRNP) in late-onset Alzheimer's disease.

    PubMed

    Sassi, Celeste; Guerreiro, Rita; Gibbs, Raphael; Ding, Jinhui; Lupton, Michelle K; Troakes, Claire; Al-Sarraj, Safa; Niblock, Michael; Gallo, Jean-Marc; Adnan, Jihad; Killick, Richard; Brown, Kristelle S; Medway, Christopher; Lord, Jenny; Turton, James; Bras, Jose; Morgan, Kevin; Powell, John F; Singleton, Andrew; Hardy, John

    2014-12-01

    The overlapping clinical and neuropathologic features between late-onset apparently sporadic Alzheimer's disease (LOAD), familial Alzheimer's disease (FAD), and other neurodegenerative dementias (frontotemporal dementia, corticobasal degeneration, progressive supranuclear palsy, and Creutzfeldt-Jakob disease) raise the question of whether shared genetic risk factors may explain the similar phenotype among these disparate disorders. To investigate this intriguing hypothesis, we analyzed rare coding variability in 6 Mendelian dementia genes (APP, PSEN1, PSEN2, GRN, MAPT, and PRNP), in 141 LOAD patients and 179 elderly controls, all neuropathologically proven, from the UK. In our cohort, 14 LOAD cases (10%) and 11 controls (6%) carry at least 1 rare variant in the genes studied. We report a novel variant in PSEN1 (p.I168T) and a rare variant in PSEN2 (p.A237V), absent in controls and both likely pathogenic. Our findings support previous studies, suggesting that (1) rare coding variability in PSEN1 and PSEN2 may influence the susceptibility for LOAD and (2) GRN, MAPT, and PRNP are not major contributors to LOAD. Thus, genetic screening is pivotal for the clinical differential diagnosis of these neurodegenerative dementias. PMID:25104557

  1. An Investigation of the Relationship of Intellective and Personality Variables to Success in an Independent Study Science Course Through the Use of a Modified Multiple Regression Model.

    ERIC Educational Resources Information Center

    Szabo, Michael; Feldhusen, John F.

    This is an empirical study of selected learner characteristics and their relation to academic success, as indicated by course grades, in a structured independent study learning program. This program, called the Audio-Tutorial System, was utilized in an undergraduate college course in the biological sciences. By use of multiple regression analysis,…

  2. Developing independence.

    PubMed

    Turnbull, A P; Turnbull, H R

    1985-03-01

    The transition from living a life as others want (dependence) to living it as the adolescent wants to live it (independence) is extraordinarily difficult for most teen-agers and their families. The difficulty is compounded in the case of adolescents with disabilities. They are often denied access to the same opportunities of life that are accessible to the nondisabled. They face special problems in augmenting their inherent capacities so that they can take fuller advantage of the accommodations that society makes in an effort to grant them access. In particular, they need training designed to increase their capacities to make, communicate, implement, and evaluate their own life-choices. The recommendations made in this paper are grounded in the long-standing tradition of parens patriae and enlightened paternalism; they seek to be deliberately and cautiously careful about the lives of adolescents with disabilities and their families. We based them on the recent tradition of anti-institutionalism and they are also consistent with some of the major policy directions of the past 15-20 years. These include: normalization, integration, and least-restrictive alternatives; the unity and integrity of the family; the importance of opportunities for self-advocacy; the role of consumer consent and choice in consumer-professional relationships; the need for individualized services; the importance of the developmental model as a basis for service delivery; the value of economic productivity of people with disabilities; and the rights of habilitation, amelioration, and prevention. PMID:3156827

  3. An extended version of the SERPENT-2 code to investigate fuel burn-up and core material evolution of the Molten Salt Fast Reactor

    NASA Astrophysics Data System (ADS)

    Aufiero, M.; Cammi, A.; Fiorina, C.; Leppänen, J.; Luzzi, L.; Ricotti, M. E.

    2013-10-01

    In this work, the Monte Carlo burn-up code SERPENT-2 has been extended and employed to study the material isotopic evolution of the Molten Salt Fast Reactor (MSFR). This promising GEN-IV nuclear reactor concept features peculiar characteristics such as the on-line fuel reprocessing, which prevents the use of commonly available burn-up codes. In addition, the presence of circulating nuclear fuel and radioactive streams from the core to the reprocessing plant requires a precise knowledge of the fuel isotopic composition during the plant operation. The developed extension of SERPENT-2 directly takes into account the effects of on-line fuel reprocessing on burn-up calculations and features a reactivity control algorithm. It is here assessed against a dedicated version of the deterministic ERANOS-based EQL3D procedure (PSI-Switzerland) and adopted to analyze the MSFR fuel salt isotopic evolution. Particular attention is devoted to studying the effects of reprocessing time constants and efficiencies on the conversion ratio and the molar concentration of elements relevant for solubility issues (e.g., trivalent actinides and lanthanides). Quantities of interest for fuel handling and safety issues are investigated, including decay heat and activities of hazardous isotopes (neutron and high energy gamma emitters) in the core and in the reprocessing stream. The radiotoxicity generation is also analyzed for the MSFR nominal conditions. The production of helium and the depletion in tungsten content due to nuclear reactions are calculated for the nickel-based alloy selected as reactor structural material of the MSFR. These preliminary evaluations can be helpful in studying the radiation damage of both the primary salt container and the axial reflectors.
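
    As a rough illustration of the bookkeeping involved (a minimal sketch, not SERPENT-2): on-line reprocessing can be modeled as an extra first-order removal term in a nuclide's depletion equation. All rate constants below are assumed placeholder values.

```python
# Minimal sketch: one-nuclide depletion with continuous on-line reprocessing,
#   dN/dt = P - (lambda_decay + sigma*phi + lambda_rep) * N
# lambda_rep is an assumed first-order removal rate representing the
# reprocessing stream. Explicit Euler integration; all numbers illustrative.

def deplete(n0, production, lam_decay, sigma, phi, lam_rep, t_end, dt):
    n = n0
    for _ in range(int(t_end / dt)):
        n += (production - (lam_decay + sigma * phi + lam_rep) * n) * dt
    return n

# The long-time solution tends to production / (total first-order removal rate).
n_eq = deplete(n0=0.0, production=1e15, lam_decay=1e-6, sigma=1e-24,
               phi=1e15, lam_rep=1e-7, t_end=5e7, dt=1e4)
```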

  4. True uniaxial compressive strengths of rock or coal specimens are independent of diameter-to-length ratios. Report of Investigations/1990

    SciTech Connect

    Babcock, C.O.

    1990-01-01

    Part of the compressive strength of a test specimen of rock or coal in the laboratory or a pillar in a mine comes from physical property strength and, in part, from the constraint provided by the loading stresses. Much confusion in pillar design comes from assigning the total strength change to geometry, as evidenced by the many pillar design equations with width to height as the primary variable. In tests by the U.S. Bureau of Mines, compressive strengths for cylindrical specimens of limestone, marble, sandstone, and coal were independent of the specimen test geometry when the end friction was removed. A conventional uniaxial compressive strength test between two steel platens is actually a uniaxial force and not a uniaxial stress test. The biaxial or triaxial state of stress for much of the test volume changes with the geometry of the test specimen. By removing the end friction supplied by the steel platens to the specimen, a more nearly uniaxial stress state independent of the specimen geometry is produced in the specimen. Pillar design is a constraint and physical property problem rather than a geometry problem. Roof and floor constraint are major factors in pillar design and strength.

  5. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach that simultaneously generates, from a high-level specification, both the code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
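
    The annotations at issue are assertions such as loop invariants attached to the generated code. As a toy illustration only (not AUTOBAYES output), the sketch below shows the kind of invariant such a system could emit; here it is merely spot-checked at run time, whereas a certifier would discharge it as a first-order proof obligation.

```python
# Hypothetical example of an annotated loop. The commented invariant is the
# sort of annotation a synthesis system could generate; the assert only
# samples it at run time instead of proving it.

def sum_of_squares(xs):
    total = 0
    for i, x in enumerate(xs):
        # invariant: total == sum of squares of xs[0..i-1]
        assert total == sum(v * v for v in xs[:i])
        total += x * x
    return total
```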

  6. Investigation of plant control strategies for the supercritical CO{sub 2} Brayton cycle for a sodium-cooled fast reactor using the plant dynamics code.

    SciTech Connect

    Moisseytsev, A.; Sienicki, J.

    2011-04-12

    The development of a control strategy for the supercritical CO{sub 2} (S-CO{sub 2}) Brayton cycle has been extended to the investigation of alternate control strategies for a Sodium-Cooled Fast Reactor (SFR) nuclear power plant incorporating a S-CO{sub 2} Brayton cycle power converter. The SFR assumed is the 400 MWe (1000 MWt) ABR-1000 preconceptual design incorporating metallic fuel. Three alternative idealized schemes for controlling the reactor side of the plant in combination with the existing automatic control strategy for the S-CO{sub 2} Brayton cycle are explored using the ANL Plant Dynamics Code together with the SAS4A/SASSYS-1 Liquid Metal Reactor (LMR) Analysis Code System coupled together using the iterative coupling formulation previously developed and implemented into the Plant Dynamics Code. The first option assumes that the reactor side can be ideally controlled through movement of control rods and changing the speeds of both the primary and intermediate coolant system sodium pumps such that the intermediate sodium flow rate and inlet temperature to the sodium-to-CO{sub 2} heat exchanger (RHX) remain unvarying while the intermediate sodium outlet temperature changes as the load demand from the electric grid changes and the S-CO{sub 2} cycle conditions adjust according to the S-CO{sub 2} cycle control strategy. For this option, the reactor plant follows an assumed change in load demand from 100 to 0 % nominal at 5 % reduction per minute in a suitable fashion. The second option allows the reactor core power and primary and intermediate coolant system sodium pump flow rates to change autonomously in response to the strong reactivity feedbacks of the metallic fueled core and assumed constant pump torques representing unchanging output from the pump electric motors. The plant behavior in response to the assumed load demand reduction is surprisingly close to that calculated for the first option. The only negative result observed is a slight increase in the intermediate inlet sodium temperatures by about 10 °C. This temperature rise could presumably be precluded or significantly reduced through fine adjustment of the control rods and pump motors. The third option assumes that the reactor core power and primary and intermediate system flow rates are ideally reduced linearly in a programmed fashion that instantaneously matches the prescribed load demand. The calculated behavior of this idealized case reveals a number of difficulties because the control strategy for the S-CO{sub 2} cycle overcools the reactor, potentially resulting in the calculation of sodium bulk freezing and the onset of sodium boiling. The results show that autonomous SFR operation may be viable for the particular assumed load change transient and deserves further investigation for other transients and postulated accidents.

  7. Investigation of the thermal response of a gasdynamic heater with helical impellers. Calspan Report No. 6961-A-1. [MAZE and TACO2D codes]

    SciTech Connect

    Rae, W. J.

    1981-12-01

    A gasdynamic heater, capable of producing contamination-free gas streams at temperatures up to 9000°K, is being developed by the Vulcan project. The design of a cooling system for the case parts and the associated thermal analysis are a critical part of a successful design. The purpose of the present work was to perform a preliminary cooling passage design and complete thermal analysis for the center body liner, end plate liners and exit nozzle. The approach envisioned for this work was the use of a set of LLNL finite-element codes, called MAZE and TACO2D. These were to be used at LLNL in a series of visits by the Calspan principal investigator. The project was cancelled shortly after the first of these visits; this report contains a summary of the work accomplished during the abbreviated contract period, and a review of the items that will need to be considered when the work is resumed at some future date.

  8. An investigative study of multispectral data compression for remotely-sensed images using vector quantization and difference-mapped shift-coding

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    A study is conducted to investigate the effects and advantages of data compression techniques on multispectral imagery data acquired by NASA's airborne scanners at the Stennis Space Center. The first technique used was vector quantization. The vector is defined in the multispectral imagery context as an array of pixels from the same location from each channel. The error obtained in substituting the reconstructed images for the original set is compared for different compression ratios. Also, the eigenvalues of the covariance matrix obtained from the reconstructed data set are compared with the eigenvalues of the original set. The effects of varying the size of the vector codebook on the quality of the compression and on subsequent classification are also presented. The output data from the Vector Quantization algorithm was further compressed by a lossless technique called Difference-mapped Shift-extended Huffman coding. The overall compression for 7 channels of data acquired by the Calibrated Airborne Multispectral Scanner (CAMS), with an RMS error of 15.8 pixels was 195:1 (0.041 bpp) and with an RMS error of 3.6 pixels was 18:1 (0.447 bpp). The algorithms were implemented in software and interfaced with the help of dedicated image processing boards to an 80386 PC compatible computer. Modules were developed for the task of image compression and image analysis. Also, supporting software to perform image processing for visual display and interpretation of the compressed/classified images was developed.
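
    A minimal sketch of the vector-quantization step described above (not the original implementation): each vector stacks the co-located pixels from all 7 channels, a codebook is trained k-means-style as a stand-in for LBG-type training, and every vector is mapped to its nearest codeword. Image size and codebook size are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(7, 64, 64)).astype(float)  # 7 channels (synthetic)
vectors = image.reshape(7, -1).T                              # one 7-vector per pixel

def train_codebook(vecs, k=32, iters=10):
    # k-means-style codebook training (a stand-in for LBG-type algorithms)
    codebook = vecs[rng.choice(len(vecs), k, replace=False)].copy()
    for _ in range(iters):
        labels = ((vecs[:, None, :] - codebook[None, :, :]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (labels == j).any():
                codebook[j] = vecs[labels == j].mean(0)
    return codebook

codebook = train_codebook(vectors)
indices = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1).argmin(1)
reconstructed = codebook[indices].T.reshape(image.shape)      # decoder side
rms = np.sqrt(((image - reconstructed) ** 2).mean())          # distortion measure
```

    The index stream (here `indices`) is what a lossless back end such as the Difference-mapped Shift-extended Huffman stage would then compress further.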

  9. An investigation of the potential for the use of a high resolution adaptive coded aperture system in the mid-wave infrared

    NASA Astrophysics Data System (ADS)

    Slinger, Chris; Eismann, Michael; Gordon, Neil; Lewis, Keith; McDonald, Gregor; McNie, Mark; Payne, Doug; Ridley, Kevin; Strens, Malcolm; De Villiers, Geoff; Wilson, Rebecca

    2007-09-01

    Previous applications of coded aperture imaging (CAI) have been mainly in the energetic parts of the electro-magnetic spectrum, such as gamma ray astronomy, where few viable imaging alternatives exist. In addition, resolution requirements have typically been low (~ mrad). This paper investigates the prospects for and advantages of using CAI at longer wavelengths (visible, infrared) and at higher resolutions, and also considers the benefits of adaptive CAI techniques. The latter enable CAI to achieve reconfigurable modes of imaging, as well as improving system performance in other ways, such as enhanced image quality. It is shown that adaptive CAI has several potential advantages over more traditional optical systems for some applications in these wavebands. The merits include low mass, volume and moments of inertia, potentially lower costs, graceful failure modes, steerable fields of regard with no macroscopic moving parts and inherently encrypted data streams. Among the challenges associated with this new imaging approach are the effects of diffraction, interference, photon absorption at the mask and the low scene contrasts in the infrared wavebands. The paper analyzes some of these and presents the results of some of the tradeoffs in optical performance, using radiometric calculations to illustrate the consequences in a mid-infrared application. A CAI system requires a decoding algorithm in order to form an image and the paper discusses novel approaches, tailored to longer wavelength operation. The paper concludes by presenting initial experimental results.
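
    As a toy, 1-D sketch of the coded-aperture principle (not the paper's decoding algorithms): the detector records the scene convolved with the mask's open/closed pattern, and a regularized inverse filter recovers an estimate of the scene. The mask statistics and the regularization constant are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 128
scene = np.zeros(n)
scene[[20, 50, 90]] = [1.0, 0.5, 0.8]             # a few point sources
mask = (rng.random(n) < 0.5).astype(float)        # random 50%-open aperture

# Forward model: circular convolution of the scene with the mask pattern.
detector = np.real(np.fft.ifft(np.fft.fft(scene) * np.fft.fft(mask)))

# Decode with a Wiener-style regularized inverse filter.
M = np.fft.fft(mask)
recon = np.real(np.fft.ifft(np.fft.fft(detector) * np.conj(M) /
                            (np.abs(M) ** 2 + 1e-3)))
```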

  10. Evaluating Reanalysis - Independent Observations and Observation Independence

    NASA Astrophysics Data System (ADS)

    Wahl, S.; Bollmeyer, C.; Danek, C.; Friederichs, P.; Keller, J. D.; Ohlwein, C.

    2014-12-01

    Reanalyses on global to regional scales are widely used for validation of meteorological or hydrological models and for many climate applications. However, the evaluation of the reanalyses themselves is still a crucial task. A major challenge is the lack of independent observations, since most of the available observational data is already included, e.g., by the data assimilation scheme. Here, we focus on the evaluation of dynamical reanalyses which are obtained by using numerical weather prediction models with a fixed data assimilation scheme. Precipitation is generally not assimilated in dynamical reanalyses (except, e.g., for latent heat nudging) and thereby provides valuable data for the evaluation of reanalysis. Since precipitation results from the complex dynamical and microphysical atmospheric processes, an accurate representation of precipitation is often used as an indicator of good model performance. Here, we use independent observations of daily precipitation accumulations from European rain gauges (E-OBS) of the years 2008 and 2009 for the intercomparison of various regional reanalysis products for the European CORDEX domain (Hirlam reanalysis at 0.2°, Metoffice UM reanalysis at 0.11°, COSMO reanalysis at 0.055°). This allows for assessing the benefits of increased horizontal resolution compared to global reanalyses. Furthermore, the effect of latent heat nudging (assimilation of radar-derived rain rates) is investigated using an experimental setup of the COSMO reanalysis with 6 km and 2 km resolution for summer 2011. Further, we present an observation-independent evaluation based on kinetic energy spectra. Such spectra should follow a k^(-3) dependence on the wavenumber k at larger scales and a k^(-5/3) dependence on the mesoscale. We compare the spectra of the aforementioned regional reanalyses in order to investigate the general capability of the reanalyses to resolve events on the mesoscale (e.g. effective resolution). The intercomparison and evaluation of regional reanalyses is carried out by the climate monitoring branch of the Hans-Ertel-Centre for Weather Research.
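
    A minimal sketch of the spectral diagnostic mentioned above: estimate the log-log slope of a 1-D kinetic-energy spectrum and compare it with the expected -3 and -5/3 power laws. The input here is synthetic white noise; real reanalysis wind transects would replace it.

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.standard_normal(4096)              # stand-in for a wind transect
spec = np.abs(np.fft.rfft(u)) ** 2         # 1-D kinetic-energy spectrum
k = np.arange(1, len(spec))                # drop the k = 0 mean component
s = spec[1:]
slope = np.polyfit(np.log(k[9:499]), np.log(s[9:499]), 1)[0]
# Reanalysis winds should give slopes near -3 at large scales and -5/3 on the
# mesoscale; this white-noise stand-in gives a slope near 0.
print(round(slope, 2))
```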

  11. Comet assay in reconstructed 3D human epidermal skin models—investigation of intra- and inter-laboratory reproducibility with coded chemicals

    PubMed Central

    Pfuhler, Stefan

    2013-01-01

    Reconstructed 3D human epidermal skin models are being used increasingly for safety testing of chemicals. Based on EpiDerm™ tissues, an assay was developed in which the tissues were topically exposed to test chemicals for 3 h followed by cell isolation and assessment of DNA damage using the comet assay. Inter-laboratory reproducibility of the 3D skin comet assay was initially demonstrated using two model genotoxic carcinogens, methyl methane sulfonate (MMS) and 4-nitroquinoline-N-oxide, and the results showed good concordance among three different laboratories and with in vivo data. In Phase 2 of the project, intra- and inter-laboratory reproducibility was investigated with five coded compounds with different genotoxicity liability tested at three different laboratories. For the genotoxic carcinogens MMS and N-ethyl-N-nitrosourea, all laboratories reported a dose-related and statistically significant increase (P < 0.05) in DNA damage in every experiment. For the genotoxic carcinogen, 2,4-diaminotoluene, the overall result from all laboratories showed a smaller, but significant genotoxic response (P < 0.05). For cyclohexanone (CHN) (non-genotoxic in vitro and in vivo, and non-carcinogenic), an increase compared to the solvent control acetone was observed only in one laboratory. However, the response was not dose related and CHN was judged negative overall, as was p-nitrophenol (p-NP) (genotoxic in vitro but not in vivo and non-carcinogenic), which was the only compound showing clear cytotoxic effects. For p-NP, significant DNA damage generally occurred only at doses that were substantially cytotoxic (>30% cell loss), and the overall response was comparable in all laboratories despite some differences in doses tested. The results of the collaborative study for the coded compounds were generally reproducible among the laboratories involved and intra-laboratory reproducibility was also good. These data indicate that the comet assay in EpiDerm™ skin models is a promising model for the safety assessment of compounds with a dermal route of exposure. PMID:24150594

  12. More box codes

    NASA Technical Reports Server (NTRS)

    Solomon, G.

    1992-01-01

    A new investigation shows that, starting from the BCH (21,15;3) code represented as a 7 x 3 matrix and adding a row and column to add even parity, one obtains an 8 x 4 matrix (32,15;8) code. An additional dimension is obtained by specifying odd parity on the rows and even parity on the columns, i.e., adjoining to the 8 x 4 matrix, the matrix, which is zero except for the fourth column (of all ones). Furthermore, any seven rows and three columns will form the BCH (21,15;3) code. This box code has the same weight structure as the quadratic residue and BCH codes of the same dimensions. Whether there exists an algebraic isomorphism to either code is as yet unknown.

  13. Material-dependent and material-independent selection processes in the frontal and parietal lobes: an event-related fMRI investigation of response competition

    NASA Technical Reports Server (NTRS)

    Hazeltine, Eliot; Bunge, Silvia A.; Scanlon, Michael D.; Gabrieli, John D E.

    2003-01-01

    The present study used the flanker task [Percept. Psychophys. 16 (1974) 143] to identify neural structures that support response selection processes, and to determine which of these structures respond differently depending on the type of stimulus material associated with the response. Participants performed two versions of the flanker task while undergoing event-related functional magnetic resonance imaging (fMRI). Both versions of the task required participants to respond to a central stimulus regardless of the responses associated with simultaneously presented flanking stimuli, but one used colored circle stimuli and the other used letter stimuli. Competition-related activation was identified by comparing Incongruent trials, in which the flanker stimuli indicated a different response than the central stimulus, to Neutral trials, in which the flanker stimuli indicated no response. A region within the right inferior frontal gyrus exhibited significantly more competition-related activation for the color stimuli, whereas regions within the middle frontal gyri of both hemispheres exhibited more competition-related activation for the letter stimuli. The border of the right middle frontal and inferior frontal gyri and the anterior cingulate cortex (ACC) were significantly activated by competition for both types of stimulus materials. Posterior foci demonstrated a similar pattern: left inferior parietal cortex showed greater competition-related activation for the letters, whereas right parietal cortex was significantly activated by competition for both materials. These findings indicate that the resolution of response competition invokes both material-dependent and material-independent processes.

  14. Is ADHD a Risk Factor Independent of Conduct Disorder for Illicit Substance Use? A Meta-Analysis and Meta-Regression Investigation

    ERIC Educational Resources Information Center

    Serra-Pinheiro, Maria Antonia; Coutinho, Evandro S. F.; Souza, Isabella S.; Pinna, Camilla; Fortes, Didia; Araujo, Catia; Szobot, Claudia M.; Rohde, Luis A.; Mattos, Paulo

    2013-01-01

    Objective: To investigate meta-analytically if the association between ADHD and illicit substance use (ISU) is maintained when controlling for conduct disorder/oppositional-defiant disorder (CD/ODD). Method: A systematic literature review was conducted through Medline from 1980 to 2008. Data extracted and selections made by one author were…

  15. Is ADHD a Risk Factor Independent of Conduct Disorder for Illicit Substance Use? A Meta-Analysis and Meta-Regression Investigation

    ERIC Educational Resources Information Center

    Serra-Pinheiro, Maria Antonia; Coutinho, Evandro S. F.; Souza, Isabella S.; Pinna, Camilla; Fortes, Didia; Araujo, Catia; Szobot, Claudia M.; Rohde, Luis A.; Mattos, Paulo

    2013-01-01

    Objective: To investigate meta-analytically if the association between ADHD and illicit substance use (ISU) is maintained when controlling for conduct disorder/oppositional-defiant disorder (CD/ODD). Method: A systematic literature review was conducted through Medline from 1980 to 2008. Data extracted and selections made by one author were…

  16. An Evaluation of Two Different Methods of Assessing Independent Investigations in an Operational Pre-University Level Examination in Biology in England.

    ERIC Educational Resources Information Center

    Brown, Chris

    1998-01-01

    Explored aspects of the assessment of an extended investigation ("project") as practiced in the operational examinations of The University of Cambridge Local Examinations Syndicate (UCLES) from the perspective of construct validity. Samples of the 1993 (n=333) and 1996 (n=259) biology test results reveal two methods of assessing the project. (MAK)

  17. A dynamic population model to investigate effects of climate and climate-independent factors on the lifecycle of the tick Amblyomma americanum

    USGS Publications Warehouse

    Ludwig, Antoinette; Ginsberg, Howard; Hickling, Graham J.; Ogden, Nicholas H.

    2015-01-01

    The lone star tick, Amblyomma americanum, is a disease vector of significance for human and animal health throughout much of the eastern United States. To model the potential effects of climate change on this tick, a better understanding is needed of the relative roles of temperature-dependent and temperature-independent (day-length-dependent behavioral or morphogenetic diapause) processes acting on the tick lifecycle. In this study, we explored the roles of these processes by simulating seasonal activity patterns using models with site-specific temperature and day-length-dependent processes. We first modeled the transitions from engorged larvae to feeding nymphs, engorged nymphs to feeding adults, and engorged adult females to feeding larvae. The simulated seasonal patterns were compared against field observations at three locations in the United States. Simulations suggested that 1) during the larva-to-nymph transition, some larvae undergo no diapause while others undergo morphogenetic diapause of engorged larvae; 2) molted adults undergo behavioral diapause during the transition from nymph-to-adult; and 3) there is no diapause during the adult-to-larva transition. A model constructed to simulate the full lifecycle of A. americanum successfully predicted observed tick activity at the three U.S. study locations. Some differences between observed and simulated seasonality patterns were observed, however, identifying the need for research to refine some model parameters. In simulations run using temperature data for Montreal, deterministic die-out of A. americanum populations did not occur, suggesting the possibility that current climate in parts of southern Canada is suitable for survival and reproduction of this tick.
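
    A highly simplified sketch of this kind of model (not the authors' implementation): development toward the next stage accumulates at a temperature-dependent rate but is gated off when day length falls below a diapause threshold. Functional forms, thresholds, and the synthetic forcing are all assumptions.

```python
import math

def daily_development(temp_c, t_base=10.0, scale=0.002):
    # degree-day style development fraction per day (assumed functional form)
    return max(0.0, temp_c - t_base) * scale

def simulate(temps, daylengths, diapause_threshold_h=12.0):
    progress = 0.0
    for day, (t, dl) in enumerate(zip(temps, daylengths)):
        if dl < diapause_threshold_h:
            continue                     # diapause: no development on short days
        progress += daily_development(t)
        if progress >= 1.0:
            return day                   # day the stage transition completes
    return None                          # cohort failed to complete this year

temps = [15 + 10 * math.sin(2 * math.pi * d / 365) for d in range(365)]
days = [12 + 4 * math.sin(2 * math.pi * d / 365) for d in range(365)]
print(simulate(temps, days))
```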

  18. A Dynamic Population Model to Investigate Effects of Climate and Climate-Independent Factors on the Lifecycle of Amblyomma americanum (Acari: Ixodidae).

    PubMed

    Ludwig, Antoinette; Ginsberg, Howard S; Hickling, Graham J; Ogden, Nicholas H

    2016-01-01

    The lone star tick, Amblyomma americanum, is a disease vector of significance for human and animal health throughout much of the eastern United States. To model the potential effects of climate change on this tick, a better understanding is needed of the relative roles of temperature-dependent and temperature-independent (day-length-dependent behavioral or morphogenetic diapause) processes acting on the tick lifecycle. In this study, we explored the roles of these processes by simulating seasonal activity patterns using models with site-specific temperature and day-length-dependent processes. We first modeled the transitions from engorged larvae to feeding nymphs, engorged nymphs to feeding adults, and engorged adult females to feeding larvae. The simulated seasonal patterns were compared against field observations at three locations in the United States. Simulations suggested that 1) during the larva-to-nymph transition, some larvae undergo no diapause while others undergo morphogenetic diapause of engorged larvae; 2) molted adults undergo behavioral diapause during the transition from nymph-to-adult; and 3) there is no diapause during the adult-to-larva transition. A model constructed to simulate the full lifecycle of A. americanum successfully predicted observed tick activity at the three U.S. study locations. Some differences between observed and simulated seasonality patterns were observed, however, identifying the need for research to refine some model parameters. In simulations run using temperature data for Montreal, deterministic die-out of A. americanum populations did not occur, suggesting the possibility that current climate in parts of southern Canada is suitable for survival and reproduction of this tick. PMID:26502753

  19. Phonological coding during reading.

    PubMed

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. PMID:25150679

  20. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound-based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  1. Statistical mediation analysis with a multicategorical independent variable.

    PubMed

    Hayes, Andrew F; Preacher, Kristopher J

    2014-11-01

    Virtually all discussions and applications of statistical mediation analysis have been based on the condition that the independent variable is dichotomous or continuous, even though investigators frequently are interested in testing mediation hypotheses involving a multicategorical independent variable (such as two or more experimental conditions relative to a control group). We provide a tutorial illustrating an approach to estimation of and inference about direct, indirect, and total effects in statistical mediation analysis with a multicategorical independent variable. The approach is mathematically equivalent to analysis of (co)variance and reproduces the observed and adjusted group means while also generating effects having simple interpretations. Supplementary material available online includes extensions to this approach and Mplus, SPSS, and SAS code that implements it. PMID:24188158
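
    A minimal sketch of the logic for a three-level independent variable, using plain OLS on simulated data rather than the Mplus, SPSS, or SAS code supplied by the authors: dummy-code the groups against the control, estimate the a-paths (dummies to mediator) and the b-path (mediator to outcome, adjusting for the dummies), and form relative indirect effects as their products.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
group = rng.integers(0, 3, n)                      # 0 = control, 1 and 2 = treatments
d1, d2 = (group == 1).astype(float), (group == 2).astype(float)
m = 0.5 * d1 + 1.0 * d2 + rng.standard_normal(n)   # mediator
y = 0.7 * m + 0.3 * d1 + rng.standard_normal(n)    # outcome

def ols(X, z):
    return np.linalg.lstsq(X, z, rcond=None)[0]

ones = np.ones(n)
a = ols(np.column_stack([ones, d1, d2]), m)        # a-paths: a[1], a[2]
b = ols(np.column_stack([ones, d1, d2, m]), y)     # b-path: b[3]
relative_indirect = a[1:3] * b[3]   # relative indirect effect of each treatment
```

    Inference would then typically proceed by bootstrapping the relative indirect effects, as is standard in mediation analysis.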

  2. Suboptimum decoding of block codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao

    1991-01-01

    This paper investigates a class of decomposable codes, their distance and structural properties. It is shown that this class includes several classes of well-known and efficient codes as subclasses. Several methods for constructing decomposable codes or decomposing codes are presented. A two-stage soft decision decoding scheme for decomposable codes, their translates or unions of translates is devised. This two-stage soft-decision decoding is suboptimum, and provides an excellent trade-off between the error performance and decoding complexity for codes of moderate and long block length.
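
    For contrast, the sketch below brute-forces optimal soft-decision decoding of a toy (6,3) linear code; the exponential codeword enumeration is exactly the cost that a suboptimum two-stage scheme for decomposable codes avoids. The generator matrix is an arbitrary example, not a code from the paper.

```python
import numpy as np
from itertools import product

G = np.array([[1, 0, 0, 1, 1, 0],      # generator matrix of a toy (6,3) code
              [0, 1, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1]])

codewords = np.array([(np.array(m) @ G) % 2 for m in product([0, 1], repeat=3)])

def soft_decode(r):
    # Optimal on the AWGN channel: pick the codeword whose BPSK image
    # (bit 0 -> +1, bit 1 -> -1) correlates best with the received vector.
    bpsk = 1.0 - 2.0 * codewords
    return codewords[(bpsk @ r).argmax()]

received = np.array([0.9, -1.1, 0.2, 0.8, -0.7, -1.0])   # noisy soft values
print(soft_decode(received))
```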

  3. Completeness of cancer registration in England and Wales: an assessment based on 2,145 patients with Hodgkin's disease independently registered by the British National Lymphoma Investigation.

    PubMed

    Swerdlow, A J; Douglas, A J; Vaughan Hudson, G; Vaughan Hudson, B

    1993-02-01

    Records of 2,145 cases of Hodgkin's disease in England and Wales treated by the British National Lymphoma Investigation during 1970-84 were sought in the national and regional cancer registers. One thousand eight hundred and eighty-six (88%) were recorded in the national register, either as Hodgkin's disease (86%) or as other or unspecified lymphoma (2%) and 2 (0.1%) were recorded as other cancers. A further 69 (3%) cases were registered by regional cancer registries but had not reached the national register. Adjusting for the distribution of the study cases by region of incidence, we estimate completeness of registration of cases of Hodgkin's disease in the national register at 89.7%, and in the regional registers overall at 92.9%. Completeness did not vary appreciably by age or sex or calendar period. There was, however, substantial variation in completeness between regional registries. Estimates were made for all regions except North Western; the lowest estimated completeness values, under 90%, were in Wessex and the Thames registry regions, and the greatest, 95% or more, were in Northern, Trent, East Anglia, Oxford, South Western, West Midlands and Mersey. Because these results are confined to one malignancy treated by a particular collaborative network of physicians (although a large and widespread one), and because the patients are restricted to those seen in hospitals, caution must be exercised in extrapolation of the findings to cancer registration generally, but other studies and sources of information lead to similar conclusions about completeness of cancer registration nationally and regionally. PMID:8431361

  4. How do we code the letters of a word when we have to write it? Investigating double letter representation in French.

    PubMed

    Kandel, Sonia; Peereman, Ronald; Ghimenton, Anna

    2014-05-01

    How do we code the letters of a word when we have to write it? We examined whether the orthographic representations that the writing system activates have a specific coding for letters when these are doubled in a word. French participants wrote words on a digitizer. The word pairs shared the initial letters and differed on the presence of a double letter (e.g., LISSER/LISTER). The results on latencies, letter and inter-letter interval durations revealed that L and I are slower to write when followed by a doublet (SS) than when not (ST). Doublet processing constitutes a supplementary cognitive load that delays word production. This suggests that word representations code letter identity and quantity separately. The data also revealed that the central processes that are involved in spelling representation cascade into the peripheral processes that regulate movement execution. PMID:24486807

  5. Functional Investigation of a Non-coding Variant Associated with Adolescent Idiopathic Scoliosis in Zebrafish: Elevated Expression of the Ladybird Homeobox Gene Causes Body Axis Deformation.

    PubMed

    Guo, Long; Yamashita, Hiroshi; Kou, Ikuyo; Takimoto, Aki; Meguro-Horike, Makiko; Horike, Shin-Ichi; Sakuma, Tetsushi; Miura, Shigenori; Adachi, Taiji; Yamamoto, Takashi; Ikegawa, Shiro; Hiraki, Yuji; Shukunami, Chisa

    2016-01-01

    Previously, we identified an adolescent idiopathic scoliosis susceptibility locus near human ladybird homeobox 1 (LBX1) and FLJ41350 by a genome-wide association study. Here, we characterized the associated non-coding variant and investigated the function of these genes. A chromosome conformation capture assay revealed that the genome region with the most significantly associated single nucleotide polymorphism (rs11190870) physically interacted with the promoter region of LBX1-FLJ41350. The promoter in the direction of LBX1, combined with a 590-bp region including rs11190870, had higher transcriptional activity with the risk allele than that with the non-risk allele in HEK 293T cells. The ubiquitous overexpression of human LBX1 or either of the zebrafish lbx genes (lbx1a, lbx1b, and lbx2), but not FLJ41350, in zebrafish embryos caused body curvature followed by death prior to vertebral column formation. Such body axis deformation was not observed in transcription activator-like effector nuclease (TALEN)-mediated knockout zebrafish of lbx1b or lbx2. Mosaic expression of lbx1b driven by the GATA2 minimal promoter and the lbx1b enhancer in zebrafish significantly alleviated the embryonic lethal phenotype to allow observation of the later onset of the spinal curvature with or without vertebral malformation. Deformation of the embryonic body axis by lbx1b overexpression was associated with defects in convergent extension, which is a component of the main axis-elongation machinery in gastrulating embryos. In embryos overexpressing lbx1b, wnt5b, a ligand of the non-canonical Wnt/planar cell polarity (PCP) pathway, was significantly downregulated. Injection of mRNA for wnt5b or RhoA, a key downstream effector of Wnt/PCP signaling, rescued the defective convergent extension phenotype and attenuated the lbx1b-induced curvature of the body axis. Thus, our study presents a novel pathological feature of LBX1 and its zebrafish homologs in body axis deformation at various stages of embryonic and subsequent growth in zebrafish. PMID:26820155

  6. Functional Investigation of a Non-coding Variant Associated with Adolescent Idiopathic Scoliosis in Zebrafish: Elevated Expression of the Ladybird Homeobox Gene Causes Body Axis Deformation

    PubMed Central

    Guo, Long; Yamashita, Hiroshi; Kou, Ikuyo; Takimoto, Aki; Meguro-Horike, Makiko; Horike, Shin-ichi; Sakuma, Tetsushi; Miura, Shigenori; Adachi, Taiji; Yamamoto, Takashi; Ikegawa, Shiro; Hiraki, Yuji; Shukunami, Chisa

    2016-01-01

    Previously, we identified an adolescent idiopathic scoliosis susceptibility locus near human ladybird homeobox 1 (LBX1) and FLJ41350 by a genome-wide association study. Here, we characterized the associated non-coding variant and investigated the function of these genes. A chromosome conformation capture assay revealed that the genome region with the most significantly associated single nucleotide polymorphism (rs11190870) physically interacted with the promoter region of LBX1-FLJ41350. The promoter in the direction of LBX1, combined with a 590-bp region including rs11190870, had higher transcriptional activity with the risk allele than that with the non-risk allele in HEK 293T cells. The ubiquitous overexpression of human LBX1 or either of the zebrafish lbx genes (lbx1a, lbx1b, and lbx2), but not FLJ41350, in zebrafish embryos caused body curvature followed by death prior to vertebral column formation. Such body axis deformation was not observed in transcription activator-like effector nuclease (TALEN)-mediated knockout zebrafish of lbx1b or lbx2. Mosaic expression of lbx1b driven by the GATA2 minimal promoter and the lbx1b enhancer in zebrafish significantly alleviated the embryonic lethal phenotype to allow observation of the later onset of the spinal curvature with or without vertebral malformation. Deformation of the embryonic body axis by lbx1b overexpression was associated with defects in convergent extension, which is a component of the main axis-elongation machinery in gastrulating embryos. In embryos overexpressing lbx1b, wnt5b, a ligand of the non-canonical Wnt/planar cell polarity (PCP) pathway, was significantly downregulated. Injection of mRNA for wnt5b or RhoA, a key downstream effector of Wnt/PCP signaling, rescued the defective convergent extension phenotype and attenuated the lbx1b-induced curvature of the body axis. Thus, our study presents a novel pathological feature of LBX1 and its zebrafish homologs in body axis deformation at various stages of embryonic and subsequent growth in zebrafish. PMID:26820155

  7. Image coding.

    PubMed

    Kunt, M

    1988-01-01

    The digital representation of an image requires a very large number of bits. The goal of image coding is to reduce this number, as much as possible, and reconstruct a faithful duplicate of the original picture. Early efforts in image coding, solely guided by information theory, led to a plethora of methods. The compression ratio reached a saturation level around 10:1 a couple of years ago. Recent progress in the study of the brain mechanism of vision and scene analysis has opened new vistas in picture coding. Directional sensitivity of the neurones in the visual pathway combined with the separate processing of contours and textures has led to a new class of coding methods capable of achieving compression ratios as high as 100:1. PMID:3072645

  8. Getting Students to be Successful, Independent Investigators

    ERIC Educational Resources Information Center

    Thomas, Jeffrey D.

    2010-01-01

    Middle school students often struggle when writing testable problems, planning valid and reliable procedures, and drawing meaningful evidence-based conclusions. To address this issue, the author created a student-centered lab handout to facilitate the inquiry process for students. This handout has reduced students' frustration and helped them…

  9. Independent Peer Reviews

    SciTech Connect

    2012-03-16

    Independent Assessments: DOE's Systems Integrator convenes independent technical reviews to gauge progress toward meeting specific technical targets and to provide technical information necessary for key decisions.

  10. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  11. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the objectives, meeting goals and overall NASA goals for the NASA Data Standards Working Group. The presentation includes information on the technical progress surrounding the objective, short LDPC codes, and the general results on the Pu-Pw tradeoff.

  12. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  13. Coordinated design of coding and modulation systems

    NASA Technical Reports Server (NTRS)

    Massey, J. L.; Ancheta, T.; Johannesson, R.; Lauer, G.; Lee, L.

    1976-01-01

    The joint optimization of the coding and modulation systems employed in telemetry systems was investigated. Emphasis was placed on formulating inner and outer coding standards used by the Goddard Spaceflight Center. Convolutional codes were found that are nearly optimum for use with Viterbi decoding in the inner coding of concatenated coding systems. A convolutional code, the unit-memory code, was discovered and is ideal for inner system usage because of its byte-oriented structure. Simulations of sequential decoding on the deep-space channel were carried out to compare directly various convolutional codes that are proposed for use in deep-space systems.

  14. Covariance Matrix Evaluations for Independent Mass Fission Yields

    SciTech Connect

    Terranova, N.; Serot, O.; Archier, P.; De Saint Jean, C.

    2015-01-15

    Recent needs for more accurate fission product yields include covariance information to allow improved uncertainty estimations of the parameters used by design codes. The aim of this work is to investigate the possibility to generate more reliable and complete uncertainty information on independent mass fission yields. Mass yields covariances are estimated through a convolution between the multi-Gaussian empirical model based on Brosa's fission modes, which describe the pre-neutron mass yields, and the average prompt neutron multiplicity curve. The covariance generation task has been approached using the Bayesian generalized least squared method through the CONRAD code. Preliminary results on mass yields variance-covariance matrix will be presented and discussed from physical grounds in the case of {sup 235}U(n{sub th}, f) and {sup 239}Pu(n{sub th}, f) reactions.
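
    A schematic sketch of the covariance-generation idea, with a plain Monte Carlo stand-in for the Bayesian generalized least-squares machinery in CONRAD and purely illustrative parameters: sample the parameters of a two-Gaussian (Brosa-mode-like) mass-yield model, evaluate the yield curve for each draw, and form the empirical covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(3)
masses = np.arange(70, 170)

def yield_curve(mu, sigma):
    # Two symmetric Gaussian fission modes around half the compound mass 236
    # (a toy stand-in for a multi-Gaussian Brosa-mode parameterization).
    g = lambda m0: np.exp(-0.5 * ((masses - m0) / sigma) ** 2)
    y = g(mu) + g(236.0 - mu)
    return y / y.sum()

samples = np.array([
    yield_curve(mu=rng.normal(96.0, 0.5), sigma=rng.normal(5.5, 0.2))
    for _ in range(2000)
])
cov = np.cov(samples.T)     # (100, 100) covariance matrix of the mass yields
```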

  15. Covariance Matrix Evaluations for Independent Mass Fission Yields

    NASA Astrophysics Data System (ADS)

    Terranova, N.; Serot, O.; Archier, P.; De Saint Jean, C.; Sumini, M.

    2015-01-01

    Recent needs for more accurate fission product yields include covariance information to allow improved uncertainty estimations of the parameters used by design codes. The aim of this work is to investigate the possibility to generate more reliable and complete uncertainty information on independent mass fission yields. Mass yields covariances are estimated through a convolution between the multi-Gaussian empirical model based on Brosa's fission modes, which describe the pre-neutron mass yields, and the average prompt neutron multiplicity curve. The covariance generation task has been approached using the Bayesian generalized least squared method through the CONRAD code. Preliminary results on mass yields variance-covariance matrix will be presented and discussed from physical grounds in the case of 235U(nth, f) and 239Pu(nth, f) reactions.

  16. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    Part 635, Law Enforcement Reporting, Offense Reporting, § 635.19 Offense codes: (a) The offense code describes, as nearly as possible, the complaint or offense by using an alphanumeric code. Appendix C of AR...

  17. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    Part 635, Law Enforcement Reporting, Offense Reporting, § 635.19 Offense codes: (a) The offense code describes, as nearly as possible, the complaint or offense by using an alphanumeric code. Appendix C of AR...

  18. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Part 635, Law Enforcement Reporting, Offense Reporting, § 635.19 Offense codes: (a) The offense code describes, as nearly as possible, the complaint or offense by using an alphanumeric code. Appendix C of AR...

  19. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    Part 635, Law Enforcement Reporting, Offense Reporting, § 635.19 Offense codes: (a) The offense code describes, as nearly as possible, the complaint or offense by using an alphanumeric code. Appendix C of AR...

  20. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Part 635, Law Enforcement Reporting, Offense Reporting, § 635.19 Offense codes: (a) The offense code describes, as nearly as possible, the complaint or offense by using an alphanumeric code. Appendix C of AR...

  1. Cyclic unequal error protection codes constructed from cyclic codes of composite length

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1987-01-01

    The distance structure of cyclic codes of composite length was investigated. A lower bound on the minimum distance for this class of codes is derived. In many cases, the lower bound gives the true minimum distance of a code. Then the distance structure of the direct sum of two cyclic codes of composite length was investigated. It was shown that, under certain conditions, the direct-sum code provides two levels of error correcting capability, and hence is a two-level unequal error protection (UEP) code. Finally, a class of two-level UEP cyclic direct-sum codes and a decoding algorithm for a subclass of these codes are presented.
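
    A toy sketch of the direct-sum construction over GF(2): enumerate the codewords of two small component codes, form all sums, and compare minimum weights; in a UEP direct-sum code the two components supply different levels of protection. The generator matrices are arbitrary examples, not codes from the paper.

```python
import numpy as np
from itertools import product

def codewords(G):
    k = G.shape[0]
    return {tuple((np.array(m) @ G) % 2) for m in product([0, 1], repeat=k)}

def min_weight(words):
    return min(sum(w) for w in words if any(w))   # minimum nonzero weight

G1 = np.array([[1, 1, 1, 1, 0, 0],
               [0, 0, 1, 1, 1, 1]])               # toy component code C1
G2 = np.array([[1, 0, 1, 0, 1, 0]])               # toy component code C2

direct_sum = {tuple((np.array(a) + np.array(b)) % 2)
              for a in codewords(G1) for b in codewords(G2)}
print(min_weight(codewords(G1)), min_weight(codewords(G2)),
      min_weight(direct_sum))
```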

  2. Experimental investigation of neutronic characteristics of the IR-8 reactor to confirm the results of calculations by MCU-PTR code

    NASA Astrophysics Data System (ADS)

    Surkov, A. V.; Kochkin, V. N.; Pesnya, Yu. E.; Nasonov, V. A.; Vihrov, V. I.; Erak, D. Yu.

    2015-12-01

    A comparison of measured and calculated neutronic characteristics (fast neutron flux and fission rate of 235U) in the core and reflector of the IR-8 reactor is presented. Irradiation devices equipped with neutron activation detectors were prepared. The determination of fast neutron flux was performed using the 54Fe(n,p) and 58Ni(n,p) reactions. The 235U fission rate was measured using uranium dioxide with 10% enrichment in 235U. The determination of specific activities of detectors was carried out by measuring the intensity of characteristic gamma peaks using the ORTEC gamma spectrometer. Neutron fields in the core and reflector of the IR-8 reactor were calculated using the MCU-PTR code.
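
    For orientation, the standard activation-foil relation that underlies such measurements infers the flux from a measured activity via phi = A / (N * sigma * (1 - exp(-lambda * t_irr))). A minimal sketch with assumed, illustrative numbers (the actual analysis uses spectrum-averaged cross sections for the chosen reactions):

```python
import math

def flux_from_activity(activity_bq, n_atoms, sigma_cm2, half_life_s, t_irr_s):
    # phi = A / (N * sigma * saturation factor); simple first-order model
    lam = math.log(2) / half_life_s
    saturation = 1.0 - math.exp(-lam * t_irr_s)
    return activity_bq / (n_atoms * sigma_cm2 * saturation)

# Illustrative 58Ni(n,p)58Co case: 1 g natural-Ni foil (68% 58Ni), an assumed
# 70 mb spectrum-averaged cross section, 58Co half-life about 70.86 d,
# a 10-day irradiation, and an assumed measured activity of 2e5 Bq.
n_ni58 = 1.0 / 58.0 * 6.022e23 * 0.68
phi = flux_from_activity(2.0e5, n_ni58, 70e-27, 70.86 * 86400, 10 * 86400)
```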

  3. Film Festivals: A First Step for Independents.

    ERIC Educational Resources Information Center

    Manning, Nick

    In order for filmmaking to be a true art form, the filmmaker needs to be free both to conceive and realize a personal vision and to remain independent of rating codes, length prescriptions, the market, sterile formats, and other imposed limitations. Moreover, if noncommercial films are to succeed in the next decade, a respectful audience must be…

  4. Development of safety incident coding systems through improving coding reliability.

    PubMed

    Olsen, Nikki S; Williamson, Ann M

    2015-11-01

    This paper reviews classification theory sources to develop five research questions concerning factors associated with incident coding system development and use and how these factors affect coding reliability. First, a method was developed to enable the comparison of reliability results obtained using different methods. Second, a statistical and qualitative review of reliability studies was conducted to investigate the influence of the identified factors on the reliability of incident coding systems. As a result, several factors were found to have a statistically significant effect on reliability. Four recommendations for system development and use are provided to assist researchers in improving the reliability of incident coding systems in high hazard industries. PMID:26154213
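
    One common reliability statistic for such comparisons is Cohen's kappa for two coders classifying the same incidents; a minimal sketch (one of the measures such a review must reconcile, not the paper's comparison method itself):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    # chance-corrected agreement between two coders
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    pa, pb = Counter(coder_a), Counter(coder_b)
    expected = sum(pa[c] * pb[c] for c in set(pa) | set(pb)) / n ** 2
    return (observed - expected) / (1 - expected)

a = ["slip", "trip", "slip", "fall", "slip", "trip"]
b = ["slip", "trip", "fall", "fall", "slip", "slip"]
print(round(cohens_kappa(a, b), 3))   # about 0.48 for this toy data
```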

  5. Nevada Nuclear Waste Storage Investigations Project: Unit evaluation at Yucca Mountain, Nevada Test Site: Near-field thermal and mechanical calculations using the SANDIA-ADINA code

    SciTech Connect

    Johnson, R.L.; Bauer, S.J.

    1987-05-01

    Presented in this report are the results of a comparative study of two candidate horizons, the welded, devitrified Topopah Spring Member of the Paintbrush Tuff, and the nonwelded, zeolitized Tuffaceous Beds of Calico Hills. The mechanical and thermomechanical response of these two horizons was assessed by conducting thermal and thermomechanical calculations using a two-dimensional room and pillar geometry of the vertical waste emplacement option with average and limit properties for each. A modified version of the computer code ADINA (SANDIA-ADINA) containing a material model for rock masses with ubiquitous jointing was used in the calculations. Results of the calculations are presented as the units' capacity for storage of nuclear waste and stability of the emplacement room and pillar due to excavation and long-term heating. A comparison is made with a similar underground opening geometry sited in Grouse Canyon Tuff, using properties obtained from G-Tunnel - a horizon of known excavation characteristics. Long-term stability of the excavated rooms was predicted for all units, as determined by evaluating regions of predicted joint slip as the result of excavation and subsequent thermal loading, evaluating regions of predicted rock matrix failure as the result of excavation and subsequent thermal loading, and evaluating safety factors against rock matrix failure. These results were derived through considering a wide range in material properties and in situ stresses. 21 refs., 21 figs., 5 tabs.

  6. Fuel management optimization using genetic algorithms and code independence

    SciTech Connect

    DeChaine, M.D.; Feltus, M.A.

    1994-12-31

    Fuel management optimization is a hard problem for traditional optimization techniques. Loading pattern optimization is a large combinatorial problem without analytical derivative information. Therefore, methods designed for continuous functions, such as linear programming, do not always work well. Genetic algorithms (GAs) address these problems and, therefore, appear ideal for fuel management optimization. They do not require derivative information and work well with combinatorial functions. GAs are a stochastic method based on concepts from biological genetics. They take a group of candidate solutions, called the population, and use selection, crossover, and mutation operators to create the next generation of better solutions. The selection operator is a "survival-of-the-fittest" operation and chooses the solutions for the next generation. The crossover operator is analogous to biological mating, where children inherit a mixture of traits from their parents, and the mutation operator makes small random changes to the solutions.
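
    A minimal sketch of those three operators on a stand-in fitness function; a real loading-pattern optimizer would encode a core map in each chromosome and obtain fitness from a neutronics calculation. Population size, rates, and the toy objective are assumptions.

```python
import random

def fitness(sol):
    return sum(sol)                          # toy objective: maximize the ones

def evolve(pop_size=20, length=16, generations=50, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]             # selection
        children = []
        while len(children) < pop_size - len(survivors):
            p1, p2 = random.sample(survivors, 2)
            cut = random.randrange(1, length)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]  # mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```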

  7. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and results compared. Current computational and experimental emphasis includes multiple connected cavity flows with goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean sheet approach to engine design is advocated with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  8. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  9. Role of long non-coding RNA HULC in cell proliferation, apoptosis and tumor metastasis of gastric cancer: a clinical and in vitro investigation.

    PubMed

    Zhao, Yan; Guo, Qinhao; Chen, Jiejing; Hu, Jun; Wang, Shuwei; Sun, Yueming

    2014-01-01

    Long non-coding RNAs (lncRNAs) are emerging as key molecules in human cancer. Highly upregulated in liver cancer (HULC), an lncRNA, has recently been revealed to be involved in hepatocellular carcinoma development and progression. It remains unclear, however, whether HULC plays an oncogenic role in human gastric cancer (GC). In the present study, we demonstrated that HULC was significantly overexpressed in GC cell lines and GC tissues compared with normal controls, and this overexpression was correlated with lymph node metastasis, distant metastasis and advanced tumor node metastasis stages. In addition, a receiver operating characteristic (ROC) curve was constructed to evaluate the diagnostic values, and the area under the ROC curve of HULC was up to 0.769. To uncover its functional importance, gain- and loss-of-function studies were performed to evaluate the effect of HULC on cell proliferation, apoptosis and invasion in vitro. Overexpression of HULC promoted proliferation and invasion and inhibited cell apoptosis in SGC7901 cells, while knockdown of HULC in SGC7901 cells showed the opposite effect. Mechanistically, we discovered that overexpression of HULC could induce patterns of autophagy in SGC7901 cells; more importantly, autophagy inhibition increased apoptosis in HULC-overexpressing cells. We also determined that silencing of HULC effectively reversed the epithelial-to-mesenchymal transition (EMT) phenotype. In summary, our results suggest that HULC may play an important role in the growth and tumorigenesis of human GC, which provides us with a new biomarker in GC and perhaps a potential target for GC prevention, diagnosis and therapeutic treatment. PMID:24247585

  10. Codes of split type

    NASA Astrophysics Data System (ADS)

    Kimizuka, Maro; Sasaki, Ryuji

    Generalizing a way to construct Golay codes, codes of split type are defined. A lot of interesting codes, for example, extremal codes of length n ≤ 40 such as Golay codes and binary doubly even self-dual codes [48, 24, 12], [72, 36, w] with w ≤ 12, are represented as codes of split type.

  11. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge-preserving image coding scheme that can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.
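
    Since the scheme is described as a modification of DPCM, a bare-bones lossless DPCM round trip may help fix ideas. This sketch assumes a simple previous-sample predictor and is not the Mars-observer algorithm itself.

        import numpy as np

        def dpcm_encode(samples):
            """Lossless 1-D DPCM: send the first sample, then residuals against
            a previous-sample predictor, e[n] = x[n] - x[n-1]."""
            x = np.asarray(samples, dtype=np.int64)
            e = np.empty_like(x)
            e[0] = x[0]
            e[1:] = x[1:] - x[:-1]
            return e

        def dpcm_decode(residuals):
            return np.cumsum(residuals)  # x[n] = x[n-1] + e[n]

        row = [100, 101, 103, 103, 90, 91]
        assert dpcm_decode(dpcm_encode(row)).tolist() == row  # exact round trip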

  12. Native Nations of Quebec: Independence within Independence?

    ERIC Educational Resources Information Center

    Williams, Paul

    1995-01-01

    Aboriginal nations oppose the separation of Quebec from Canada because they favor confederations, multiple international boundaries present jurisdictional nightmares, federal programs might disappear, and Quebec's history of aggression against Aboriginal peoples plus the ethnic nature of its nationalism suggest an independent Quebec is a potential…

  13. CONTAIN independent peer review

    SciTech Connect

    Boyack, B.E.; Corradini, M.L.; Denning, R.S.; Khatib-Rahbar, M.; Loyalka, S.K.; Smith, P.N.

    1995-01-01

    The CONTAIN code was developed by Sandia National Laboratories under the sponsorship of the US Nuclear Regulatory Commission (NRC) to provide integrated analyses of containment phenomena. It is used to predict nuclear reactor containment loads, radiological source terms, and associated physical phenomena for a range of accident conditions encompassing both design-basis and severe accidents. The code's targeted applications include support for containment-related experimental programs, light water and advanced light water reactor plant analysis, and analytical support for resolution of specific technical issues such as direct containment heating. The NRC decided that a broad technical review of the code should be performed by technical experts to determine its overall technical adequacy. For this purpose, a six-member CONTAIN Peer Review Committee was organized and a peer review was conducted. While the review was in progress, the NRC issued a draft "Revised Severe Accident Code Strategy" that incorporated revised design objectives and targeted applications for the CONTAIN code. The Committee continued its effort to develop findings relative to the original NRC statement of design objectives and targeted applications; however, the revised design objectives and targeted applications were considered by the Committee in assigning priorities to its recommendations. The Committee determined that some improvements are warranted and provided recommendations in five code-related areas: (1) documentation, (2) user guidance, (3) modeling capability, (4) code assessment, and (5) technical assessment.

  14. On multilevel block modulation codes

    NASA Technical Reports Server (NTRS)

    Kasami, Tadao; Takata, Toyoo; Fujiwara, Toru; Lin, Shu

    1991-01-01

    The multilevel (ML) technique for combining block coding and modulation is investigated. A general formulation is presented for ML modulation codes in terms of component codes with appropriate distance measures. A specific method for constructing ML block modulation codes (MLBMCs) with interdependency among component codes is proposed. Given an MLBMC C with no interdependency among the binary component codes, the proposed method gives an MLBMC C-prime that has the same rate as C, a minimum squared Euclidean distance not less than that of C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest-neighbor codewords than that of C. Finally, a technique is presented for analyzing the error performance of MLBMCs for an additive white Gaussian noise channel based on soft-decision maximum-likelihood decoding.

  15. The Integrated TIGER Series Codes

    SciTech Connect

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  16. The Integrated TIGER Series Codes

    Energy Science and Technology Software Center (ESTSC)

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  17. Codes with special correlation.

    NASA Technical Reports Server (NTRS)

    Baumert, L. D.

    1964-01-01

    Uniform binary codes with special correlation, including transorthogonality and simplex codes, Hadamard matrices, and difference sets.

  18. On the error probability of general tree and trellis codes with applications to sequential decoding

    NASA Technical Reports Server (NTRS)

    Johannesson, R.

    1973-01-01

    An upper bound on the average error probability for maximum-likelihood decoding of the ensemble of random binary tree codes is derived and shown to be independent of the length of the tree. An upper bound on the average error probability for maximum-likelihood decoding of the ensemble of random L-branch binary trellis codes of rate R = 1/n is derived which separates the effects of the tail length T and the memory length M of the code. It is shown that the bound is independent of the length L of the information sequence. This implication is investigated by computer simulations of sequential decoding utilizing the stack algorithm. These simulations confirm the implication and further suggest an empirical formula for the true undetected decoding error probability with sequential decoding.

  19. Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code: Preprint

    SciTech Connect

    Maniaci, D. C.; Li, Y.

    2012-04-01

    This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. It summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30 degrees of yaw.

  20. Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code

    SciTech Connect

    Maniaci, D. C.; Li, Y.

    2011-10-01

    This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn, respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. This paper summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30 degrees of yaw.

  1. American Independence. Fifth Grade.

    ERIC Educational Resources Information Center

    Crosby, Annette

    This fifth grade teaching unit covers early conflicts between the American colonies and Britain, battles of the American Revolutionary War, and the Declaration of Independence. Knowledge goals address the pre-revolutionary acts enforced by the British, the concepts of conflict and independence, and the major events and significant people from the…

  2. Heuristic dynamic complexity coding

    NASA Astrophysics Data System (ADS)

    Škorupa, Jozef; Slowack, Jürgen; Mys, Stefaan; Lambert, Peter; Van de Walle, Rik

    2008-04-01

    Distributed video coding is a new video coding paradigm that shifts the computationally intensive motion estimation from encoder to decoder. This results in a lightweight encoder and a complex decoder, as opposed to the predictive video coding scheme (e.g., MPEG-X and H.26X) with a complex encoder and a lightweight decoder. Neither scheme, however, has the ability to adapt to varying complexity constraints imposed by encoder and decoder, which is an essential ability for applications targeting a wide range of devices with different complexity constraints or applications with temporarily varying complexity constraints. Moreover, the effect of complexity adaptation on the overall compression performance is of great importance and has not yet been investigated. To address this need, we have developed a video coding system with the ability to adapt itself to complexity constraints by dynamically sharing the motion estimation computations between both components. On this system we have studied the effect of the complexity distribution on the compression performance. This paper describes how motion estimation can be shared using heuristic dynamic complexity and how the distribution of complexity affects the overall compression performance of the system. The results show that the complexity can indeed be shared between encoder and decoder in an efficient way at acceptable rate-distortion performance.

  3. Production code control system for hydrodynamics simulations

    SciTech Connect

    Slone, D.M.

    1997-08-18

    We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent, applications into the PCCS framework without the need to modify any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, SMP, server, or desktop workstation. We will also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to essentially plug it in and make it usable to the hydrocode. PCCS has made it easier to link together disparate codes, since using Perl has removed the need to learn the idiosyncrasies of system or RPC programming. The text handling in Perl makes it easy to teach PCCS about new codes, or changes to existing codes.
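
    The backplane idea, registering each tool with just enough metadata for the framework to drive it, can be sketched generically. The sketch below is in Python rather than Perl, and every name in it (Backplane, the hydro command) is hypothetical rather than PCCS's actual interface.

        import subprocess

        class Backplane:
            """Toy plug-in registry: a tool is 'taught' to the framework as a
            name plus a function that builds its command line."""

            def __init__(self):
                self.tools = {}

            def register(self, name, build_cmd):
                self.tools[name] = build_cmd

            def run(self, name, **options):
                cmd = self.tools[name](**options)
                return subprocess.run(cmd, capture_output=True, text=True)

        bp = Backplane()
        # Hypothetical tool registration; "hydro" is not a real executable here.
        bp.register("hydrocode", lambda deck: ["hydro", "--input", deck])
        # result = bp.run("hydrocode", deck="shot42.in")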

  4. Longwave infrared (LWIR) coded aperture dispersive spectrometer.

    PubMed

    Fernandez, C; Guenther, B D; Gehm, M E; Brady, D J; Sullivan, M E

    2007-04-30

    We describe a static aperture-coded, dispersive longwave infrared (LWIR) spectrometer that uses a microbolometer array at the detector plane. The two-dimensional aperture code is based on a row-doubled Hadamard mask with transmissive and opaque openings. The independent column code nature of the matrix makes for a mathematically well-defined pattern that spatially and spectrally maps the source information to the detector plane. Post-processing techniques on the data provide spectral estimates of the source. Comparative experimental results between a slit and a coded aperture for emission spectroscopy from a CO(2) laser are demonstrated. PMID:19532832
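
    As background, the kind of 0/1 mask pattern that row-doubled Hadamard designs are built from can be generated in a few lines. This is a generic Harwit-Sloane-style sketch assuming a Sylvester Hadamard matrix; it is not the authors' exact mask layout.

        import numpy as np

        def s_matrix(n):
            """S-matrix of order n = 2**k - 1: build a Sylvester Hadamard
            matrix, delete its first row and column, and map -1 -> 1 (open),
            +1 -> 0 (opaque)."""
            H = np.array([[1]])
            while H.shape[0] < n + 1:  # Sylvester doubling construction
                H = np.block([[H, H], [H, -H]])
            return (1 - H[1:, 1:]) // 2  # n x n matrix of 0s and 1s

        def row_doubled_mask(n):
            return np.repeat(s_matrix(n), 2, axis=0)  # duplicate every row

        mask = row_doubled_mask(7)  # 14 x 7 pattern of open/opaque cells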

  5. The investigation of tethered satellite system dynamics

    NASA Technical Reports Server (NTRS)

    Lorenzini, E. C.

    1986-01-01

    The analysis of the rotational dynamics of the satellite was focused on the rotational amplitude increase of the satellite, with respect to the tether, during retrieval. The dependence of the rotational amplitude upon the tether tension variation to the power 1/4 was thoroughly investigated. The damping of rotational oscillations achievable by reel control was also quantified, while an alternative solution that makes use of a lever arm attached with a universal joint to the satellite was proposed. Comparison simulations between the Smithsonian Astrophysical Observatory and the Martin Marietta (MMA) computer codes of retrieval maneuvers were also carried out. The agreement between the two completely independent codes was extremely close, demonstrating the reliability of the models. The slack tether dynamics during reel jams was analytically investigated in order to identify the limits of applicability of the SLACK3 computer code to this particular case. Test runs with SLACK3 were also carried out.

  6. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    PubMed

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. PMID:25894105

  7. The Independence of Reduced Subgroup-State

    NASA Astrophysics Data System (ADS)

    Luo, Ming-Xing; Deng, Yun

    2014-09-01

    The quantum hidden subgroup problem, one of the most important problems in quantum computation, has been widely investigated. Our purpose in this paper is to prove the independence or partial independence of the reduced state derived from the quantum query with the oracle implementation. We prove that, if there is no bias on the implementation functions, the subgroup state is independent of the evaluation functions, using the group representation. This result is also used to improve the quantum query success probability.

  8. Video multicast using network coding

    NASA Astrophysics Data System (ADS)

    Ramasubramonian, Adarsh K.; Woods, John W.

    2009-01-01

    We investigate the problem of video multicast in lossy networks using network coding and multiple description codes. The rate allocation for multiple descriptions can be optimized at the source to generate a scalable video bitstream such that the expected PSNR of the video at the receivers is maximized. We show that using network coding with multiple description codes can significantly improve the quality of video obtained at the receiver, in comparison to routing (with or without replication). Simulations show that as the loss rate increases, the improvement in the performance increases and in certain cases for loss rate of 0.20, the improvement can be as high as 3 to 3.5 dB when compared to routing with replication. Moreover, network coding obviates the need to construct multiple multicast trees for transmission, which is necessary in routing with replication.
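
    The core mechanism, coded packets formed as random linear combinations that a receiver inverts once it has enough of them, can be sketched over GF(2). The sketch ignores the multiple-description layer of the paper, and all names are illustrative.

        import numpy as np

        rng = np.random.default_rng(1)

        def encode(packets, n_coded):
            """Each coded packet is a random XOR (mod-2 sum) of the source
            packets; its coefficient vector travels with it."""
            k = len(packets)
            coeffs = rng.integers(0, 2, size=(n_coded, k), dtype=np.uint8)
            return coeffs, (coeffs @ packets) % 2

        def decode(coeffs, coded, k):
            """Gaussian elimination mod 2; succeeds once k linearly
            independent coded packets have arrived."""
            A = np.concatenate([coeffs, coded], axis=1).astype(np.uint8)
            row = 0
            for col in range(k):
                pivots = np.nonzero(A[row:, col])[0]
                if pivots.size == 0:
                    return None  # not yet full rank
                A[[row, row + pivots[0]]] = A[[row + pivots[0], row]]
                for r in range(A.shape[0]):
                    if r != row and A[r, col]:
                        A[r] ^= A[row]
                row += 1
            return A[:k, k:]

        packets = rng.integers(0, 2, size=(4, 16), dtype=np.uint8)  # 4 sources
        coeffs, coded = encode(packets, n_coded=6)  # extra packets absorb loss
        recovered = decode(coeffs, coded, k=4)
        if recovered is not None:  # decodes once rank 4 is reached
            assert (recovered == packets).all()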

  9. Data Machine Independence

    Energy Science and Technology Software Center (ESTSC)

    1994-12-30

    Data-machine independence achieved by using four technologies (ASN.1, XDR, SDS, and ZEBRA) was evaluated by encoding two different applications in each of the above and comparing the results against the standard programming method using C.

  10. Energy efficient rateless codes for high speed data transfer over free space optical channels

    NASA Astrophysics Data System (ADS)

    Prakash, Geetha; Kulkarni, Muralidhar; Acharya, U. S.

    2015-03-01

    Terrestrial Free Space Optical (FSO) links transmit information by using the atmosphere (free space) as a medium. In this paper, we have investigated the use of Luby Transform (LT) codes as a means to mitigate the effects of data corruption induced by an imperfect channel, which usually takes the form of lost or corrupted packets. LT codes, which are a class of Fountain codes, can be used independently of the channel rate, and as many code words as required can be generated to recover all the message bits irrespective of the channel performance. Achieving error-free high data rates with limited energy resources is possible with FSO systems if error correction codes with minimal overheads on the power can be used. We also employ a combination of Binary Phase Shift Keying (BPSK) with provision for modification of threshold and optimized LT codes with belief propagation for decoding. These techniques provide additional protection even under strong turbulence regimes. Automatic Repeat Request (ARQ) is another method of improving link reliability. Performance of ARQ is limited by the number of retransmissions and the corresponding time delay. We prove through theoretical computations and simulations that LT codes consume less energy per bit. We validate the feasibility of using energy-efficient LT codes over ARQ for FSO links to be used in optical wireless sensor networks within the eye safety limits.
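
    A toy LT encoder and peeling decoder makes the rateless property concrete: output symbols can be generated indefinitely, and decoding succeeds once enough of them arrive. The degree distribution below is a placeholder, not the optimized distribution used in the paper.

        import random

        def lt_encode(blocks, n_symbols, seed=0):
            """Each output symbol XORs a random subset of the k source blocks;
            symbols can be generated without limit (the 'rateless' property)."""
            k = len(blocks)
            rnd = random.Random(seed)
            symbols = []
            for _ in range(n_symbols):
                d = min(k, rnd.choice([1, 2, 2, 3, 3, 4, 8]))  # toy degrees
                neighbors = set(rnd.sample(range(k), d))
                value = 0
                for i in neighbors:
                    value ^= blocks[i]
                symbols.append((neighbors, value))
            return symbols

        def lt_decode(symbols, k):
            """Peeling (belief-propagation) decoder: repeatedly resolve
            degree-1 symbols and substitute recovered blocks into the rest."""
            symbols = [(set(n), v) for n, v in symbols]
            decoded = {}
            progress = True
            while progress and len(decoded) < k:
                progress = False
                for neighbors, value in symbols:
                    if len(neighbors) == 1:
                        i = next(iter(neighbors))
                        if i not in decoded:
                            decoded[i] = value
                            progress = True
                for idx, (neighbors, value) in enumerate(symbols):
                    for i in list(neighbors):
                        if i in decoded:
                            neighbors.discard(i)
                            value ^= decoded[i]
                    symbols[idx] = (neighbors, value)
            return [decoded[i] for i in range(k)] if len(decoded) == k else None

        blocks = [0x3A, 0x7F, 0x11, 0x55, 0xC0, 0x9E]  # k = 6 source blocks
        received = lt_encode(blocks, n_symbols=20)[::2]  # drop half: packet loss
        out = lt_decode(received, k=6)  # None if too few symbols survived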

  11. Homological stabilizer codes

    SciTech Connect

    Anderson, Jonas T.

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. Highlights: (1) we show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs; (2) we show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs; (3) we find and classify all 2D homological stabilizer codes; (4) we find optimal codes among the homological stabilizer codes.

  12. Exceptional error minimization in putative primordial genetic codes

    PubMed Central

    2009-01-01

    Background The standard genetic code is redundant and has a highly non-random structure. Codons for the same amino acids typically differ only by the nucleotide in the third position, whereas similar amino acids are encoded, mostly, by codon series that differ by a single base substitution in the third or the first position. As a result, the code is highly albeit not optimally robust to errors of translation, a property that has been interpreted either as a product of selection directed at the minimization of errors or as a non-adaptive by-product of evolution of the code driven by other forces. Results We investigated the error-minimization properties of putative primordial codes that consisted of 16 supercodons, with the third base being completely redundant, using a previously derived cost function and the error minimization percentage as the measure of a code's robustness to mistranslation. It is shown that, when the 16-supercodon table is populated with 10 putative primordial amino acids, inferred from the results of abiotic synthesis experiments and other evidence independent of the code's evolution, and with minimal assumptions used to assign the remaining supercodons, the resulting 2-letter codes are nearly optimal in terms of the error minimization level. Conclusion The results of the computational experiments with putative primordial genetic codes that contained only two meaningful letters in all codons and encoded 10 to 16 amino acids indicate that such codes are likely to have been nearly optimal with respect to the minimization of translation errors. This near-optimality could be the outcome of extensive early selection during the co-evolution of the code with the primordial, error-prone translation system, or a result of a unique, accidental event. Under this hypothesis, the subsequent expansion of the code resulted in a decrease of the error minimization level that became sustainable owing to the evolution of a high-fidelity translation system. Reviewers This article was reviewed by Paul Higgs (nominated by Arcady Mushegian), Rob Knight, and Sandor Pongor. For the complete reports, go to the Reviewers' Reports section. PMID:19925661

  13. Non-White, No More: Effect Coding as an Alternative to Dummy Coding with Implications for Higher Education Researchers

    ERIC Educational Resources Information Center

    Mayhew, Matthew J.; Simonoff, Jeffrey S.

    2015-01-01

    The purpose of this article is to describe effect coding as an alternative quantitative practice for analyzing and interpreting categorical, race-based independent variables in higher education research. Unlike indicator (dummy) codes that imply that one group will be a reference group, effect codes use average responses as a means for…
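
    The contrast between the two schemes is easiest to see in the design matrices themselves. A minimal sketch with hypothetical labels (the intercept column is omitted):

        import numpy as np

        def dummy_codes(labels, reference):
            """Indicator (dummy) coding: the reference group is all zeros, so
            each coefficient is a contrast against that one group."""
            levels = [x for x in sorted(set(labels)) if x != reference]
            return np.array([[1 if lab == lev else 0 for lev in levels]
                             for lab in labels])

        def effect_codes(labels, reference):
            """Effect coding: the reference group is all -1s, so each
            coefficient is a deviation from the unweighted grand mean."""
            levels = [x for x in sorted(set(labels)) if x != reference]
            return np.array([[-1 if lab == reference else (1 if lab == lev else 0)
                              for lev in levels] for lab in labels])

        labels = ["A", "B", "C", "B"]
        print(dummy_codes(labels, reference="A"))   # A->[0,0]  B->[1,0]  C->[0,1]
        print(effect_codes(labels, reference="A"))  # A->[-1,-1] B->[1,0] C->[0,1]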

  14. Independent Replication and Meta-Analysis for Endometriosis Risk Loci.

    PubMed

    Sapkota, Yadav; Fassbender, Amelie; Bowdler, Lisa; Fung, Jenny N; Peterse, Daniëlle; O, Dorien; Montgomery, Grant W; Nyholt, Dale R; D'Hooghe, Thomas M

    2015-10-01

    Endometriosis is a complex disease that affects 6-10% of women in their reproductive years and 20-50% of women with infertility. Genome-wide and candidate-gene association studies for endometriosis have identified 10 independent risk loci, and of these, nine (rs7521902, rs13394619, rs4141819, rs6542095, rs1519761, rs7739264, rs12700667, rs1537377, and rs10859871) are polymorphic in European populations. Here we investigate the replication of nine SNP loci in 998 laparoscopically and histologically confirmed endometriosis cases and 783 disease-free controls from Belgium. SNPs rs7521902, rs13394619, and rs6542095 show nominally significant (p < .05) associations with endometriosis, while the directions of effect for seven SNPs are consistent with the original reports. Association of rs6542095 at the IL1A locus with 'All' (p = .066) and 'Grade_B' (p = .01) endometriosis is noteworthy because this is the first successful replication in an independent population. Meta-analysis with the published results yields genome-wide significant evidence for rs7521902, rs13394619, rs6542095, rs12700667, rs7739264, and rs1537377. Notably, three coding variants in GREB1 (near rs13394619) and CDKN2B-AS1 (near rs1537377) also showed nominally significant associations with endometriosis. Overall, this study provides important replication in a uniquely characterized independent population, and indicates that the majority of the original genome-wide association findings are not due to chance alone. PMID:26337243

  15. Phylogeny of genetic codes and punctuation codes within genetic codes.

    PubMed

    Seligmann, Hervé

    2015-03-01

    Punctuation codons (starts, stops) delimit genes, reflect translation apparatus properties. Most codon reassignments involve punctuation. Here two complementary approaches classify natural genetic codes: (A) properties of amino acids assigned to codons (classical phylogeny), coding stops as X (A1, antitermination/suppressor tRNAs insert unknown residues), or as gaps (A2, no translation, classical stop); and (B) considering only punctuation status (start, stop and other codons coded as -1, 0 and 1 (B1); 0, -1 and 1 (B2, reflects ribosomal translational dynamics); and 1, -1, and 0 (B3, starts/stops as opposites)). All methods separate most mitochondrial codes from most nuclear codes; Gracilibacteria consistently cluster with metazoan mitochondria; mitochondria co-hosted with chloroplasts cluster with nuclear codes. Method A1 clusters the euplotid nuclear code with metazoan mitochondria; A2 separates euplotids from mitochondria. Firmicute bacteria Mycoplasma/Spiroplasma and Protozoan (and lower metazoan) mitochondria share codon-amino acid assignments. A1 clusters them with mitochondria, they cluster with the standard genetic code under A2: constraints on amino acid ambiguity versus punctuation-signaling produced the mitochondrial versus bacterial versions of this genetic code. Punctuation analysis B2 converges best with classical phylogenetic analyses, stressing the need for a unified theory of genetic code punctuation accounting for ribosomal constraints. PMID:25600501

  16. The Proteomic Code: a molecular recognition code for proteins

    PubMed Central

    Biro, Jan C

    2007-01-01

    Background The Proteomic Code is a set of rules by which information in genetic material is transferred into the physico-chemical properties of amino acids. It determines how individual amino acids interact with each other during folding and in specific protein-protein interactions. The Proteomic Code is part of the redundant Genetic Code. Review The 25-year-old history of this concept is reviewed from the first independent suggestions by Biro and Mekler, through the works of Blalock, Root-Bernstein, Siemion, Miller and others, followed by the discovery of a Common Periodic Table of Codons and Nucleic Acids in 2003 and culminating in the recent conceptualization of partial complementary coding of interacting amino acids as well as the theory of the nucleic acid-assisted protein folding. Methods and conclusions A novel cloning method for the design and production of specific, high-affinity-reacting proteins (SHARP) is presented. This method is based on the concept of proteomic codes and is suitable for large-scale, industrial production of specifically interacting peptides. PMID:17999762

  17. Dual-code quantum computation model

    NASA Astrophysics Data System (ADS)

    Choi, Byung-Soo

    2015-08-01

    In this work, we propose the dual-code quantum computation model—a fault-tolerant quantum computation scheme which alternates between two different quantum error-correction codes. Since the chosen two codes have different sets of transversal gates, we can implement a universal set of gates transversally, thereby reducing the overall cost. We use code teleportation to convert between quantum states in different codes. The overall cost is decreased if code teleportation requires fewer resources than the fault-tolerant implementation of the non-transversal gate in a specific code. To analyze the cost reduction, we investigate two cases with different base codes, namely the Steane and Bacon-Shor codes. For the Steane code, neither the proposed dual-code model nor another variation of it achieves any cost reduction since the conventional approach is simple. For the Bacon-Shor code, the three proposed variations of the dual-code model reduce the overall cost. However, as the encoding level increases, the cost reduction decreases and becomes negative. Therefore, the proposed dual-code model is advantageous only when the encoding level is low and the cost of the non-transversal gate is relatively high.

  18. Pulsed Inductive Thruster (PIT): Modeling and Validation Using the MACH2 Code

    NASA Technical Reports Server (NTRS)

    Schneider, Steven (Technical Monitor); Mikellides, Pavlos G.

    2003-01-01

    Numerical modeling of the Pulsed Inductive Thruster exercising the magnetohydrodynamics code MACH2 aims to provide bilateral validation of the thruster's measured performance and the code's capability of capturing the pertinent physical processes. Computed impulse values for helium and argon propellants demonstrate excellent correlation to the experimental data for a range of energy levels and propellant-mass values. The effects of the vacuum tank wall and mass-injection scheme were investigated to show trivial changes in the overall performance. An idealized model for these energy levels and propellants deduces that the energy expended to the internal energy modes and plasma dissipation processes is independent of the propellant type, mass, and energy level.

  19. Correlated algebraic-geometric codes

    NASA Astrophysics Data System (ADS)

    Guruswami, Venkatesan; Patthak, Anindya C.

    2008-03-01

    We define a new family of error-correcting codes based on algebraic curves over finite fields, and develop efficient list decoding algorithms for them. Our codes extend the class of algebraic-geometric (AG) codes via a (nonobvious) generalization of the approach in the recent breakthrough work of Parvaresh and Vardy (2005). Our work shows that the PV framework applies to fairly general settings by elucidating the key algebraic concepts underlying it. Also, more importantly, AG codes of arbitrary block length exist over fixed alphabets Σ, thus enabling us to establish new trade-offs between the list decoding radius and rate over a bounded alphabet size. The work of Parvaresh and Vardy (2005) was extended in Guruswami and Rudra (2006) to give explicit codes that achieve the list decoding capacity (optimal trade-off between rate and fraction of errors corrected) over large alphabets. A similar extension of this work along the lines of Guruswami and Rudra could have substantial impact. Indeed, it could give better trade-offs than currently known over a fixed alphabet (say, GF(2^12)), which in turn, upon concatenation with a fixed, well-understood binary code, could take us closer to the list decoding capacity for binary codes. This may also be a promising way to address the significant complexity drawback of the result of Guruswami and Rudra, and to enable approaching capacity with bounded list size independent of the block length (the list size and decoding complexity in their work are both n^{Ω(1/ε)}, where ε is the distance to capacity). Similar to algorithms for AG codes from Guruswami and Sudan (1999) and (2001), our encoding/decoding algorithms run in polynomial time assuming a natural polynomial-size representation of the code. For codes based on a specific "optimal" algebraic curve, we also present an expected polynomial time algorithm to construct the requisite representation. This in turn fills an important void in the literature by presenting an efficient construction of the representation often assumed in the list decoding algorithms for AG codes.

  20. Independent technical review, handbook

    SciTech Connect

    Not Available

    1994-02-01

    Purpose: Provide an independent engineering review of the major projects being funded by the Department of Energy, Office of Environmental Restoration and Waste Management. The independent engineering review will address questions of whether the engineering practice is sufficiently developed to a point where a major project can be executed without significant technical problems. The independent review will focus on questions related to: (1) Adequacy of development of the technical base of understanding; (2) Status of development and availability of technology among the various alternatives; (3) Status and availability of the industrial infrastructure to support project design, equipment fabrication, facility construction, and process and program/project operation; (4) Adequacy of the design effort to provide a sound foundation to support execution of project; (5) Ability of the organization to fully integrate the system, and direct, manage, and control the execution of a complex major project.

  1. V(D)J recombination coding junction formation without DNA homology: processing of coding termini.

    PubMed Central

    Boubnov, N V; Wills, Z P; Weaver, D T

    1993-01-01

    Coding junction formation in V(D)J recombination generates diversity in the antigen recognition structures of immunoglobulin and T-cell receptor molecules by combining processes of deletion of terminal coding sequences and addition of nucleotides prior to joining. We have examined the role of coding end DNA composition in junction formation with plasmid substrates containing defined homopolymers flanking the recombination signal sequence elements. We found that coding junctions formed efficiently with or without terminal DNA homology. The extent of junctional deletion was conserved independent of coding ends with increased, partial, or no DNA homology. Interestingly, G/C homopolymer coding ends showed reduced deletion regardless of DNA homology. Therefore, DNA homology cannot be the primary determinant that stabilizes coding end structures for processing and joining. PMID:8413286

  2. Distributed single source coding with side information

    NASA Astrophysics Data System (ADS)

    Vila-Forcen, Jose E.; Koval, Oleksiy; Voloshynovskiy, Sviatoslav V.

    2004-01-01

    In this paper we advocate an image compression technique within the scope of the distributed source coding framework. The novelty of the proposed approach is twofold: classical image compression is considered from the position of source coding with side information, and, contrary to existing scenarios where side information is given explicitly, side information is created based on a deterministic approximation of local image features. We consider an image in the transform domain as a realization of a source with a bounded codebook of symbols where each symbol represents a particular edge shape. The codebook is image independent and plays the role of an auxiliary source. Due to the partial availability of side information at both encoder and decoder, we treat our problem as a modification of the Berger-Flynn-Gray problem and investigate a possible gain over the solutions where side information is either unavailable or available only at the decoder. Finally, we present a practical compression algorithm for passport photo images based on our concept that demonstrates superior performance in the very low bit rate regime.

  3. Caring about Independent Lives

    ERIC Educational Resources Information Center

    Christensen, Karen

    2010-01-01

    With the rhetoric of independence, new cash for care systems were introduced in many developed welfare states at the end of the 20th century. These systems allow local authorities to pay people who are eligible for community care services directly, to enable them to employ their own careworkers. Despite the obvious importance of the careworker's…

  4. Independent Video in Britain.

    ERIC Educational Resources Information Center

    Stewart, David

    Maintaining the status quo, as well as the attitude toward cultural funding and development that it imposes on video, is detrimental to the formation of a thriving video network, and also out of key with the present social and political situation in Britain. Independent video has some quite specific advantages as a medium for cultural production…

  5. Independent Living Course

    ERIC Educational Resources Information Center

    Tipping, Joyce

    1978-01-01

    Designed to help handicapped persons who have been living a sheltered existence develop independent living skills, this course is divided into two parts. The first part consists of a five-day apartment live-in experience, and the second concentrates on developing the learners' awareness of community resources and consumer skills. (BM)

  6. Independence, Disengagement, and Discipline

    ERIC Educational Resources Information Center

    Rubin, Ron

    2012-01-01

    School disengagement is linked to a lack of opportunities for students to fulfill their needs for independence and self-determination. Young people have little say about what, when, where, and how they will learn, the criteria used to assess their success, and the content of school and classroom rules. Traditional behavior management discourages…

  7. An introduction to QR Codes: linking libraries and mobile patrons.

    PubMed

    Hoy, Matthew B

    2011-01-01

    QR codes, or "Quick Response" codes, are two-dimensional barcodes that can be scanned by mobile smartphone cameras. These codes can be used to provide fast access to URLs, telephone numbers, and short passages of text. With the rapid adoption of smartphones, librarians are able to use QR codes to promote services and help library users find materials quickly and independently. This article will explain what QR codes are, discuss how they can be used in the library, and describe issues surrounding their use. A list of resources for generating and scanning QR codes is also provided. PMID:21800986
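
    Generating a scannable code is a one-liner with a typical library. This sketch assumes the third-party Python package qrcode (pip install qrcode[pil]) and a placeholder URL; neither comes from the article.

        import qrcode  # third-party package; needs Pillow for image output

        img = qrcode.make("https://library.example.org/mobile")  # placeholder URL
        img.save("catalog_qr.png")  # scan with any smartphone camera app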

  8. Circular codes revisited: a statistical approach.

    PubMed

    Gonzalez, D L; Giannerini, S; Rosa, R

    2011-04-21

    In 1996 Arquès and Michel [1996. A complementary circular code in the protein coding genes. J. Theor. Biol. 182, 45-58] discovered the existence of a common circular code in eukaryote and prokaryote genomes. Since then, circular code theory has provoked great interest and undergone rapid development. In this paper we discuss some theoretical issues related to the synchronization properties of coding sequences and circular codes, with particular emphasis on the problem of retrieval and maintenance of the reading frame. Motivated by the theoretical discussion, we adopt a rigorous statistical approach in order to try to answer different questions. First, we investigate the covering capability of the whole class of 216 self-complementary, C(3) maximal codes with respect to a large set of coding sequences. The results indicate that, on average, the code proposed by Arquès and Michel has the best covering capability but, still, there exists a great variability among sequences. Second, we focus on such code and explore the role played by the proportion of the bases by means of a hierarchy of permutation tests. The results show the existence of a sort of optimization mechanism such that coding sequences are tailored to maximize or minimize the coverage of circular codes on specific reading frames. Such optimization clearly relates the function of circular codes to reading frame synchronization. PMID:21277862
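
    The covering capability measured in the paper amounts to counting, frame by frame, how many trinucleotides of a sequence fall inside a given code. A minimal sketch, using a placeholder trinucleotide set rather than the actual Arquès-Michel code:

        def frame_coverage(seq, code):
            """Fraction of trinucleotides of seq, read in each of the three
            frames, that belong to code. A large frame-0 excess over frames 1
            and 2 is the signature used for reading-frame retrieval."""
            fractions = []
            for frame in range(3):
                codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
                fractions.append(sum(c in code for c in codons) / max(len(codons), 1))
            return fractions

        X = {"AAC", "GAT", "CTG", "GCC", "TTC"}  # placeholder, not the real code
        print(frame_coverage("AACGATCTGGCCTTCAACGAT", X))  # frame 0 scores 1.0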

  9. Molecular cloning of canine co-chaperone small glutamine-rich tetratricopeptide repeat-containing protein α (SGTA) and investigation of its ability to suppress androgen receptor signalling in androgen-independent prostate cancer.

    PubMed

    Kato, Yuiko; Ochiai, Kazuhiko; Michishita, Masaki; Azakami, Daigo; Nakahira, Rei; Morimatsu, Masami; Ishiguro-Oonuma, Toshina; Yoshikawa, Yasunaga; Kobayashi, Masato; Bonkobara, Makoto; Kobayashi, Masanori; Takahashi, Kimimasa; Watanabe, Masami; Omi, Toshinori

    2015-11-01

    Although the morbidity of canine prostate cancer is low, the majority of cases present with resistance to androgen therapy and poor clinical outcomes. These pathological conditions are similar to the signs of the terminal stage of human androgen-independent prostate cancer. The co-chaperone small glutamine-rich tetratricopeptide repeat-containing protein α (SGTA) is known to be overexpressed in human androgen-independent prostate cancer. However, there is little information about the structure and function of canine SGTA. In this study, canine SGTA was cloned and analysed for its ability to suppress androgen receptor signalling. The full-length open reading frame (ORF) of the canine SGTA gene was amplified by RT-PCR using primers designed from canine-expressed sequence tags that were homologous to human SGTA. The canine SGTA ORF has high homology with the corresponding human (89%) and mouse (81%) sequences. SGTA dimerisation region and tetratricopeptide repeat (TPR) domains are conserved across the three species. The ability of canine SGTA to undergo homodimerisation was demonstrated by a mammalian two-hybrid system and a pull-down assay. The negative impact of canine SGTA on androgen receptor (AR) signalling was demonstrated using a reporter assay in androgen-independent human prostate cancer cell lines. Pathological analysis showed overexpression of SGTA in canine prostate cancer, but not in hyperplasia. A reporter assay in prostate cells demonstrated suppression of AR signalling by canine SGTA. Altogether, these results suggest that canine SGTA may play an important role in the acquisition of androgen independence by canine prostate cancer cells. PMID:26346258

  10. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate codes' (ARA). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, thus belief propagation can be used for iterative decoding of ARA codes on a graph. The structure of the encoder for this class can be viewed as a precoded Repeat Accumulate (RA) code or as a precoded Irregular Repeat Accumulate (IRA) code, where simply an accumulator is chosen as a precoder. Thus ARA codes have a simple and very fast encoder structure when representing LDPC codes. Based on density evolution for LDPC codes, through some examples for ARA codes, we show that for maximum variable node degree 5 a minimum bit SNR as low as 0.08 dB from channel capacity for rate 1/2 can be achieved as the block size goes to infinity. Thus, based on a fixed low maximum variable node degree, its threshold outperforms not only the RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators, any desired high-rate codes close to code rate 1 can be obtained with thresholds that stay close to the channel capacity thresholds uniformly. Iterative decoding simulation results are provided. The ARA codes also have a projected graph or protograph representation that allows for high-speed decoder implementation.
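
    The encoder structure described above (an accumulator as precoder, followed by repetition, an interleaver, and a second accumulator) is simple enough to sketch directly. This toy rate-1/q version omits the puncturing used to reach higher rates.

        import numpy as np

        def accumulate(bits):
            """Accumulator: running XOR, i.e., a prefix sum mod 2."""
            return np.cumsum(bits) % 2

        def ara_encode(info_bits, q=3, seed=0):
            """Toy rate-1/q ARA encoder: precoding accumulator -> q-fold
            repetition -> pseudorandom interleaver -> inner accumulator."""
            u = np.asarray(info_bits) % 2
            rep = np.repeat(accumulate(u), q)  # precode, then repeat
            perm = np.random.default_rng(seed).permutation(rep.size)
            return accumulate(rep[perm])  # interleave, then accumulate

        codeword = ara_encode([1, 0, 1, 1, 0, 0, 1, 0], q=3)  # 24 coded bits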

  11. Coset Codes Viewed as Terminated Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1996-01-01

    In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C1, of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C1, while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

  12. Efficient codes and balanced networks.

    PubMed

    Denève, Sophie; Machens, Christian K

    2016-02-23

    Recent years have seen a growing interest in inhibitory interneurons and their circuits. A striking property of cortical inhibition is how tightly it balances excitation. Inhibitory currents not only match excitatory currents on average, but track them on a millisecond time scale, whether they are caused by external stimuli or spontaneous fluctuations. We review, together with experimental evidence, recent theoretical approaches that investigate the advantages of such tight balance for coding and computation. These studies suggest a possible revision of the dominant view that neurons represent information with firing rates corrupted by Poisson noise. Instead, tight excitatory/inhibitory balance may be a signature of a highly cooperative code, orders of magnitude more precise than a Poisson rate code. Moreover, tight balance may provide a template that allows cortical neurons to construct high-dimensional population codes and learn complex functions of their inputs. PMID:26906504

  13. Neuronal Adaptation Translates Stimulus Gaps into a Population Code

    PubMed Central

    Yuan, Chun-Wei; Khouri, Leila; Grothe, Benedikt; Leibold, Christian

    2014-01-01

    Neurons in sensory pathways exhibit a vast multitude of adaptation behaviors, which are assumed to aid the encoding of temporal stimulus features and provide the basis for a population code in higher brain areas. Here we study the transition to a population code for auditory gap stimuli both in neurophysiological recordings and in a computational network model. Independent component analysis (ICA) of experimental data from the inferior colliculus of Mongolian gerbils reveals that the network encodes different gap sizes primarily with its population firing rate within 30 ms after the presentation of the gap, where longer gap size evokes higher network activity. We then developed a computational model to investigate possible mechanisms of how to generate the population code for gaps. Phenomenological (ICA) and functional (discrimination performance) analyses of our simulated networks show that the experimentally observed patterns may result from heterogeneous adaptation, where adaptation provides gap detection at the single neuron level and neuronal heterogeneity ensures discriminable population codes for the whole range of gap sizes in the input. Furthermore, our work suggests that network recurrence additionally enhances the network's ability to provide discriminable population patterns. PMID:24759970

  14. Medical imaging with coded apertures

    SciTech Connect

    Keto, E.; Libby, S.

    1995-06-16

    New algorithms were investigated for image reconstruction in emission tomography which could incorporate complex instrumental effects such as might be obtained with a coded aperture system. The investigation focused on possible uses of the wavelet transform to handle non-stationary instrumental effects and analytic continuation of the Radon transform to handle self-absorption. Neither investigation was completed during the funding period, and whether such algorithms will be useful remains an open question.

  15. Cyclic unequal error protection codes constructed from cyclic codes of composite length

    NASA Technical Reports Server (NTRS)

    Lin, Mao-Chao; Lin, Shu

    1988-01-01

    The unequal error correction capabilities of binary cyclic codes of composite length are investigated. Under certain conditions, direct sums of concatenated codes have unequal error correction capabilities. By a modified Hartmann and Tzeng algorithm, it is shown that a binary cyclic code of composite length is equivalent to the direct sum of concatenated codes. With this, some binary cyclic unequal error protection (UEP) codes are constructed. Finally, two-level UEP cyclic direct-sum codes are presented which provide error correction capabilities higher than those guaranteed by the Blokh-Zyablov constructions.

  16. Code-Switching: L1-Coded Mediation in a Kindergarten Foreign Language Classroom

    ERIC Educational Resources Information Center

    Lin, Zheng

    2012-01-01

    This paper is based on a qualitative inquiry that investigated the role of teachers' mediation in three different modes of coding in a kindergarten foreign language classroom in China (i.e. L2-coded intralinguistic mediation, L1-coded cross-lingual mediation, and L2-and-L1-mixed mediation). Through an exploratory examination of the varying effects…

  17. Transionospheric Propagation Code (TIPC)

    SciTech Connect

    Roussel-Dupre, R.; Kelley, T.A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Lab to perform certain tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in Fortran 77, runs interactively, and was designed to be as machine independent as possible. A menu format in which the user is prompted to supply appropriate parameters for a given task has been adopted for the input, while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, DTOA study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise or spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of delta-times-of-arrival (DTOAs) vs TECs for a specified pair of receivers.
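
    The first task, convolving an input pulse against an ionospheric impulse response and optionally adding Gaussian noise, can be sketched in a few lines. The pulse and impulse response below are arbitrary placeholders, not TIPC's TEC-parameterized model.

        import numpy as np

        fs = 100e6  # sample rate, Hz
        t = np.arange(0, 5e-6, 1 / fs)
        pulse = np.exp(-((t - 1e-6) ** 2) / (2 * (0.1e-6) ** 2))  # Gaussian pulse
        h = np.exp(-t / 0.5e-6)  # placeholder ionospheric impulse response
        h /= h.sum()  # normalize to unit area
        signal = np.convolve(pulse, h)[: t.size]  # "transionospheric" signal
        noisy = signal + 0.05 * np.random.default_rng(0).standard_normal(t.size)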

  19. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes, and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  20. Bar Codes for Libraries.

    ERIC Educational Resources Information Center

    Rahn, Erwin

    1984-01-01

    Discusses the evolution of standards for bar codes (series of printed lines and spaces that represent numbers, symbols, and/or letters of alphabet) and describes the two types most frequently adopted by libraries--Code-A-Bar and CODE 39. Format of the codes is illustrated. Six references and definitions of terminology are appended. (EJS)

  1. Groundwater flow code verification "benchmarking" activity (COVE-2A): Analysis of participants' work

    SciTech Connect

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project.

  2. International exploration by independents

    SciTech Connect

    Bertragne, R.G.

    1992-04-01

    Recent industry trends indicate that the smaller U.S. independents are looking at foreign exploration opportunities as one of the alternatives for growth in the new age of exploration. Foreign finding costs per barrel usually are accepted to be substantially lower than domestic costs because of the large reserve potential of international plays. To get involved in overseas exploration, however, requires the explorationist to adapt to different cultural, financial, legal, operational, and political conditions. Generally, foreign exploration proceeds at a slower pace than domestic exploration because concessions are granted by a country's government, or are explored in partnership with a national oil company. First, the explorationist must prepare a mid- to long-term strategy, tailored to the goals and the financial capabilities of the company; next, is an ongoing evaluation of quality prospects in various sedimentary basins, and careful planning and conduct of the operations. To successfully explore overseas also requires the presence of a minimum number of explorationists and engineers thoroughly familiar with the various exploratory and operational aspects of foreign work. Ideally, these team members will have had a considerable amount of on-site experience in various countries and climates. Independents best suited for foreign expansion are those who have been financially successful in domestic exploration. When properly approached, foreign exploration is well within the reach of smaller U.S. independents, and presents essentially no greater risk than domestic exploration; however, the reward can be much larger and can catapult the company into the 'big leagues.'

  3. International exploration by independents

    SciTech Connect

    Bertagne, R.G. )

    1991-03-01

    Recent industry trends indicate that the smaller US independents are looking at foreign exploration opportunities as one of the alternatives for growth in the new age of exploration. It is usually accepted that foreign finding costs per barrel are substantially lower than domestic because of the large reserve potential of international plays. To get involved overseas requires, however, an adaptation to different cultural, financial, legal, operational, and political conditions. Generally foreign exploration proceeds at a slower pace than domestic because concessions are granted by the government, or are explored in partnership with the national oil company. First, a mid- to long-term strategy, tailored to the goals and the financial capabilities of the company, must be prepared; it must be followed by an ongoing evaluation of quality prospects in various sedimentary basins, and a careful planning and conduct of the operations. To successfully explore overseas also requires the presence on the team of a minimum number of explorationists and engineers thoroughly familiar with the various exploratory and operational aspects of foreign work, having had a considerable amount of onsite experience in various geographical and climatic environments. Independents that are best suited for foreign expansion are those that have been financially successful domestically, and have a good discovery track record. When properly approached foreign exploration is well within the reach of smaller US independents and presents essentially no greater risk than domestic exploration; the reward, however, can be much larger and can catapult the company into the big leagues.

  4. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA codes). Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.
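
    A minimal sketch of the accumulate-repeat-accumulate chain named above is shown below, assuming the simplest arrangement in which every information bit passes through the precoding accumulator. Real ARA protographs precode only a fraction of the bits, are systematic, and use carefully designed interleavers, so the function names, repetition factor, and seeded permutation here are purely illustrative.

      # Sketch of the ARA encoding chain: precode (accumulate), repeat,
      # interleave, accumulate.  Rate here is 1/q; this is not a protograph
      # construction, just the basic signal flow.
      import numpy as np

      def accumulate(bits):
          # Running XOR: the 1/(1+D) accumulator over GF(2).
          return np.bitwise_xor.accumulate(bits)

      def ara_encode(info, q=3):
          precoded = accumulate(info)                # precoder (first "A")
          repeated = np.repeat(precoded, q)          # repeat each bit q times
          perm = np.random.default_rng(42).permutation(repeated.size)
          return accumulate(repeated[perm])          # interleave, then accumulate

      info = np.random.default_rng(0).integers(0, 2, size=16)
      codeword = ara_encode(info)                    # 48 coded bits at rate 1/3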

  5. QR Codes 101

    ERIC Educational Resources Information Center

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…

  6. Certifying Auto-Generated Flight Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. The annotation inference algorithm itself is generic, and parametrized with respect to a library of coding patterns that depend on the safety policies and the code generator. The patterns characterize the notions of definitions and uses that are specific to the given safety property. For example, for initialization safety, definitions correspond to variable initializations while uses are statements which read a variable, whereas for array bounds safety, definitions are the array declarations, while uses are statements which access an array variable. The inferred annotations are thus highly dependent on the actual program and the properties being proven. The annotations, themselves, need not be trusted, but are crucial to obtain the automatic formal verification of the safety properties without requiring access to the internals of the code generator. The approach has been applied to both in-house and commercial code generators, but is independent of the particular generator used. It is currently being adapted to flight code generated using MathWorks Real-Time Workshop, an automatic code generator that translates from Simulink/Stateflow models into embedded C code.
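
    The definitions-versus-uses view of a safety property can be illustrated with a toy checker. The sketch below is not AutoCert and handles only straight-line Python assignments, but it shows the sense in which initialization safety reduces to "every use is preceded by a definition."

      # Toy illustration of the "definitions vs. uses" view of initialization
      # safety: flag any variable read before it has been assigned.
      import ast

      def check_init_safety(src):
          defined, violations = set(), []
          for stmt in ast.parse(src).body:            # straight-line code only
              if isinstance(stmt, ast.Assign):
                  for node in ast.walk(stmt.value):   # uses first
                      if isinstance(node, ast.Name) and node.id not in defined:
                          violations.append((node.id, node.lineno))
                  for tgt in stmt.targets:            # then definitions
                      for node in ast.walk(tgt):
                          if isinstance(node, ast.Name):
                              defined.add(node.id)
          return violations

      print(check_init_safety("x = 1\ny = x + z\nz = 2"))   # [('z', 2)]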

  7. Subsystem codes with spatially local generators

    NASA Astrophysics Data System (ADS)

    Bravyi, Sergey

    2011-01-01

    We study subsystem codes whose gauge group has local generators in two-dimensional (2D) geometry. It is shown that there exists a family of such codes defined on lattices of size L×L with the number of logical qubits k and the minimum distance d both proportional to L. The gauge group of these codes involves only two-qubit generators of type XX and ZZ coupling nearest-neighbor qubits (and some auxiliary one-qubit generators). Our proof is not constructive as it relies on a certain version of the Gilbert-Varshamov bound for classical codes. Along the way, we introduce and study properties of generalized Bacon-Shor codes that might be of independent interest. Secondly, we prove that any 2D subsystem [n,k,d] code with spatially local generators obeys upper bounds kd=O(n) and d2=O(n). The analogous upper bound proved recently for 2D stabilizer codes is kd2=O(n). Our results thus demonstrate that subsystem codes can be more powerful than stabilizer codes under the spatial locality constraint.
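
    For concreteness, the two-qubit gauge generators of the generalized Bacon-Shor construction mentioned above can be enumerated as in the sketch below. One common convention puts the XX couplings on horizontal neighbours and the ZZ couplings on vertical neighbours; conventions vary, and the representation of operators as tuples is purely illustrative.

      # Two-qubit gauge generators of a Bacon-Shor-type subsystem code on an
      # L x L lattice: XX on horizontal neighbours, ZZ on vertical neighbours.
      def bacon_shor_gauge_generators(L):
          gens = []
          for r in range(L):
              for c in range(L):
                  if c + 1 < L:                       # horizontal neighbour
                      gens.append(("XX", (r, c), (r, c + 1)))
                  if r + 1 < L:                       # vertical neighbour
                      gens.append(("ZZ", (r, c), (r + 1, c)))
          return gens

      gens = bacon_shor_gauge_generators(3)
      print(len(gens))   # 2 * L * (L - 1) = 12 two-qubit generators for L = 3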

  8. Subsystem codes with spatially local generators

    SciTech Connect

    Bravyi, Sergey

    2011-01-15

    We study subsystem codes whose gauge group has local generators in two-dimensional (2D) geometry. It is shown that there exists a family of such codes defined on lattices of size L×L with the number of logical qubits k and the minimum distance d both proportional to L. The gauge group of these codes involves only two-qubit generators of type XX and ZZ coupling nearest-neighbor qubits (and some auxiliary one-qubit generators). Our proof is not constructive as it relies on a certain version of the Gilbert-Varshamov bound for classical codes. Along the way, we introduce and study properties of generalized Bacon-Shor codes that might be of independent interest. Secondly, we prove that any 2D subsystem [n,k,d] code with spatially local generators obeys upper bounds kd=O(n) and d²=O(n). The analogous upper bound proved recently for 2D stabilizer codes is kd²=O(n). Our results thus demonstrate that subsystem codes can be more powerful than stabilizer codes under the spatial locality constraint.

  9. Coded continuous wave meteor radar

    NASA Astrophysics Data System (ADS)

    Vierinen, Juha; Chau, Jorge L.; Pfeffer, Nico; Clahsen, Matthias; Stober, Gunter

    2016-03-01

    The concept of a coded continuous wave specular meteor radar (SMR) is described. The radar uses a continuously transmitted pseudorandom phase-modulated waveform, which has several advantages compared to conventional pulsed SMRs. The coding avoids range and Doppler aliasing, which are in some cases problematic with pulsed radars. Continuous transmissions maximize pulse compression gain, allowing operation at lower peak power than a pulsed system. With continuous coding, the temporal and spectral resolution are not dependent on the transmit waveform and they can be fairly flexibly changed after performing a measurement. The low signal-to-noise ratio before pulse compression, combined with independent pseudorandom transmit waveforms, allows multiple geographically separated transmitters to be used in the same frequency band simultaneously without significantly interfering with each other. Because the same frequency band can be used by multiple transmitters, the same interferometric receiver antennas can be used to receive multiple transmitters at the same time. The principles of the signal processing are discussed, in addition to discussion of several practical ways to increase computation speed, and how to optimally detect meteor echoes. Measurements from a campaign performed with a coded continuous wave SMR are shown and compared with two standard pulsed SMR measurements. The type of meteor radar described in this paper would be suited for use in a large-scale multi-static network of meteor radar transmitters and receivers. Such a system would be useful for increasing the number of meteor detections to obtain improved meteor radar data products.
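
    The core signal-processing idea, transmitting a long pseudorandom phase code continuously and recovering range by correlating against shifted copies of the known code, can be sketched in a few lines. Code length, echo delay, echo amplitude, and noise level below are illustrative.

      # Sketch of coded-CW pulse compression: a pseudorandom BPSK waveform,
      # a weak delayed echo buried in noise, and recovery of the delay by
      # cross-correlation against cyclic shifts of the transmit code.
      import numpy as np

      rng = np.random.default_rng(1)
      N = 4096
      code = rng.choice([-1.0, 1.0], size=N)        # pseudorandom phase code

      delay = 1234                                  # echo range gate (samples)
      echo = 0.1 * np.roll(code, delay)             # weak specular meteor echo
      rx = echo + rng.normal(scale=1.0, size=N)     # SNR << 1 before compression

      compressed = np.array([np.dot(rx, np.roll(code, k)) for k in range(N)])
      print(int(np.argmax(compressed)))             # recovers the delay, 1234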

  10. Reusable State Machine Code Generator

    NASA Astrophysics Data System (ADS)

    Hoffstadt, A. A.; Reyes, C.; Sommer, H.; Andolfato, L.

    2010-12-01

    The State Machine model is frequently used to represent the behaviour of a system, allowing one to express and execute this behaviour in a deterministic way. A graphical representation such as a UML State Chart diagram tames the complexity of the system, thus facilitating changes to the model and communication between developers and domain experts. We present a reusable state machine code generator, developed by the Universidad Técnica Federico Santa María and the European Southern Observatory. The generator itself is based on the open source project architecture, and uses UML State Chart models as input. This allows for a modular design and a clean separation between generator and generated code. The generated state machine code has well-defined interfaces that are independent of the implementation artefacts such as the middle-ware. This allows using the generator in the substantially different observatory software of the Atacama Large Millimeter Array and the ESO Very Large Telescope. A project-specific mapping layer for event and transition notification connects the state machine code to its environment, which can be the Common Software of these projects, or any other project. This approach even allows to automatically create tests for a generated state machine, using techniques from software testing, such as path-coverage.
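
    A hand-written sketch of the kind of table-driven state machine such a generator emits is shown below; the states, events, and class shape are hypothetical, and in the generated code the transition table would come from the UML State Chart model while a project-specific mapping layer would dispatch the events.

      # Sketch of a table-driven state machine of the sort a generator emits:
      # the (state, event) -> next-state table stands in for the UML model.
      class StateMachine:
          def __init__(self, transitions, initial):
              self._table = transitions
              self.state = initial

          def dispatch(self, event):
              key = (self.state, event)
              if key not in self._table:
                  raise ValueError(f"event {event!r} not allowed in {self.state!r}")
              self.state = self._table[key]

      telescope = StateMachine(
          {("OFF", "init"): "STANDBY",
           ("STANDBY", "start"): "ONLINE",
           ("ONLINE", "stop"): "STANDBY"},
          initial="OFF",
      )
      telescope.dispatch("init")
      print(telescope.state)                         # STANDBY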

  11. Photoacoustic generation using coded excitation

    NASA Astrophysics Data System (ADS)

    Su, Shin-Yuan; Li, Pai-Chi

    2011-03-01

    Photoacoustic (PA) imaging has been used to image soft tissue due to its high contrast and high spatial resolution. The generation of the PA signal is based on the object's absorption characteristics at the emitted electromagnetic energy. Typically, a Q-switched Nd:YAG laser providing mJ pulse energy is suitable for biomedical PA applications. However, such a laser is relatively bulky and expensive. An alternative is to use a diode laser. A diode laser can generate laser pulses at a much higher pulse repetition frequency (PRF). However, the output power of the diode laser is too low for effective PA generation. One method to overcome this problem is to increase the transmission energy using coded excitation. The coded laser signals can be transmitted by a diode laser with high PRF, and the signal intensity of the received signal can be enhanced using pulse compression. In this study, we proposed a chirp coded excitation algorithm for a diode laser. Compared to the Golay coded excitation seen in the literature, the proposed chirp coded excitation requires only a single transmission. The chirp-coded PA signal was generated by tuning the pulse duration of individual laser pulses in the time domain. Results show that the PA signal intensity can be enhanced after matched filtering. However, high range side-lobes are still present. The compression filter is an important tool for reducing the range side-lobes, which is subject to further investigation.
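
    A minimal sketch of single-transmission chirp coding with matched-filter pulse compression is given below. The sample rate, bandwidth, echo delay, and noise level are illustrative, and the circularly shifted echo is a simplification; the abstract's laser-specific pulse-duration tuning is not modeled.

      # Sketch of chirp coded excitation: transmit a linear chirp, receive a
      # weak delayed echo in noise, and compress with the matched filter
      # (correlation with the time-reversed chirp).
      import numpy as np
      from scipy.signal import chirp

      fs = 100e6
      t = np.arange(0, 10e-6, 1 / fs)
      tx = chirp(t, f0=1e6, f1=10e6, t1=t[-1], method="linear")

      rng = np.random.default_rng(2)
      rx = 0.1 * np.roll(tx, 250) + rng.normal(scale=0.3, size=len(tx))

      compressed = np.convolve(rx, tx[::-1], mode="full")
      print(np.argmax(compressed) - (len(tx) - 1))   # ~250-sample delay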

  12. Genetic code, hamming distance and stochastic matrices.

    PubMed

    He, Matthew X; Petoukhov, Sergei V; Ricci, Paolo E

    2004-09-01

    In this paper we use the Gray code representation of the genetic code C=00, U=10, G=11 and A=01 (C pairs with G, A pairs with U) to generate a sequence of genetic code-based matrices. In connection with these code-based matrices, we use the Hamming distance to generate a sequence of numerical matrices. We then further investigate the properties of the numerical matrices and show that they are doubly stochastic and symmetric. We determine the frequency distributions of the Hamming distances, building blocks of the matrices, decomposition and iterations of matrices. We present an explicit decomposition formula for the genetic code-based matrix in terms of permutation matrices, which provides a hypercube representation of the genetic code. It is also observed that there is a Hamiltonian cycle in a genetic code-based hypercube. PMID:15294430
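
    The starting construction is easy to reproduce. The sketch below, a minimal instance of the matrices studied, builds the 16×16 Hamming distance matrix of the Gray-coded dinucleotides and verifies that, after scaling each row by its constant sum, the result is doubly stochastic and symmetric.

      # Gray-coded genetic alphabet, dinucleotide Hamming distance matrix,
      # and a doubly-stochastic check.  Every row of H sums to 4 * 2**3 = 32.
      from itertools import product
      import numpy as np

      gray = {"C": "00", "U": "10", "G": "11", "A": "01"}
      dinucs = ["".join(p) for p in product("CUGA", repeat=2)]
      bits = {d: gray[d[0]] + gray[d[1]] for d in dinucs}

      def hamming(x, y):
          return sum(a != b for a, b in zip(x, y))

      H = np.array([[hamming(bits[u], bits[v]) for v in dinucs] for u in dinucs])
      P = H / H.sum(axis=1, keepdims=True)
      print(np.allclose(P.sum(axis=0), 1), np.allclose(P, P.T))   # True True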

  13. Independent power tax strategies

    SciTech Connect

    Smotkin, M.L.; Massie, J.M.

    1994-09-01

    As project opportunities become more competitive in the independent power industry, companies need to be more aggressive in the bidding process for new projects. Project owners are traditionally reluctant to lower their rate of return, but there are other ways to remain competitive. For most projects, advance planning for the timing of tax depreciation and the amortization of start-up costs can have a significant effect on a project's rate of return and ultimately can be the difference between a successful project and a lost bid.

  14. Independent component analysis of parameterized ECG signals.

    PubMed

    Tanskanen, Jarno M A; Viik, Jari J; Hyttinen, Jari A K

    2006-01-01

    Independent component analysis (ICA) of measured signals yields the independent sources, given certain fulfilled requirements. Properly parameterized signals provide a better view to the considered system aspects, while reducing the amount of data. It is little acknowledged that appropriately parameterized signals may be subjected to ICA, yielding independent components (ICs) displaying more clearly the investigated properties of the sources. In this paper, we propose ICA of parameterized signals, and demonstrate the concept with ICA of ST and R parameterizations of electrocardiogram (ECG) signals from ECG exercise test measurements from two coronary artery disease (CAD) patients. PMID:17945912
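
    A minimal sketch of the proposed pipeline is given below, with hypothetical per-beat ST-level and R-amplitude parameters standing in for the paper's parameterizations and scikit-learn's FastICA as a stand-in ICA implementation; the latent sources and mixing are synthetic.

      # Sketch: run ICA on parameterized signals (per-beat parameters) rather
      # than on the raw ECG samples.
      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(3)
      n_beats = 500

      # Two hypothetical latent sources, e.g. a slow ischaemic trend and a
      # respiration-like modulation.
      sources = np.c_[np.linspace(0, 1, n_beats),
                      np.sin(np.linspace(0, 30, n_beats))]

      # Each measured parameter (ST level, R amplitude, per lead) is a noisy
      # mixture of the latent sources.
      mixing = rng.normal(size=(2, 4))
      params = sources @ mixing + 0.05 * rng.normal(size=(n_beats, 4))

      ica = FastICA(n_components=2, random_state=0)
      ics = ica.fit_transform(params)               # one IC per latent source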

  15. Cary Potter on Independent Education

    ERIC Educational Resources Information Center

    Potter, Cary

    1978-01-01

    Cary Potter was President of the National Association of Independent Schools from 1964-1978. As he leaves NAIS he gives his views on education, on independence, on the independent school, on public responsibility, on choice in a free society, on educational change, and on the need for collective action by independent schools. (Author/RK)

  16. Reviewing the Challenges and Opportunities Presented by Code Switching and Mixing in Bangla

    ERIC Educational Resources Information Center

    Hasan, Md. Kamrul; Akhand, Mohd. Moniruzzaman

    2015-01-01

    This paper investigates the issues related to code-switching/code-mixing in an ESL context. Some preliminary data on Bangla-English code-switching/code-mixing has been analyzed in order to determine which structural pattern of code-switching/code-mixing is predominant in different social strata. This study also explores the relationship of…

  17. Asymmetric quantum convolutional codes

    NASA Astrophysics Data System (ADS)

    La Guardia, Giuliano G.

    2016-01-01

    In this paper, we construct the first families of asymmetric quantum convolutional codes (AQCCs). These new AQCCs are constructed by means of the CSS-type construction applied to suitable families of classical convolutional codes, which are also constructed here. The new codes have non-catastrophic generator matrices, and they have great asymmetry. Since our constructions are performed algebraically, i.e. we develop general algebraic methods and properties to perform the constructions, it is possible to derive several families of such codes and not only codes with specific parameters. Additionally, several different types of such codes are obtained.

  18. Independent task Fourier filters

    NASA Astrophysics Data System (ADS)

    Caulfield, H. John

    2001-11-01

    Since the early 1960s, a major part of optical computing systems has been Fourier pattern recognition, which takes advantage of high speed filter changes to enable powerful nonlinear discrimination in 'real time.' Because each filter has a task quite independent of the tasks of the other filters, they can be applied and evaluated in parallel or, in a simple approach I describe, in sequence very rapidly. Thus I use the name ITFF (independent task Fourier filter). These filters can also break very complex discrimination tasks into easily handled parts, so the wonderful space invariance properties of Fourier filtering need not be sacrificed to achieve high discrimination and good generalizability even for ultracomplex discrimination problems. The training procedure proceeds sequentially, as the task for a given filter is defined a posteriori by declaring it to be the discrimination of particular members of set A from all members of set B with sufficient margin. That is, we set the threshold to achieve the desired margin and note the A members discriminated by that threshold. Discriminating those A members from all members of B becomes the task of that filter. Those A members are then removed from the set A, so no other filter will be asked to perform that already accomplished task.

  19. Multispectral photoacoustic coded excitation using pseudorandom codes

    NASA Astrophysics Data System (ADS)

    Beckmann, Martin F.; Friedrich, Claus-Stefan; Mienkina, Martin P.; Gerhardt, Nils C.; Hofmann, Martin R.; Schmitz, Georg

    2012-02-01

    Photoacoustic imaging (PAI) combines high ultrasound resolution with optical contrast. Laser-generated ultrasound is potentially beneficial for cancer detection, blood oxygenation imaging, and molecular imaging. PAI is generally performed using solid state Nd:YAG lasers in combination with optical parametric oscillators. An alternative approach uses laser diodes with higher pulse repetition rates but lower power. Thus, improvement in signal-to-noise ratio (SNR) is a key step towards applying laser diodes in PAI. To receive equivalent image quality using laser diodes as with Nd:YAG lasers, the lower power must be compensated by averaging, which can be enhanced through coded excitation. In principle, perfect binary sequences such as orthogonal Golay codes can be used for this purpose when acquiring data at multiple wavelengths. On the other hand it was shown for a single wavelength that sidelobes can remain invisible even if imperfect sequences are used. Moreover, SNR can be further improved by using an imperfect sequence compared to Golay codes. Here, we show that pseudorandom sequences are a good choice for multispectral photoacoustic coded excitation (MSPACE). Pseudorandom sequences based upon maximal length shift register sequences (m-sequences) are introduced and analyzed for the purpose of use in MSPACE. Their gain in SNR exceeds that of orthogonal Golay codes for finite code lengths. Artefacts are introduced, but may remain invisible depending on SNR and code length.
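
    An m-sequence source of the kind the abstract builds on can be sketched directly; the 5-bit tap set and the Fibonacci-style shift register below are one standard maximal-length choice, and the autocorrelation check shows the nearly ideal two-valued property that makes these sequences attractive for coded excitation.

      # Maximal-length LFSR (taps [5, 3], period 31); the +/-1-mapped output
      # has cyclic autocorrelation 31 at zero lag and exactly -1 elsewhere.
      import numpy as np

      def m_sequence(taps, nbits):
          state = [1] * nbits
          out = []
          for _ in range(2**nbits - 1):
              out.append(state[-1])
              fb = 0
              for t in taps:
                  fb ^= state[t - 1]
              state = [fb] + state[:-1]
          return np.array(out)

      seq = 1 - 2 * m_sequence([5, 3], 5)            # {0,1} -> {+1,-1}
      ac = np.array([np.dot(seq, np.roll(seq, k)) for k in range(len(seq))])
      print(ac[0], set(ac[1:]))                      # 31 and {-1}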

  1. Maximal dinucleotide comma-free codes.

    PubMed

    Fimmel, Elena; Strüngmann, Lutz

    2016-01-21

    The problem of retrieval and maintenance of the correct reading frame plays a significant role in RNA transcription. Circular codes, and especially comma-free codes, can help to understand the underlying mechanisms of error-detection in this process. In recent years much attention has been paid to the investigation of trinucleotide circular codes (see, for instance, Fimmel et al., 2014; Fimmel and Strüngmann, 2015a; Michel and Pirillo, 2012; Michel et al., 2012, 2008), while dinucleotide codes had been touched on only marginally, even though dinucleotides are associated to important biological functions. Recently, all maximal dinucleotide circular codes were classified (Fimmel et al., 2015; Michel and Pirillo, 2013). The present paper studies maximal dinucleotide comma-free codes and their close connection to maximal dinucleotide circular codes. We give a construction principle for such codes and provide a graphical representation that allows them to be visualized geometrically. Moreover, we compare the results for dinucleotide codes with the corresponding situation for trinucleotide maximal self-complementary C(3)-codes. Finally, the results obtained are discussed with respect to Crick's hypothesis about frame-shift-detecting codes without commas. PMID:26562635
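
    For dinucleotide codes the comma-free condition is small enough to check exhaustively: for every ordered pair of codewords uv, the frame-shifted window straddling their boundary must not itself be a codeword. The sketch below implements exactly that check; the example codes are illustrative, not the maximal codes classified in the paper.

      # Comma-free check for a dinucleotide code X: for all u, v in X, the
      # misframed window u[1] + v[0] inside the concatenation uv must not
      # itself be a codeword.
      from itertools import product

      def is_comma_free(code):
          code = set(code)
          return all(u[1] + v[0] not in code for u, v in product(code, repeat=2))

      print(is_comma_free({"AC", "GC"}))   # True: no misframed window decodes
      print(is_comma_free({"AC", "CA"}))   # False: "AC|AC" misframes as "CA"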

  2. Investigation of the Performance of Various CVD Diamond Crystal Qualities for the Measurement of Radiation Doses from a Low Energy Mammography X-Ray Beam, Compared with MC Code (PENELOPE) Calculations

    NASA Astrophysics Data System (ADS)

    Zakari, Y. I.; Mavunda, R. D.; Nam, T. L.; Keddy, R. J.

    The tissue equivalence of diamond allows for accurate radiation dose determination without large corrections for different attenuation values in biological tissue, but its low Z value limits this advantage to lower-energy photons, such as those in mammography X-ray beams. This paper assays the performance of nine chemical vapour deposition (CVD) diamonds for use as radiation-sensing material. The specimens, fabricated in wafer form, are classified as detector grade, optical grade, and single crystal. It is well known that the presence of defects in diamonds, including CVD specimens, not only dictates but also affects the response of diamond to radiation in different ways. In this investigation, tools such as electron spin resonance (ESR), thermoluminescence (TL), Raman spectroscopy, and ultraviolet (UV) spectroscopy were used to probe each of the samples. The linearity, sensitivity, and other characteristics of the detector response to photon interactions were analyzed from the I-V characteristics. The diamonds, categorized into four each of the so-called detector and optical grades plus a single-crystal CVD specimen, were exposed to a low X-ray peak voltage range (22 to 27 kVp) with trans-crystal polarizing fields of 0.4 kV.cm-1, 0.66 kV.cm-1, and 0.8 kV.cm-1. The presentation discusses the presence of defects identifiable by the techniques used and correlates the radiation performance of the three types of crystals to their presence. The choice of a wafer as either a spectrometer or an X-ray dosimeter within the selected energy range was made. The analyses were validated with the Monte Carlo code PENELOPE.

  3. Multiple Turbo Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Pollara, F.

    1995-01-01

    A description is given of multiple turbo codes and a suitable decoder structure derived from an approximation to the maximum a posteriori probability (MAP) decision rule, which is substantially different from the decoder for two-code-based encoders.

  4. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  5. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang (Athens, GA); Ljungdahl, Lars G. (Athens, GA); Chen, Huizhong (Lawrenceville, GA)

    2001-01-01

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  6. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang (Athens, GA); Ljungdahl, Lars G. (Athens, GA); Chen, Huizhong (Lawrenceville, GA)

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  7. Progress in The Semantic Analysis of Scientific Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.
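
    As a toy illustration of the idea, and not the system described here, the sketch below attaches a semantic declaration (a physical unit) to a few primitive variables and lets a tiny "parser" flag a dimensionally inconsistent formula; the variable names and unit table are hypothetical.

      # Toy illustration of semantic declarations: tag primitive variables
      # with units and check that v = dx / dt is dimensionally consistent.
      units = {"dx": "m", "dt": "s", "v": "m/s"}

      def ratio_consistent(result, num, den):
          # An "expert parser" for one idiom: result = num / den.
          return units[result] == f"{units[num]}/{units[den]}"

      print(ratio_consistent("v", "dx", "dt"))   # True
      units["v"] = "m/s**2"
      print(ratio_consistent("v", "dx", "dt"))   # False: a semantic error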

  8. Frame independent cosmological perturbations

    SciTech Connect

    Prokopec, Tomislav; Weenink, Jan E-mail: j.g.weenink@uu.nl

    2013-09-01

    We compute the third order gauge invariant action for scalar-graviton interactions in the Jordan frame. We demonstrate that the gauge invariant action for scalar and tensor perturbations on one physical hypersurface only differs from that on another physical hypersurface via terms proportional to the equation of motion and boundary terms, such that the evolution of non-Gaussianity may be called unique. Moreover, we demonstrate that the gauge invariant curvature perturbation and graviton on uniform field hypersurfaces in the Jordan frame are equal to their counterparts in the Einstein frame. These frame independent perturbations are therefore particularly useful in relating results in different frames at the perturbative level. On the other hand, the field perturbation and graviton on uniform curvature hypersurfaces in the Jordan and Einstein frame are non-linearly related, as are their corresponding actions and n-point functions.

  9. STEEP32 computer code

    NASA Technical Reports Server (NTRS)

    Goerke, W. S.

    1972-01-01

    A manual is presented as an aid in using the STEEP32 code. The code is the EXEC 8 version of the STEEP code (STEEP is an acronym for shock two-dimensional Eulerian elastic plastic). The major steps in a STEEP32 run are illustrated in a sample problem. There is a detailed discussion of the internal organization of the code, including a description of each subroutine.

  10. Performance comparison of combined ECC/RLL codes

    NASA Technical Reports Server (NTRS)

    French, C.; Lin, Y.

    1990-01-01

    In this paper, we present a performance comparison of several combined error correcting/run-length limited (ECC/RLL) codes created by concatenating a convolutional code with a run-length limited code. In each case, encoding and decoding are accomplished using a single trellis based on the combined code. Half of the codes under investigation use conventional (d,k) run-length limited codes, where d is the minimum and k is the maximum allowable run of 0's between 1's. The other half of the combined codes use a special class of (d,k) codes known as distance preserving codes. These codes have the property that pairwise Hamming distances out of the (d,k) encoder are at least as large as the corresponding distances into the encoder (i.e., the codes preserve distance). Thus a combined code, created using a convolutional code concatenated with a distance preserving (d,k) code, will have a free distance (dfree) no smaller than the free distance of the original convolutional code. It should be noted that this does not hold if the (d,k) code is not distance preserving. A computer simulation is used to compare the performance of these two types of codes over the binary symmetric channel for various (d,k) constraints, rates, free distances, and numbers of states. Of particular interest for magnetic recording applications are codes with run-length constraints (1,3), (1,7), and (2,7).
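
    The (d,k) constraint itself is simple to state in code; the helper below checks that between any two 1's a binary sequence has at least d and at most k 0's, with runs before the first and after the last 1 only bounded above by k (conventions for the boundary runs vary).

      # Check the (d, k) run-length constraint on a binary sequence.
      def satisfies_dk(bits, d, k):
          run = 0
          seen_one = False
          for b in bits:
              if b == 0:
                  run += 1
                  if run > k:
                      return False
              else:
                  if seen_one and run < d:
                      return False
                  seen_one, run = True, 0
          return True

      print(satisfies_dk([0, 1, 0, 0, 1, 0, 0, 0, 1], d=2, k=3))   # True
      print(satisfies_dk([0, 1, 1, 0, 0, 1], d=2, k=3))            # False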

  11. Diagnosis code assignment: models and evaluation metrics

    PubMed Central

    Perotte, Adler; Pivovarov, Rimma; Natarajan, Karthik; Weiskopf, Nicole; Wood, Frank; Elhadad, Noémie

    2014-01-01

    Background and objective The volume of healthcare data is growing rapidly with the adoption of health information technology. We focus on automated ICD9 code assignment from discharge summary content and methods for evaluating such assignments. Methods We study ICD9 diagnosis codes and discharge summaries from the publicly available Multiparameter Intelligent Monitoring in Intensive Care II (MIMIC II) repository. We experiment with two coding approaches: one that treats each ICD9 code independently of each other (flat classifier), and one that leverages the hierarchical nature of ICD9 codes into its modeling (hierarchy-based classifier). We propose novel evaluation metrics, which reflect the distances among gold-standard and predicted codes and their locations in the ICD9 tree. Experimental setup, code for modeling, and evaluation scripts are made available to the research community. Results The hierarchy-based classifier outperforms the flat classifier with F-measures of 39.5% and 27.6%, respectively, when trained on 20,533 documents and tested on 2282 documents. While recall is improved at the expense of precision, our novel evaluation metrics show a more refined assessment: for instance, the hierarchy-based classifier identifies the correct sub-tree of gold-standard codes more often than the flat classifier. Error analysis reveals that gold-standard codes are not perfect, and as such the recall and precision are likely underestimated. Conclusions Hierarchy-based classification yields better ICD9 coding than flat classification for MIMIC patients. Automated ICD9 coding is an example of a task for which data and tools can be shared and for which the research community can work together to build on shared models and advance the state of the art. PMID:24296907
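
    The hierarchy-aware evaluation idea can be made concrete with a small sketch. The tiny parent map below is hypothetical and this is not the paper's metric, but it shows how a predicted code can be scored by its tree distance from the gold-standard code rather than by exact match.

      # Sketch of a hierarchy-aware score: distance between two codes in a
      # (hypothetical, tiny) ICD9-like tree via their lowest common ancestor.
      parents = {"250.01": "250.0", "250.02": "250.0", "250.0": "250", "250": "ROOT"}

      def ancestors(code):
          chain = [code]
          while chain[-1] != "ROOT":
              chain.append(parents[chain[-1]])
          return chain

      def tree_distance(a, b):
          pa, pb = ancestors(a), ancestors(b)
          common = next(x for x in pa if x in pb)    # lowest common ancestor
          return pa.index(common) + pb.index(common)

      print(tree_distance("250.01", "250.02"))       # 2: siblings under 250.0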

  12. Morse Code Activity Packet.

    ERIC Educational Resources Information Center

    Clinton, Janeen S.

    This activity packet offers simple directions for setting up a Morse Code system appropriate to interfacing with any of several personal computer systems. Worksheets are also included to facilitate teaching Morse Code to persons with visual or other disabilities including blindness, as it is argued that the code is best learned auditorily. (PB)

  13. Structural welding code - steel

    SciTech Connect

    Not Available

    1983-01-01

    This code covers welding requirements applicable to any type of welded structure and is designed to be used in conjunction with any complementary code or specification for the design and construction of steel structures. The 1982 edition contains new provisions, revisions of current material, a new stud welding section, and some rearrangement of the 1981 code.

  14. Improvement in elastographic signal-to-noise ratio and resolution using coded excitation in elastography

    NASA Astrophysics Data System (ADS)

    Souchon, Remi; Bera, Jean-Christophe; Pousse, Agnes; Chapelon, Jean-Yves

    2001-05-01

    Coded excitation has been used in conventional sonography to improve the sonographic signal-to-noise ratio independently of resolution. In elastography, a tradeoff exists between spatial resolution and the elastographic signal-to-noise ratio (SNRe). In the present work, the use of coded excitation was investigated to remove this ambiguity in elastography. Both numerical simulations and phantom experiments were carried out to estimate the SNRe in a homogeneous material and the resolution in a material containing calibrated inclusions. These results were compared with those obtained using conventional pulse excitation. Various codes (Golay, Barker, chirp) and code lengths were tested. The numerical simulations used a simple 1D backscattering model to show the theoretical effects of coded excitation. Experiments were carried out using a more realistic setup based on a sector-scan imaging probe. In the absence of sonographic noise, simulations showed that codes induced only a slight decrease in SNRe at no cost in resolution. When sonographic noise was added into the model, a large improvement in SNRe was obtained at constant resolution. The experimental results corroborated these findings. [Work supported in part by National Cancer Institute (USA) Program Project Grant P01-CA64597.]
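
    One reason Golay codes appear in this list is their exact sidelobe cancellation, which the sketch below demonstrates: a complementary pair built by the standard recursion has autocorrelations whose sum is a perfect delta, at the cost of requiring two transmissions. The pair length is an illustrative choice.

      # Golay complementary pair via the standard recursion a' = a||b,
      # b' = a||(-b); summed autocorrelations are 2N at zero lag, 0 elsewhere.
      import numpy as np

      def golay_pair(n):
          a, b = np.array([1.0]), np.array([1.0])
          for _ in range(n):
              a, b = np.concatenate([a, b]), np.concatenate([a, -b])
          return a, b

      a, b = golay_pair(4)                 # length-16 complementary pair
      ra = np.correlate(a, a, mode="full")
      rb = np.correlate(b, b, mode="full")
      total = ra + rb                      # 32 at zero lag, exactly 0 elsewhere
      print(total.astype(int))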

  15. Description of ground motion data processing codes: Volume 3

    SciTech Connect

    Sanders, M.L.

    1988-02-01

    Data processing codes developed to process ground motion at the Nevada Test Site for the Weapons Test Seismic Investigations Project are used today as part of the program to process ground motion records for the Nevada Nuclear Waste Storage Investigations Project. The work contained in this report documents and lists codes and verifies the "PSRV" code. 39 figs.

  16. Applications of Coding in Network Communications

    ERIC Educational Resources Information Center

    Chang, Christopher SungWook

    2012-01-01

    This thesis uses the tool of network coding to investigate fast peer-to-peer file distribution, anonymous communication, robust network construction under uncertainty, and prioritized transmission. In a peer-to-peer file distribution system, we use a linear optimization approach to show that the network coding framework significantly simplifies…

  17. Source Code Plagiarism--A Student Perspective

    ERIC Educational Resources Information Center

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  18. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  1. To code, or not to code?

    PubMed

    Parman, Cindy C

    2003-01-01

    In summary, it is also important to remember the hidden rules: 1) Just because there is a code in the manual, it doesn't mean it can be billed to insurance, or that once billed, it will be reimbursed. 2) Just because a code was paid once, doesn't mean it will ever be paid again--or that you get to keep the money! 3) The healthcare provider is responsible for knowing all the rules, but then it is impossible to know all the rules! And not knowing all the rules can lead to fines, penalties or worse! New codes are added annually (quarterly for OPPS), definitions of existing codes are changed, and it is the responsibility of healthcare providers to keep abreast of all coding updates and changes. In addition, the federal regulations are constantly updated and changed, making compliant billing a moving target. All healthcare entities should focus on complete documentation, the adherence to authoritative coding guidance and the provision of detailed explanations and specialty education to the payor, as necessary. PMID:14619987

  2. Bit-Wise Arithmetic Coding For Compression Of Data

    NASA Technical Reports Server (NTRS)

    Kiely, Aaron

    1996-01-01

    Bit-wise arithmetic coding is data-compression scheme intended especially for use with uniformly quantized data from source with Gaussian, Laplacian, or similar probability distribution function. Code words of fixed length, and bits treated as being independent. Scheme serves as means of progressive transmission or of overcoming buffer-overflow or rate constraint limitations sometimes arising when data compression used.

  3. Coding for Electronic Mail

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Lee, J. J.

    1986-01-01

    Scheme for coding facsimile messages promises to reduce data transmission requirements to one-tenth current level. Coding scheme paves way for true electronic mail in which handwritten, typed, or printed messages or diagrams sent virtually instantaneously - between buildings or between continents. Scheme, called Universal System for Efficient Electronic Mail (USEEM), uses unsupervised character recognition and adaptive noiseless coding of text. Image quality of resulting delivered messages improved over messages transmitted by conventional coding. Coding scheme compatible with direct-entry electronic mail as well as facsimile reproduction. Text transmitted in this scheme automatically translated to word-processor form.

  4. Astronomy education and the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, Robert J.

    2016-01-01

    The Astrophysics Source Code Library (ASCL) is an online registry of source codes used in refereed astrophysics research. It currently lists nearly 1,200 codes and covers all aspects of computational astrophysics. How can this resource be of use to educators and to the graduate students they mentor? The ASCL serves as a discovery tool for codes that can be used for one's own research. Graduate students can also investigate existing codes to see how common astronomical problems are approached numerically in practice, and use these codes as benchmarks for their own solutions to these problems. Further, they can deepen their knowledge of software practices and techniques through examination of others' codes.

  5. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  6. DLLExternalCode

    SciTech Connect

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
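
    A language-neutral sketch of the coupling pattern (write inputs to a file, run the external application, read outputs back) is given below; the file names and executable are hypothetical, and GoldSim's actual DLL calling convention is not reproduced.

      # Sketch of the file-based coupling pattern described above.
      import subprocess

      def run_external(inputs, exe="external_model.exe"):
          with open("model.in", "w") as f:
              f.write("\n".join(str(x) for x in inputs))   # write inputs
          subprocess.run([exe, "model.in", "model.out"], check=True)
          with open("model.out") as f:
              return [float(line) for line in f]           # read outputs back

      # outputs = run_external([1.0, 2.5, 0.3])   # needs the external program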

  7. DLLExternalCode

    Energy Science and Technology Software Center (ESTSC)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.

  8. Wear Independent Similarity.

    PubMed

    Steele, Adam; Davis, Alexander; Kim, Joohyung; Loth, Eric; Bayer, Ilker S

    2015-06-17

    This study presents a new factor that can be used to design materials where desired surface properties must be retained under in-system wear and abrasion. To demonstrate this factor, a synthetic nonwetting coating is presented that retains chemical and geometric performance as material is removed under multiple wear conditions: a coarse vitrified abradant (similar to sanding), a smooth abradant (similar to rubbing), and a mild abradant (a blend of sanding and rubbing). With this approach, such a nonwetting material displays unprecedented mechanical durability while maintaining desired performance under a range of demanding conditions. This performance, herein termed wear independent similarity performance (WISP), is critical because multiple mechanisms and/or modes of wear can be expected to occur in many typical applications, e.g., combinations of abrasion, rubbing, contact fatigue, weathering, particle impact, etc. Furthermore, these multiple wear mechanisms tend to quickly degrade a novel surface's unique performance, and thus many promising surfaces and materials never scale out of research laboratories. Dynamic goniometry and scanning electron microscopy results presented herein provide insight into these underlying mechanisms, which may also be applied to other coatings and materials. PMID:26018058

  9. Studying the Independent School Library

    ERIC Educational Resources Information Center

    Cahoy, Ellysa Stern; Williamson, Susan G.

    2008-01-01

    In 2005, the American Association of School Librarians' Independent Schools Section conducted a national survey of independent school libraries. This article analyzes the results of the survey, reporting specialized data and information regarding independent school library budgets, collections, services, facilities, and staffing. Additionally, the…

  10. cncRNAs: Bi-functional RNAs with protein coding and non-coding functions

    PubMed Central

    Kumari, Pooja; Sampath, Karuna

    2015-01-01

    For many decades, the major function of mRNA was thought to be to provide protein-coding information embedded in the genome. The advent of high-throughput sequencing has led to the discovery of pervasive transcription of eukaryotic genomes and opened the world of RNA-mediated gene regulation. Many regulatory RNAs have been found to be incapable of protein coding and are hence termed as non-coding RNAs (ncRNAs). However, studies in recent years have shown that several previously annotated non-coding RNAs have the potential to encode proteins, and conversely, some coding RNAs have regulatory functions independent of the protein they encode. Such bi-functional RNAs, with both protein coding and non-coding functions, which we term as ‘cncRNAs’, have emerged as new players in cellular systems. Here, we describe the functions of some cncRNAs identified from bacteria to humans. Because the functions of many RNAs across genomes remains unclear, we propose that RNAs be classified as coding, non-coding or both only after careful analysis of their functions. PMID:26498036

  11. 76 FR 41585 - Regulation and Independent Regulatory Agencies

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-14

    …integration and innovation, flexible approaches, and science. To the extent permitted by law, independent… THE WHITE HOUSE, July 11, 2011. [FR Doc. 2011-17953 Filed 7-13-11; 11:15 am] Billing code 3195-W1-P. From the Federal Register Online via the Government Publishing Office, Vol. 76, Thursday, No. …

  12. DRG benchmarking study establishes national coding norms.

    PubMed

    Vaul, J H

    1998-05-01

    With the increase in fraud and abuse investigations, healthcare financial managers should examine their organization's medical record coding procedures. The Federal government and third-party payers are looking specifically for improper billing of outpatient services, unbundling of procedures to increase payment, assigning higher-paying DRG codes for inpatient claims, and other abuses. A recent benchmarking study of Medicare Provider Analysis and Review (MEDPAR) data has established national norms for hospital coding and case mix based on DRGs and has revealed the majority of atypical coding cases fall into six DRG pairs. Organizations with a greater percentage of atypical cases--those more likely to be scrutinized by Federal investigators--will want to conduct suitable review and be sure appropriate documentation exists to justify the coding. PMID:10179440

  13. Modular optimization code package: MOZAIK

    NASA Astrophysics Data System (ADS)

    Bekar, Kursat B.

    This dissertation addresses the development of a modular optimization code package, MOZAIK, for geometric shape optimization problems in nuclear engineering applications. MOZAIK's first mission, determining the optimal shape of the D2O moderator tank for the current and new beam tube configurations for the Penn State Breazeale Reactor's (PSBR) beam port facility, is used to demonstrate its capabilities and test its performance. MOZAIK was designed as a modular optimization sequence including three primary independent modules: the initializer, the physics and the optimizer, each having a specific task. By using fixed interface blocks among the modules, the code attains its two most important characteristics: generic form and modularity. The benefit of this modular structure is that the contents of the modules can be switched depending on the requirements of accuracy, computational efficiency, or compatibility with the other modules. Oak Ridge National Laboratory's discrete ordinates transport code TORT was selected as the transport solver in the physics module of MOZAIK, and two different optimizers, Min-max and Genetic Algorithms (GA), were implemented in the optimizer module of the code package. A distributed memory parallelism was also applied to MOZAIK via MPI (Message Passing Interface) to execute the physics module concurrently on a number of processors for various states in the same search. Moreover, dynamic scheduling was enabled to enhance load balance among the processors while running MOZAIK's physics module thus improving the parallel speedup and efficiency. In this way, the total computation time consumed by the physics module is reduced by a factor close to M, where M is the number of processors. This capability also encourages the use of MOZAIK for shape optimization problems in nuclear applications because many traditional codes related to radiation transport do not have parallel execution capability. A set of computational models based on the existing beam port configuration of the Penn State Breazeale Reactor (PSBR) was designed to test and validate the code package in its entirety, as well as its modules separately. The selected physics code, TORT, and the requisite data such as source distribution, cross-sections, and angular quadratures were comprehensively tested with these computational models. The modular feature and the parallel performance of the code package were also examined using these computational models. Another outcome of these computational models is to provide the necessary background information for determining the optimal shape of the D2O moderator tank for the new beam tube configurations for the PSBR's beam port facility. The first mission of the code package was completed successfully by determining the optimal tank shape which was sought for the current beam tube configuration and two new beam tube configurations for the PSBR's beam port facility. The performance of the new beam tube configurations and the current beam tube configuration were evaluated with the new optimal tank shapes determined by MOZAIK. Furthermore, the performance of the code package with the two different optimization strategies were analyzed showing that while GA is capable of achieving higher thermal beam intensity for a given beam tube setup, Min-max produces an optimal shape that is more amenable to machining and manufacturing. 
The optimal D2O moderator tank shape determined by MOZAIK with the current beam port configuration improves the thermal neutron beam intensity at the beam port exit end by 9.5%. Similarly, the new tangential beam port configuration (beam port near the core interface) with the optimal moderator tank shape determined by MOZAIK improves the thermal neutron beam intensity by a factor of 1.4 compared to the existing beam port configuration (with the existing D2O moderator tank). Another new beam port configuration, radial beam tube configuration, with the optimal moderator tank shape increases the thermal neutron beam intensity at the beam tube exit by a factor of 1.8. All these results
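    The modular division of labor described above can be pictured with a short skeleton. The following is a minimal, hypothetical Python sketch, not MOZAIK itself: the module names follow the abstract, a toy objective stands in for a TORT transport solve, and process-pool parallelism stands in for the MPI-distributed physics evaluations.

      # Minimal sketch of a MOZAIK-style modular optimization loop (hypothetical
      # interfaces). The three modules exchange only fixed data structures, so any
      # one of them can be swapped out; candidate evaluations run in parallel,
      # standing in for the MPI-distributed physics module.
      from concurrent.futures import ProcessPoolExecutor
      import random

      def initializer(n):
          """Produce an initial population of candidate shapes (3 parameters each)."""
          return [[random.uniform(0.1, 1.0) for _ in range(3)] for _ in range(n)]

      def physics(shape):
          """Stand-in for a transport solve: score one candidate shape."""
          return -sum((s - 0.5) ** 2 for s in shape)

      def optimizer(population, scores, n_keep=4):
          """Toy GA step: keep the best candidates and mutate them."""
          ranked = [p for _, p in sorted(zip(scores, population), reverse=True)]
          parents = ranked[:n_keep]
          children = [[g + random.gauss(0, 0.05) for g in random.choice(parents)]
                      for _ in range(len(population) - n_keep)]
          return parents + children

      if __name__ == "__main__":
          population = initializer(16)
          for generation in range(20):
              with ProcessPoolExecutor() as pool:   # concurrent physics evaluations
                  scores = list(pool.map(physics, population))
              population = optimizer(population, scores)
          print(max(scores))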

  14. Adaptive entropy coded subband coding of images.

    PubMed

    Kim, Y H; Modestino, J W

    1992-01-01

    The authors describe a design approach, called 2-D entropy-constrained subband coding (ECSBC), based upon recently developed 2-D entropy-constrained vector quantization (ECVQ) schemes. The output indexes of the embedded quantizers are further compressed by use of noiseless entropy coding schemes, such as Huffman or arithmetic codes, resulting in variable-rate outputs. Depending upon the specific configurations of the ECVQ and the ECPVQ over the subbands, many different types of SBC schemes can be derived within the generic 2-D ECSBC framework. Among these, the authors concentrate on three representative types of 2-D ECSBC schemes and provide relative performance evaluations. They also describe an adaptive buffer instrumented version of 2-D ECSBC, called 2-D ECSBC/AEC, for use with fixed-rate channels which completely eliminates buffer overflow/underflow problems. This adaptive scheme achieves performance quite close to the corresponding ideal 2-D ECSBC system. PMID:18296138
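    The core operation shared by these schemes is entropy-constrained quantization: each input is mapped to the codeword that minimizes a Lagrangian cost trading distortion against code length. A minimal scalar illustration, with an invented codebook, probabilities, and Lagrange multiplier rather than anything from the paper:

      # Sketch of the entropy-constrained quantization step at the heart of
      # ECVQ/ECSBC (generic scalar illustration, not the authors' 2-D subband
      # implementation): pick the codeword minimizing distortion + lambda * rate,
      # where the rate of a codeword is its ideal code length -log2(p).
      import math, random

      codebook = [-1.5, -0.5, 0.0, 0.5, 1.5]       # assumed reproduction levels
      probs = [0.05, 0.2, 0.5, 0.2, 0.05]          # assumed codeword probabilities
      lam = 0.1                                    # Lagrange multiplier (rate weight)

      def ec_quantize(x):
          costs = [(x - c) ** 2 - lam * math.log2(p) for c, p in zip(codebook, probs)]
          return min(range(len(codebook)), key=costs.__getitem__)

      samples = [random.gauss(0, 1) for _ in range(5)]
      print([(round(x, 2), codebook[ec_quantize(x)]) for x in samples])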

  15. Melanism in Peromyscus Is Caused by Independent Mutations in Agouti

    PubMed Central

    Kingsley, Evan P.; Manceau, Marie; Wiley, Christopher D.; Hoekstra, Hopi E.

    2009-01-01

    Identifying the molecular basis of phenotypes that have evolved independently can provide insight into the ways genetic and developmental constraints influence the maintenance of phenotypic diversity. Melanic (darkly pigmented) phenotypes in mammals provide a potent system in which to study the genetic basis of naturally occurring mutant phenotypes because melanism occurs in many mammals, and the mammalian pigmentation pathway is well understood. Spontaneous alleles of a few key pigmentation loci are known to cause melanism in domestic or laboratory populations of mammals, but in natural populations, mutations at one gene, the melanocortin-1 receptor (Mc1r), have been implicated in the vast majority of cases, possibly due to its minimal pleiotropic effects. To investigate whether mutations in this or other genes cause melanism in the wild, we investigated the genetic basis of melanism in the rodent genus Peromyscus, in which melanic mice have been reported in several populations. We focused on two genes known to cause melanism in other taxa, Mc1r and its antagonist, the agouti signaling protein (Agouti). While variation in the Mc1r coding region does not correlate with melanism in any population, in a New Hampshire population, we find that a 125-kb deletion, which includes the upstream regulatory region and exons 1 and 2 of Agouti, results in a loss of Agouti expression and is perfectly associated with melanic color. In a second population from Alaska, we find that a premature stop codon in exon 3 of Agouti is associated with a similar melanic phenotype. These results show that melanism has evolved independently in these populations through mutations in the same gene, and suggest that melanism produced by mutations in genes other than Mc1r may be more common than previously thought. PMID:19649329

  16. The Comparative Performance of Conditional Independence Indices

    ERIC Educational Resources Information Center

    Kim, Doyoung; De Ayala, R. J.; Ferdous, Abdullah A.; Nering, Michael L.

    2011-01-01

    To realize the benefits of item response theory (IRT), one must have model-data fit. One facet of a model-data fit investigation involves assessing the tenability of the conditional item independence (CII) assumption. In this Monte Carlo study, the comparative performance of 10 indices for identifying conditional item dependence is assessed. The…

  17. Narrative compression coding for a channel with errors

    NASA Astrophysics Data System (ADS)

    Bond, James W.

    1988-01-01

    Data compression codes offer the possibility of improving the throughput of existing communication systems in the near term. This study was undertaken to determine if data compression codes could be utilized to provide message compression in a channel with up to a 0.10 bit error rate. The data compression capabilities of codes were investigated by estimating the average number of bits-per-character required to transmit narrative files. The performance of the codes in a channel with errors (a noisy channel) was investigated in terms of the average numbers of characters decoded in error and of characters printed in error per bit error. Results were obtained by encoding four narrative files, which were resident on an IBM-PC and use a 58 character set. The study focused on Huffman codes and suffix/prefix comma-free codes. Other data compression codes, in particular, block codes and some simple variants of block codes, are briefly discussed to place the study results in context. Comma-free codes were found to have the most promising data compression because error propagation due to bit errors is limited to a few characters for these codes. A technique was found to identify a suffix/prefix comma-free code giving nearly the same data compression as a Huffman code with much less error propagation than the Huffman codes. Greater data compression can be achieved through the use of comma-free code-word assignments based on conditional probabilities of character occurrence.
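    The bits-per-character measurement described above is easy to reproduce in spirit. A small sketch, using a toy corpus rather than the original IBM-PC narrative files, that builds a Huffman code and reports its average code length:

      # Estimate the average bits-per-character a Huffman code achieves on a text,
      # mirroring the kind of measurement described above (toy corpus).
      import heapq
      from collections import Counter

      def huffman_lengths(freqs):
          """Return a code length for every symbol via the standard heap construction."""
          heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freqs.items())]
          heapq.heapify(heap)
          tie = len(heap)
          while len(heap) > 1:
              f1, _, d1 = heapq.heappop(heap)
              f2, _, d2 = heapq.heappop(heap)
              merged = {s: l + 1 for s, l in {**d1, **d2}.items()}
              heapq.heappush(heap, (f1 + f2, tie, merged))
              tie += 1
          return heap[0][2]

      text = "the quick brown fox jumps over the lazy dog " * 50
      freqs = Counter(text)
      lengths = huffman_lengths(freqs)
      avg = sum(freqs[s] * lengths[s] for s in freqs) / len(text)
      print(f"average bits per character: {avg:.2f} (vs. 8 uncoded)")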

  18. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  19. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  20. Updating the Read Codes

    PubMed Central

    Robinson, David; Comp, Dip; Schulz, Erich; Brown, Philip; Price, Colin

    1997-01-01

    Abstract The Read Codes are a hierarchically-arranged controlled clinical vocabulary introduced in the early 1980s and now consisting of three maintained versions of differing complexity. The code sets are dynamic, and are updated quarterly in response to requests from users including clinicians in both primary and secondary care, software suppliers, and advice from a network of specialist healthcare professionals. The codes' continual evolution of content, both across and within versions, highlights tensions between different users and uses of coded clinical data. Internal processes, external interactions and new structural features implemented by the NHS Centre for Coding and Classification (NHSCCC) for user interactive maintenance of the Read Codes are described, and over 2,000 user feedback episodes received over a 15-month period are analysed. PMID:9391934

  1. Overview of Code Verification

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The verified code for the SIFT Executive is not the code that executes on the SIFT system as delivered. The running versions of the SIFT Executive contain optimizations and special code relating to the messy interface to the hardware broadcast interface and to packing of data to conserve space in the store of the BDX930 processors. The running code was in fact developed prior to and without consideration of any mechanical verification. This was regarded as necessary experimentation with the SIFT hardware and special purpose Pascal compiler. The Pascal code sections cover: the selection of a schedule from the global executive broadcast, scheduling, dispatching, three way voting, and error reporting actions of the SIFT Executive. Not included in these sections of Pascal code are: the global executive, five way voting, clock synchronization, interactive consistency, low level broadcasting, and program loading, initialization, and schedule construction.

  2. Industrial Code Development

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1991-01-01

    The industrial codes will consist of modules of 2-D and simplified 2-D or 1-D codes, intended for expeditious parametric studies, analysis, and design of a wide variety of seals. Integration into a unified system is accomplished by the industrial Knowledge Based System (KBS), which will also provide user friendly interaction, contact sensitive and hypertext help, design guidance, and an expandable database. The types of analysis to be included with the industrial codes are interfacial performance (leakage, load, stiffness, friction losses, etc.), thermoelastic distortions, and dynamic response to rotor excursions. The first three codes to be completed and which are presently being incorporated into the KBS are the incompressible cylindrical code, ICYL, and the compressible cylindrical code, GCYL.

  3. Industrial Computer Codes

    NASA Technical Reports Server (NTRS)

    Shapiro, Wilbur

    1996-01-01

    This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife to knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. KTK labyrinth seal code handles straight or stepped seals. And DYSEAL provides dynamics for the seal geometry.

  4. Experimental evaluation of photoacoustic coded excitation using unipolar golay codes.

    PubMed

    Mienkina, Martin P; Friedrich, Claus-Stefan; Gerhardt, Nils C; Wilkening, Wilko G; Hofmann, Martin R; Schmitz, Georg

    2010-07-01

    Q-switched Nd:YAG lasers are commonly used as light sources for photoacoustic imaging. However, laser diodes are attractive as an alternative to Nd:YAG lasers because they are less expensive and more compact. Although laser diodes deliver about three orders of magnitude less light pulse energy than Nd:YAG lasers (tens of microjoules compared with tens of millijoules), their pulse repetition frequency (PRF) is four to five orders of magnitude higher (up to 1 MHz compared with tens of hertz); this enables the use of averaging to improve SNR without compromising the image acquisition rate. In photoacoustic imaging, the PRF is limited by the maximum acoustic time-of-flight. This limit can be overcome by using coded excitation schemes in which the coding eliminates ambiguities between echoes induced by subsequent pulses. To evaluate the benefits of photoacoustic coded excitation (PACE), the performance of unipolar Golay codes is investigated analytically and validated experimentally. PACE imaging of a copper slab using laser diodes at a PRF of 1 MHz and a modified clinical ultrasound scanner is successfully demonstrated. Considering laser safety regulations and taking into account a comparison between a laser diode system and Nd:YAG systems with respect to SNR, we conclude that PACE is feasible for small animal imaging. PMID:20639152
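    The key property photoacoustic coded excitation exploits is that the autocorrelations of a Golay complementary pair sum to an ideal delta, so range sidelobes from one transmission cancel those of its partner. A sketch using the standard recursive construction; the unipolar splitting needed for laser diodes (which cannot emit negative amplitudes) is shown at the end, while the details of the paper's transmission scheme are omitted.

      # Golay complementary pair: the sum of the two autocorrelations is an ideal
      # delta, which is what lets coded excitation suppress range sidelobes.
      import numpy as np

      def golay_pair(order):
          a, b = np.array([1.0]), np.array([1.0])
          for _ in range(order):                     # recursive doubling construction
              a, b = np.concatenate([a, b]), np.concatenate([a, -b])
          return a, b

      a, b = golay_pair(4)                           # length-16 complementary pair
      acorr = np.correlate(a, a, "full") + np.correlate(b, b, "full")
      print(acorr)                                   # zero everywhere except 2N at lag 0

      # Unipolar decomposition for sources that cannot go negative:
      # a = a_pos - a_neg with a_pos, a_neg in {0, 1}
      a_pos, a_neg = (a + 1) / 2, (1 - a) / 2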

  5. GALPROP: New Developments in CR Propagation Code

    NASA Technical Reports Server (NTRS)

    Moskalenko, I. V.; Jones, F. C.; Mashnik, S. G.; Strong, A. W.; Ptuskin, V. S.

    2003-01-01

    The numerical Galactic CR propagation code GALPROP has been shown to reproduce simultaneously observational data of many kinds related to CR origin and propagation. It has been validated on direct measurements of nuclei, antiprotons, electrons, positrons as well as on astronomical measurements of gamma rays and synchrotron radiation. Such data provide many independent constraints on model parameters while revealing some contradictions in the conventional view of Galactic CR propagation. Using a new version of GALPROP we study new effects such as processes of wave-particle interactions in the interstellar medium. We also report about other developments in the CR propagation code.

  6. Topological subsystem codes

    SciTech Connect

    Bombin, H.

    2010-03-15

    We introduce a family of two-dimensional (2D) topological subsystem quantum error-correcting codes. The gauge group is generated by two-local Pauli operators, so that two-local measurements are enough to recover the error syndrome. We study the computational power of code deformation in these codes and show that boundaries cannot be introduced in the usual way. In addition, we give a general mapping connecting suitable classical statistical mechanical models to optimal error correction in subsystem stabilizer codes that suffer from depolarizing noise.

  7. Expander chunked codes

    NASA Astrophysics Data System (ADS)

    Tang, Bin; Yang, Shenghao; Ye, Baoliu; Yin, Yitong; Lu, Sanglu

    2015-12-01

    Chunked codes are efficient random linear network coding (RLNC) schemes with low computational cost, where the input packets are encoded into small chunks (i.e., subsets of the coded packets). During the network transmission, RLNC is performed within each chunk. In this paper, we first introduce a simple transfer matrix model to characterize the transmission of chunks and derive some basic properties of the model to facilitate the performance analysis. We then focus on the design of overlapped chunked codes, a class of chunked codes whose chunks are non-disjoint subsets of input packets, which are of special interest since they can be encoded with negligible computational cost and in a causal fashion. We propose expander chunked (EC) codes, the first class of overlapped chunked codes that have an analyzable performance, where the construction of the chunks makes use of regular graphs. Numerical and simulation results show that in some practical settings, EC codes can achieve rates within 91 to 97 % of the optimum and outperform the state-of-the-art overlapped chunked codes significantly.
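    A toy illustration of RLNC within one chunk, over GF(2) for readability: coded packets are random XOR combinations of the chunk's input packets, and a receiver decodes once it has collected a full-rank set. Practical codecs typically work over GF(2^8), and EC codes additionally build overlapping chunks from regular expander graphs; neither refinement is shown here.

      # Random linear network coding within a single chunk over GF(2) (toy sketch).
      import numpy as np
      rng = np.random.default_rng(1)

      K = 4                                          # packets per chunk
      packets = rng.integers(0, 2, size=(K, 8))      # K packets of 8 bits each

      def encode():
          coeffs = rng.integers(0, 2, size=K)
          return coeffs, coeffs @ packets % 2        # random XOR combination

      received = [encode() for _ in range(10)]       # collect extra coded packets
      A = np.array([c for c, _ in received])
      Y = np.array([y for _, y in received])

      # Decode by Gaussian elimination over GF(2).
      M = np.concatenate([A, Y], axis=1)
      row = 0
      for col in range(K):
          piv = next((r for r in range(row, len(M)) if M[r, col]), None)
          if piv is None:
              continue
          M[[row, piv]] = M[[piv, row]]
          for r in range(len(M)):
              if r != row and M[r, col]:
                  M[r] ^= M[row]
          row += 1
      # True once the coefficient matrix has rank K (virtually certain here)
      print(np.array_equal(M[:K, K:], packets))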

  8. Transonic airfoil codes

    NASA Technical Reports Server (NTRS)

    Garabedian, P. R.

    1979-01-01

    Computer codes for the design and analysis of transonic airfoils are considered. The design code relies on the method of complex characteristics in the hodograph plane to construct shockless airfoil. The analysis code uses artificial viscosity to calculate flows with weak shock waves at off-design conditions. Comparisons with experiments show that an excellent simulation of two dimensional wind tunnel tests is obtained. The codes have been widely adopted by the aircraft industry as a tool for the development of supercritical wing technology.

  9. FAA Smoke Transport Code

    SciTech Connect

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  10. Bar Code Labels

    NASA Technical Reports Server (NTRS)

    1988-01-01

    American Bar Codes, Inc. developed special bar code labels for inventory control of space shuttle parts and other space system components. ABC labels are made in a company-developed aluminum anodizing process and consecutively marked with bar code symbology and human readable numbers. They offer extreme abrasion resistance and indefinite resistance to ultraviolet radiation, capable of withstanding 700 degree temperatures without deterioration and up to 1400 degrees with special designs. They offer high resistance to salt spray, cleaning fluids and mild acids. ABC is now producing these bar code labels commercially for industrial customers who also need labels to resist harsh environments.

  11. P-code enhanced method for processing encrypted GPS signals without knowledge of the encryption code

    NASA Technical Reports Server (NTRS)

    Meehan, Thomas K. (Inventor); Thomas, Jr., Jess Brooks (Inventor); Young, Lawrence E. (Inventor)

    2000-01-01

    In the preferred embodiment, an encrypted GPS signal is down-converted from RF to baseband to generate two quadrature components for each RF signal (L1 and L2). Separately and independently for each RF signal and each quadrature component, the four down-converted signals are counter-rotated with a respective model phase, correlated with a respective model P code, and then successively summed and dumped over presum intervals substantially coincident with chips of the respective encryption code. Without knowledge of the encryption-code signs, the effect of encryption-code sign flips is then substantially reduced by selected combinations of the resulting presums between associated quadrature components for each RF signal, separately and independently for the L1 and L2 signals. The resulting combined presums are then summed and dumped over longer intervals and further processed to extract amplitude, phase and delay for each RF signal. Precision of the resulting phase and delay values is approximately four times better than that obtained from straight cross-correlation of L1 and L2. This improved method provides the following options: separate and independent tracking of the L1-Y and L2-Y channels; separate and independent measurement of amplitude, phase and delay for the L1-Y channel; and removal of the half-cycle ambiguity in L1-Y and L2-Y carrier phase.
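    A toy numerical illustration of the presum-and-combine idea, with an invented baseband signal model (this is not the patented receiver chain): integrating over intervals aligned with the unknown encryption chips and then multiplying the two channels' presums cancels the unknown signs, since w*w = 1 chip by chip.

      # Sum-and-dump over encryption chips, then combine presums so the unknown
      # sign sequence cancels. Signal model, noise level, and phases are invented.
      import numpy as np
      rng = np.random.default_rng(2)

      n_chips, spc = 400, 20                       # chips and samples per chip
      w = rng.choice([-1.0, 1.0], n_chips)         # unknown encryption signs
      phi1, phi2 = 0.7, -1.1                       # carrier phases on L1 and L2

      def chip_presums(phi):
          s = np.repeat(w, spc) * np.exp(1j * phi)
          s = s + (rng.normal(size=s.size) + 1j * rng.normal(size=s.size))
          return s.reshape(n_chips, spc).sum(axis=1)   # sum-and-dump per chip

      p1, p2 = chip_presums(phi1), chip_presums(phi2)
      combined = np.sum(p1 * np.conj(p2))          # w cancels chip by chip
      print(np.angle(combined), phi1 - phi2)       # recovered phase difference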

  12. Azerbaijani-Russian Code-Switching and Code-Mixing: Form, Function, and Identity

    ERIC Educational Resources Information Center

    Zuercher, Kenneth

    2009-01-01

    From incorporation into the Russian Empire in 1828, through the collapse of the U.S.S.R. in 1991 governmental language policies and other socio/political forces influenced the Turkic population of the Republic of Azerbaijan to speak Russian. Even with changes since independence Russian use--including various kinds of code-switching and…

  13. Utilizing GPUs to Accelerate Turbomachinery CFD Codes

    NASA Technical Reports Server (NTRS)

    MacCalla, Weylin; Kulkarni, Sameer

    2016-01-01

    GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card, or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.

  14. Fighting for independence.

    PubMed

    Saxon, Emma

    2016-01-01

    Male crickets (Gryllus bimaculatus) establish dominance hierarchies within a population by fighting with one another. Larger males win fights more frequently than their smaller counterparts, and a previous study found that males recognise one another primarily through sensory input from the antennae. This study therefore investigated whether the success of larger crickets is influenced by sensory input from the antennae, in part by assessing the number of fights that large 'antennectomized' crickets won against small crickets, compared with the number that large, intact crickets won. The success rate was significantly lower in antennectomized males, though they still won the majority of fights (73/100 versus 58/100, Fisher's exact test P?

  15. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    PubMed Central

    Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance. However, the coding tree also brings extremely high computational complexity. Innovative work on improving the coding tree to further reduce encoding time is presented in this paper. A novel low complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. Firstly, this paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP) and content change (CC). Secondly, a CU coding tree probability model is proposed for modeling and predicting CU distribution. Eventually, a CU coding tree probability update is proposed, aiming to address probabilistic model distortion problems caused by CC. Experimental results show that the proposed low complexity CU coding tree mechanism significantly reduces encoding time by 27% for lossy coding and 42% for visually lossless coding and lossless coding. The proposed low complexity CU coding tree mechanism is devoted to improving coding performance under various application conditions. PMID:26999741
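    The general mechanism is easy to sketch: before recursing into a CU split, consult a probability of "split" for the current depth (in the paper, conditioned on QP and updated as content changes), and prune the recursion when a split is unlikely. The table and threshold below are invented placeholders, not the paper's trained model.

      # Probability-driven early termination in a CU quadtree (illustrative only).
      split_prob = {0: 0.9, 1: 0.6, 2: 0.25, 3: 0.0}   # assumed P(split) per depth
      THRESHOLD = 0.3

      def encode_cu(x, y, size, depth):
          blocks = [(x, y, size, depth)]           # pretend to encode this CU
          if split_prob[depth] < THRESHOLD:        # early termination: skip split
              return blocks
          half = size // 2
          for dx in (0, half):
              for dy in (0, half):
                  blocks += encode_cu(x + dx, y + dy, half, depth + 1)
          return blocks

      print(len(encode_cu(0, 0, 64, 0)))           # CUs visited with pruning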

  16. Prostate Surgery Codes

    Cancer.gov

    Prostate C619 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Do not code an orchiectomy in this field. For prostate primaries, orchiectomies are coded in the data item “Hematologic Transplant and

  17. Numerical Transport Codes

    SciTech Connect

    Ongena, J.P.H.E.; Evrard, M.; McCune, D

    2004-03-15

    This paper gives a brief introduction on numerical transport codes. The relevant equations which are used in these codes are established, and on the basis of these equations, the necessary calculations needed to resolve them are pointed out. Finally, some examples are given, illustrating their application.

  18. Insurance billing and coding.

    PubMed

    Napier, Rebecca H; Bruelheide, Lori S; Demann, Eric T K; Haug, Richard H

    2008-07-01

    The purpose of this article is to highlight the importance of understanding various numeric and alpha-numeric codes for accurately billing dental and medically related services to private pay or third-party insurance carriers. In the United States, common dental terminology (CDT) codes are most commonly used by dentists to submit claims, whereas current procedural terminology (CPT) and International Classification of Diseases, Ninth Revision, Clinical Modification (ICD.9.CM) codes are more commonly used by physicians to bill for their services. The CPT and ICD.9.CM coding systems complement each other in that CPT codes provide the procedure and service information and ICD.9.CM codes provide the reason or rationale for a particular procedure or service. These codes are more commonly used for "medical necessity" determinations, and general dentists and specialists who routinely perform care, including trauma-related care, biopsies, and dental treatment as a result of or in anticipation of a cancer-related treatment, are likely to use these codes. Claim submissions for care provided can be completed electronically or by means of paper forms. PMID:18501731

  19. Lichenase and coding sequences

    DOEpatents

    Li, Xin-Liang (Athens, GA); Ljungdahl, Lars G. (Athens, GA); Chen, Huizhong (Lawrenceville, GA)

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-β-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  20. Code of Ethics

    ERIC Educational Resources Information Center

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  1. Dress Codes for Teachers?

    ERIC Educational Resources Information Center

    Million, June

    2004-01-01

    In this article, the author discusses an e-mail survey of principals from across the country regarding whether or not their school had a formal staff dress code. The results indicate that most did not have a formal dress code, but agreed that professional dress for teachers was not only necessary, but showed respect for the school and had a…

  2. Narrative compression coding for a channel with errors

    NASA Astrophysics Data System (ADS)

    Bond, James W.

    1988-12-01

    Data compression codes offer the possibility of improving the throughput of existing communication systems in the near term. This study was undertaken to determine if data compression codes could be utilized to provide message compression in a channel with up to a 0.10 bit error rate. The data compression capabilities of codes were investigated by estimating the average number of bits-per-character required to transmit narrative files. The performance of the codes in a channel with errors (a noisy channel) was investigated in terms of the average numbers of characters decoded in error per bit error and of characters printed in error per bit error. Results were obtained by encoding four narrative files, which were resident on an IBM PC and use a 58 character set. The study focused on Huffman codes and suffix/prefix comma-free codes. Other data compression codes, in particular, block codes and some simple variants of block codes, are briefly discussed to place the study results in context. Comma-free codes were found to have the most promising data compression because error propagation due to bit errors is limited to a few characters for these codes. A technique was found to identify a suffix/prefix comma-free code giving nearly the same data compression as a Huffman code with much less error propagation than the Huffman codes.

  3. Performance of concatenated Reed-Solomon trellis-coded modulation over Rician fading channels

    NASA Technical Reports Server (NTRS)

    Moher, Michael L.; Lodge, John H.

    1990-01-01

    A concatenated coding scheme for providing very reliable data over mobile-satellite channels at power levels similar to those used for vocoded speech is described. The outer code is a shortened Reed-Solomon code which provides error detection as well as error correction capabilities. The inner code is a 1-D 8-state trellis code applied independently to both the inphase and quadrature channels. To achieve the full error correction potential of this inner code, the code symbols are multiplexed with a pilot sequence which is used to provide dynamic channel estimation and coherent detection. The implementation structure of this scheme is discussed and its performance is estimated.

  4. Energy Conservation Code Decoded

    SciTech Connect

    Cole, Pam C.; Taylor, Zachary T.

    2006-09-01

    Designing an energy-efficient, affordable, and comfortable home is a lot easier thanks to a slim, easier-to-read booklet, the 2006 International Energy Conservation Code (IECC), published in March 2006. States, counties, and cities have begun reviewing the new code as a potential upgrade to their existing codes. Maintained under the public consensus process of the International Code Council, the IECC is designed to do just what its title says: promote the design and construction of energy-efficient homes and commercial buildings. Homes in this case means traditional single-family homes, duplexes, condominiums, and apartment buildings having three or fewer stories. The U.S. Department of Energy, which played a key role in proposing the changes that resulted in the new code, is offering a free training course that covers the residential provisions of the 2006 IECC.

  5. Combustion chamber analysis code

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-01-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  6. Distributed transform coding via source-splitting

    NASA Astrophysics Data System (ADS)

    Yahampath, Pradeepa

    2012-12-01

    Transform coding (TC) is one of the best known practical methods for quantizing high-dimensional vectors. In this article, a practical approach to distributed TC of jointly Gaussian vectors is presented. This approach, referred to as source-split distributed transform coding (SP-DTC), can be used to easily implement two terminal transform codes for any given rate-pair. The main idea is to apply source-splitting using orthogonal-transforms, so that only Wyner-Ziv (WZ) quantizers are required for compression of transform coefficients. This approach however requires optimizing the bit allocation among dependent sets of WZ quantizers. In order to solve this problem, a low-complexity tree-search algorithm based on analytical models for transform coefficient quantization is developed. A rate-distortion (RD) analysis of SP-DTCs for jointly Gaussian sources is presented, which indicates that these codes can significantly outperform the practical alternative of independent TC of each source, whenever there is a strong correlation between the sources. For practical implementation of SP-DTCs, the idea of using conditional entropy constrained (CEC) quantizers followed by Slepian-Wolf coding is explored. Experimental results obtained with SP-DTC designs based on both CEC scalar quantizers and CEC trellis-coded quantizers demonstrate that actual implementations of SP-DTCs can achieve RD performance close to the analytically predicted limits.
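    The gain from exploiting inter-source correlation, which drives SP-DTC's advantage over independent coding, can be seen in the classic high-rate coding-gain result: after a decorrelating orthogonal transform, the gain over coding the raw components independently is the ratio of the arithmetic to the geometric mean of the coefficient variances. A small sketch of that background calculation (illustrative only, not the SP-DTC algorithm):

      # Transform coding gain for a correlated jointly Gaussian pair: decorrelate
      # with the KLT, then compare arithmetic and geometric means of the
      # coefficient variances (classic high-rate result).
      import numpy as np
      rng = np.random.default_rng(3)

      rho = 0.9                                     # inter-source correlation
      cov = np.array([[1.0, rho], [rho, 1.0]])
      x = rng.multivariate_normal([0, 0], cov, size=100_000)

      evals, evecs = np.linalg.eigh(np.cov(x.T))    # KLT from sample covariance
      y = x @ evecs                                 # decorrelated coefficients

      var = y.var(axis=0)
      gain = var.mean() / var.prod() ** (1 / 2)     # arithmetic / geometric mean
      print(f"transform coding gain: {10*np.log10(gain):.2f} dB")  # ~3.6 dB at rho=0.9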

  7. Value of Laboratory Experiments for Code Validations

    SciTech Connect

    Wawersik, W.R.

    1998-12-14

    Numerical codes have become indispensable for designing underground structures and interpreting the behavior of geologic systems. Because of the complexities of geologic systems, however, code calculations often are associated with large quantitative uncertainties. This paper presents three examples to demonstrate the value of laboratory (or bench-scale) experiments for evaluating the predictive capabilities of such codes, with five major conclusions. Laboratory or bench-scale experiments are a very cost-effective, controlled means of evaluating and validating numerical codes, not instead of but before, or at least concurrent with, the implementation of in situ studies. The design of good laboratory validation tests must identify what aspects of a code are to be scrutinized in order to optimize the size, geometry, boundary conditions, and duration of the experiments. The design of good validation tests must also involve good, and sometimes difficult, numerical analyses and sensitivity studies. Good validation experiments will generate independent data sets to identify the combined effect of constitutive models, model generalizations, material parameters, and numerical algorithms. Successful validations of numerical codes mandate a close collaboration between experimentalists and analysts drawing from the full gamut of observations, measurements, and mathematical results.

  8. A robust low-rate coding scheme for packet video

    NASA Technical Reports Server (NTRS)

    Chen, Y. C.; Sayood, Khalid; Nelson, D. J.; Arikan, E. (editor)

    1991-01-01

    Due to the rapidly evolving fields of image processing and networking, video information promises to be an important part of telecommunication systems. Although up to now video transmission has been transported mainly over circuit-switched networks, it is likely that packet-switched networks will dominate the communication world in the near future. Asynchronous transfer mode (ATM) techniques in broadband-ISDN can provide a flexible, independent and high performance environment for video communication. In this study, the network simulator was used only as a channel. Mixture block coding with progressive transmission (MBCPT) has been investigated for use over packet networks and has been found to provide high compression rate with good visual performance, robustness to packet loss, tractable integration with network mechanics and simplicity in parallel implementation.

  9. Independent Study, an Annotated Bibliography.

    ERIC Educational Resources Information Center

    Davis, Harold S.

    This annotated bibliography on independent study lists 150 books, pamphlets, and articles published between 1929 and 1966, with most of the entries dated after 1960. Entries also cover independent study in relation to team teaching, nongraded schools, instructional materials centers, individualized instruction, flexible scheduling, curriculum…

  10. Independent Learning Models: A Comparison.

    ERIC Educational Resources Information Center

    Wickett, R. E. Y.

    Five models of independent learning are suitable for use in adult education programs. The common factor is a facilitator who works in some way with the student in the learning process. They display different characteristics, including the extent of independence in relation to content and/or process. Nondirective tutorial instruction and learning…

  11. Region-based fractal video coding

    NASA Astrophysics Data System (ADS)

    Zhu, Shiping; Belloulata, Kamel

    2008-10-01

    A novel video sequence compression scheme is proposed in order to realize the efficient and economical transmission of video sequences, and also the region-based functionality of MPEG-4. The CPM and NCIM fractal coding scheme is applied on each region independently, using a prior image segmentation map (alpha plane) which is exactly the same as defined in MPEG-4. The first n frames of the video sequence are encoded as a "set" using the Circular Prediction Mapping (CPM), and the remaining frames are encoded using the Non Contractive Interframe Mapping (NCIM). The CPM and NCIM accomplish the motion estimation and compensation, which can exploit the high temporal correlations between the adjacent frames of a video sequence. The experimental results with the monocular video sequences provide promising performance at low bit rate coding, such as the application in video conferencing. We believe the proposed fractal video codec will be a powerful and efficient technique for region-based video sequence coding.

  12. Ideal Binocular Disparity Detectors Learned Using Independent Subspace Analysis on Binocular Natural Image Pairs

    PubMed Central

    Hunter, David W.; Hibbard, Paul B.

    2016-01-01

    An influential theory of mammalian vision, known as the efficient coding hypothesis, holds that early stages in the visual cortex attempt to form an efficient coding of ecologically valid stimuli. Although numerous authors have successfully modelled some aspects of early vision mathematically, closer inspection has found substantial discrepancies between the predictions of some of these models and observations of neurons in the visual cortex. In particular, analysis of linear-non-linear models of simple cells using Independent Component Analysis has found a strong bias towards features on the horopter. In order to investigate the link between the information content of binocular images, mathematical models of complex cells and physiological recordings, we applied Independent Subspace Analysis to binocular image patches in order to learn a set of complex-cell-like models. We found that these complex-cell-like models exhibited a wide range of binocular disparity-discriminability, although only a minority exhibited high binocular discrimination scores. However, in common with the linear-non-linear model case we found that feature detection was limited to the horopter, suggesting that current mathematical models are limited in their ability to explain the functionality of the visual cortex. PMID:26982184
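    The analysis pipeline can be miniaturized as follows, as a sketch only: plain FastICA from scikit-learn applied to synthetic "binocular" vectors built by concatenating left and right patches. The real study used natural stereo image patches and Independent Subspace Analysis, which additionally pools squared filter outputs into subspaces; that pooling step is not shown, and the data below are invented.

      # ICA on synthetic binocular patch pairs (left and right concatenated per
      # sample); each learned component is a binocular filter whose two halves
      # can be compared for position/phase offsets (disparity tuning).
      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(4)
      n, patch = 5000, 8                             # samples and patch width
      base = rng.laplace(size=(n, patch * patch))    # sparse "scene" sources
      left = base
      right = np.roll(base, 1, axis=1) + 0.1 * rng.normal(size=base.shape)
      X = np.hstack([left, right])                   # binocular input vectors

      ica = FastICA(n_components=20, random_state=0, max_iter=500)
      ica.fit(X)
      filters = ica.components_.reshape(20, 2, patch * patch)
      print(filters.shape)                           # 20 binocular filters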

  13. Report number codes

    SciTech Connect

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  14. Pyramid image codes

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.

    1990-01-01

    All vision systems, both human and machine, transform the spatial image into a coded representation. Particular codes may be optimized for efficiency or to extract useful image features. Researchers explored image codes based on primary visual cortex in man and other primates. Understanding these codes will advance the art in image coding, autonomous vision, and computational human factors. In cortex, imagery is coded by features that vary in size, orientation, and position. Researchers have devised a mathematical model of this transformation, called the Hexagonal oriented Orthogonal quadrature Pyramid (HOP). In a pyramid code, features are segregated by size into layers, with fewer features in the layers devoted to large features. Pyramid schemes provide scale invariance, and are useful for coarse-to-fine searching and for progressive transmission of images. The HOP Pyramid is novel in three respects: (1) it uses a hexagonal pixel lattice, (2) it uses oriented features, and (3) it accurately models most of the prominent aspects of primary visual cortex. The transform uses seven basic features (kernels), which may be regarded as three oriented edges, three oriented bars, and one non-oriented blob. Application of these kernels to non-overlapping seven-pixel neighborhoods yields six oriented, high-pass pyramid layers, and one low-pass (blob) layer.
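    For contrast with the HOP transform, here is a generic pyramid image code on a square lattice, the Laplacian pyramid. The hexagonal lattice and seven oriented kernels of HOP are not reproduced, but the size-segregated layering, coarse-to-fine structure, and low-pass residual are the same idea.

      # Generic Laplacian pyramid: band-pass detail layers plus a low-pass residual.
      import numpy as np

      def blur_downsample(img):
          k = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0
          for axis in (0, 1):                        # separable binomial blur
              img = np.apply_along_axis(lambda r: np.convolve(r, k, "same"), axis, img)
          return img[::2, ::2]

      def laplacian_pyramid(img, levels):
          layers = []
          for _ in range(levels):
              small = blur_downsample(img)
              # nearest-neighbour upsampling back to the current size
              up = np.kron(small, np.ones((2, 2)))[: img.shape[0], : img.shape[1]]
              layers.append(img - up)                # band-pass detail layer
              img = small
          return layers + [img]                      # plus the low-pass residual

      img = np.random.default_rng(5).random((64, 64))
      print([l.shape for l in laplacian_pyramid(img, 3)])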

  15. Applications of numerical codes to space plasma problems

    NASA Technical Reports Server (NTRS)

    Northrop, T. G.; Birmingham, T. J.; Jones, F. C.; Wu, C. S.

    1975-01-01

    Solar wind, earth's bowshock, and magnetospheric convection and substorms were investigated. Topics discussed include computational physics, multifluid codes, ionospheric irregularities, and modeling laser plasmas.

  16. Local intensity adaptive image coding

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.

    1989-01-01

    The objective of preprocessing for machine vision is to extract intrinsic target properties. The most important properties ordinarily are structure and reflectance. Illumination in space, however, is a significant problem as the extreme range of light intensity, stretching from deep shadow to highly reflective surfaces in direct sunlight, impairs the effectiveness of standard approaches to machine vision. To overcome this critical constraint, an image coding scheme is being investigated which combines local intensity adaptivity, image enhancement, and data compression. It is very effective under the highly variant illumination that can exist within a single frame or field of view, and it is very robust to noise at low illuminations. Some of the theory and salient features of the coding scheme are reviewed, its performance is characterized in a simulated space application, and the research and development activities are described.

  17. Axisymmetric generalized harmonic evolution code

    NASA Astrophysics Data System (ADS)

    Sorkin, Evgeny

    2010-04-01

    We describe the first axisymmetric numerical code based on the generalized harmonic formulation of the Einstein equations, which is regular at the axis. We test the code by investigating gravitational collapse of distributions of complex scalar field in a Kaluza-Klein spacetime. One of the key issues of the harmonic formulation is the choice of the gauge source functions, and we conclude that a damped-wave gauge is remarkably robust in this case. Our preliminary study indicates that evolution of regular initial data leads to formation both of black holes with spherical and cylindrical horizon topologies. Intriguingly, we find evidence that near threshold for black hole formation the number of outcomes proliferates. Specifically, the collapsing matter splits into individual pulses, two of which travel in the opposite directions along the compact dimension and one which is ejected radially from the axis. Depending on the initial conditions, a curvature singularity develops inside the pulses.

  18. Axisymmetric generalized harmonic evolution code

    SciTech Connect

    Sorkin, Evgeny

    2010-04-15

    We describe the first axisymmetric numerical code based on the generalized harmonic formulation of the Einstein equations, which is regular at the axis. We test the code by investigating gravitational collapse of distributions of complex scalar field in a Kaluza-Klein spacetime. One of the key issues of the harmonic formulation is the choice of the gauge source functions, and we conclude that a damped-wave gauge is remarkably robust in this case. Our preliminary study indicates that evolution of regular initial data leads to formation both of black holes with spherical and cylindrical horizon topologies. Intriguingly, we find evidence that near threshold for black hole formation the number of outcomes proliferates. Specifically, the collapsing matter splits into individual pulses, two of which travel in the opposite directions along the compact dimension and one which is ejected radially from the axis. Depending on the initial conditions, a curvature singularity develops inside the pulses.

  19. Optimal source codes for geometrically distributed integer alphabets

    NASA Technical Reports Server (NTRS)

    Gallager, R. G.; Van Voorhis, D. C.

    1975-01-01

    An approach is shown for using the Huffman algorithm indirectly to prove the optimality of a code for an infinite alphabet if an estimate concerning the nature of the code can be made. Attention is given to nonnegative integers with a geometric probability assignment. The particular distribution considered arises in run-length coding and in encoding protocol information in data networks. Questions of redundancy of the optimal code are also investigated.
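    The optimal codes in question are the Golomb codes, with the code parameter m chosen from the geometric ratio t as the smallest integer satisfying t^m + t^(m+1) <= 1. A sketch of that rule and of the encoder (unary quotient plus truncated-binary remainder):

      # Golomb coding of nonnegative integers with P(n) = (1-t) * t**n.
      import math

      def optimal_m(t):
          # smallest m with t**m + t**(m+1) <= 1 (Gallager & Van Voorhis rule)
          m = 1
          while t ** m + t ** (m + 1) > 1:
              m += 1
          return m

      def golomb_encode(n, m):
          q, r = divmod(n, m)
          out = "1" * q + "0"                        # unary quotient
          b = math.ceil(math.log2(m)) if m > 1 else 0
          if b == 0:
              return out                             # m = 1: pure unary code
          cutoff = (1 << b) - m                      # short (b-1)-bit remainders
          if r < cutoff:                             # only possible when b > 1
              return out + format(r, f"0{b-1}b")
          return out + format(r + cutoff, f"0{b}b")

      t = 0.8
      m = optimal_m(t)                               # m = 3 for t = 0.8
      print(m, [golomb_encode(n, m) for n in range(6)])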

  20. INVESTIGATION OF FISCALLY INDEPENDENT AND DEPENDENT CITY SCHOOL DISTRICTS.

    ERIC Educational Resources Information Center

    GITTELL, MARILYN; AND OTHERS

    A TWO-PART COMPARATIVE ANALYSIS IS MADE OF LARGE AND SMALL CITY SCHOOL SYSTEMS. PART I ANALYZES A WIDE RANGE OF FISCAL AND NON-FISCAL VARIABLES ASSOCIATED WITH FISCAL STATUS OF CITY SCHOOL SYSTEMS. IT COVERS THE 2,788 CITY SCHOOL DISTRICTS IN THE UNITED STATES WITH ENROLLMENTS OVER 3,000. COMPLEX INTERRELATIONSHIPS SURROUNDING FISCAL STATUS IN…

  1. Multi-level bandwidth efficient block modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1989-01-01

    The multilevel technique is investigated for combining block coding and modulation. There are four parts. In the first part, a formulation is presented for signal sets on which modulation codes are to be constructed. Distance measures on a signal set are defined and their properties are developed. In the second part, a general formulation is presented for multilevel modulation codes in terms of component codes with appropriate Euclidean distances. The distance properties, Euclidean weight distribution and linear structure of multilevel modulation codes are investigated. In the third part, several specific methods for constructing multilevel block modulation codes with interdependency among component codes are proposed. Given a multilevel block modulation code C with no interdependency among the binary component codes, the proposed methods give a multilevel block modulation code C' which has the same rate as C, a minimum squared Euclidean distance not less than that of C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest neighbor codewords than that of C. In the last part, error performance of block modulation codes is analyzed for an AWGN channel based on soft-decision maximum likelihood decoding. Error probabilities of some specific codes are evaluated based on their Euclidean weight distributions and simulation results.
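    The Euclidean distance measures that the component codes are matched to come from set partitioning of the signal set. A standard illustration with unit-energy 8-PSK (a generic example, not a construction from the report): each partition level increases the intra-subset minimum squared Euclidean distance, so stronger component codes can be assigned to the weaker levels.

      # Intra-subset minimum squared Euclidean distances under 8-PSK set partitioning.
      import numpy as np

      pts = np.exp(2j * np.pi * np.arange(8) / 8)    # unit-energy 8-PSK points

      def min_sq_dist(idx):
          return min(abs(pts[i] - pts[j]) ** 2 for i in idx for j in idx if i < j)

      print(min_sq_dist(range(8)))        # level 0 (full set): 0.586
      print(min_sq_dist(range(0, 8, 2)))  # level 1 subsets (QPSK): 2.0
      print(min_sq_dist(range(0, 8, 4)))  # level 2 subsets (BPSK): 4.0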

  2. Population coding of affect across stimuli, modalities and individuals.

    PubMed

    Chikazoe, Junichi; Lee, Daniel H; Kriegeskorte, Nikolaus; Anderson, Adam K

    2014-08-01

    It remains unclear how the brain represents external objective sensory events alongside our internal subjective impressions of them--affect. Representational mapping of population activity evoked by complex scenes and basic tastes in humans revealed a neural code supporting a continuous axis of pleasant-to-unpleasant valence. This valence code was distinct from low-level physical and high-level object properties. Although ventral temporal and anterior insular cortices supported valence codes specific to vision and taste, both the medial and lateral orbitofrontal cortices (OFC) maintained a valence code independent of sensory origin. Furthermore, only the OFC code could classify experienced affect across participants. The entire valence spectrum was represented as a collective pattern in regional neural activity as sensory-specific and abstract codes, whereby the subjective quality of affect can be objectively quantified across stimuli, modalities and people. PMID:24952643

  3. Population coding of affect across stimuli, modalities and individuals

    PubMed Central

    Chikazoe, Junichi; Lee, Daniel H.; Kriegeskorte, Nikolaus; Anderson, Adam K.

    2014-01-01

    It remains unclear how the brain represents external objective sensory events alongside our internal subjective impressions of them—affect. Representational mapping of population level activity evoked by complex scenes and basic tastes uncovered a neural code supporting a continuous axis of pleasant-to-unpleasant valence. This valence code was distinct from low-level physical and high-level object properties. While ventral temporal and anterior insular cortices supported valence codes specific to vision and taste, both the medial and lateral orbitofrontal cortices (OFC) maintained a valence code independent of sensory origin. Further, only the OFC code could classify experienced affect across participants. The entire valence spectrum is represented as a collective pattern in regional neural activity as sensory-specific and abstract codes, whereby the subjective quality of affect can be objectively quantified across stimuli, modalities, and people. PMID:24952643

  4. A cascaded coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Shu, L.; Kasami, T.

    1985-01-01

    A cascade coding scheme for error control is investigated. The scheme employs a combination of hard and soft decisions in decoding. Error performance is analyzed. If the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit-error-rate. Some example schemes are evaluated. They seem to be quite suitable for satellite down-link error control.

  5. A cascaded coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Kasami, T.; Lin, S.

    1985-01-01

    A cascaded coding scheme for error control was investigated. The scheme employs a combination of hard and soft decisions in decoding. Error performance is analyzed. If the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit-error-rate. Some example schemes are studied which seem to be quite suitable for satellite down-link error control.

  6. The STAGS computer code

    NASA Technical Reports Server (NTRS)

    Almroth, B. O.; Brogan, F. A.

    1978-01-01

    Basic information about the computer code STAGS (Structural Analysis of General Shells) is presented to describe to potential users the scope of the code and the solution procedures that are incorporated. Primarily, STAGS is intended for analysis of shell structures, although it has been extended to more complex shell configurations through the inclusion of springs and beam elements. The formulation is based on a variational approach in combination with local two dimensional power series representations of the displacement components. The computer code includes options for analysis of linear or nonlinear static stress, stability, vibrations, and transient response. Material as well as geometric nonlinearities are included. A few examples of applications of the code are presented for further illustration of its scope.

  7. Compressible Astrophysics Simulation Code

    Energy Science and Technology Software Center (ESTSC)

    2007-07-18

    This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.

  8. New quantum MDS codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Wang, Liqi; Zhu, Shixin

    2015-03-01

    Quantum maximum-distance-separable (MDS) codes form an important class of quantum codes. It is very hard to construct quantum MDS codes with relatively large minimum distance. In this paper, based on classical constacyclic codes, we construct two classes of quantum MDS codes, one for the even case and one for the odd case. The quantum MDS codes exhibited here have parameters better than the ones available in the literature.

  9. Cracking the survival code

    PubMed Central

    Füllgrabe, Jens; Heldring, Nina; Hermanson, Ola; Joseph, Bertrand

    2014-01-01

    Modifications of histones, the chief protein components of the chromatin, have emerged as critical regulators of life and death. While the “apoptotic histone code” came to light a few years ago, accumulating evidence indicates that autophagy, a cell survival pathway, is also heavily regulated by histone-modifying proteins. In this review we describe the emerging “autophagic histone code” and the role of histone modifications in the cellular life vs. death decision. PMID:24429873

  10. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  11. Approximately Independent Features of Languages

    NASA Astrophysics Data System (ADS)

    Holman, Eric W.

    To facilitate the testing of models for the evolution of languages, the present paper offers a set of linguistic features that are approximately independent of each other. To find these features, the adjusted Rand index (R′) is used to estimate the degree of pairwise relationship among 130 linguistic features in a large published database. Many of the R′ values prove to be near zero, as predicted for independent features, and a subset of 47 features is found with an average R′ of -0.0001. These 47 features are recommended for use in statistical tests that require independent units of analysis.
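    The pairwise screening step is straightforward to reproduce with scikit-learn's adjusted Rand index, which is near zero for independent categorical features and positive for related ones. The data below are invented toy labels; the real study used 130 features from a published typological database.

      # Adjusted Rand index between categorical features across "languages".
      import numpy as np
      from sklearn.metrics import adjusted_rand_score

      rng = np.random.default_rng(6)
      n_langs = 500
      feature_a = rng.integers(0, 4, n_langs)        # e.g., a four-valued feature
      feature_b = rng.integers(0, 3, n_langs)        # an unrelated feature
      feature_c = feature_a.copy()                   # a feature tied to feature_a
      mask = rng.random(n_langs) < 0.2               # with 20% noise
      feature_c[mask] = rng.integers(0, 4, mask.sum())

      print(adjusted_rand_score(feature_a, feature_b))   # ~0: independent
      print(adjusted_rand_score(feature_a, feature_c))   # well above 0: related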

  12. Sequence independent amplification of DNA

    DOEpatents

    Bohlander, Stefan K.

    1998-01-01

    The present invention is a rapid sequence-independent amplification procedure (SIA). Even minute amounts of DNA from various sources can be amplified independent of any sequence requirements of the DNA or any a priori knowledge of any sequence characteristics of the DNA to be amplified. This method allows, for example, the sequence independent amplification of microdissected chromosomal material and the reliable construction of high quality fluorescent in situ hybridization (FISH) probes from YACs or from other sources. These probes can be used to localize YACs on metaphase chromosomes but also--with high efficiency--in interphase nuclei.

  14. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even when that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.

  15. Scalable motion vector coding

    NASA Astrophysics Data System (ADS)

    Barbarien, Joeri; Munteanu, Adrian; Verdicchio, Fabio; Andreopoulos, Yiannis; Cornelis, Jan P.; Schelkens, Peter

    2004-11-01

    Modern video coding applications require transmission of video data over variable-bandwidth channels to a variety of terminals with different screen resolutions and available computational power. Scalable video coding is needed to optimally support these applications. Recently proposed wavelet-based video codecs employing spatial domain motion compensated temporal filtering (SDMCTF) provide quality, resolution and frame-rate scalability while delivering compression performance comparable to that of the state-of-the-art non-scalable H.264-codec. These codecs require scalable coding of the motion vectors in order to support a large range of bit-rates with optimal compression efficiency. Scalable motion vector coding algorithms based on the integer wavelet transform followed by embedded coding of the wavelet coefficients were recently proposed. In this paper, a new and fundamentally different scalable motion vector codec (MVC) using median-based motion vector prediction is proposed. Extensive experimental results demonstrate that the proposed MVC systematically outperforms the wavelet-based state-of-the-art solutions. To be able to take advantage of the proposed scalable MVC, a rate allocation mechanism capable of optimally dividing the available rate among texture and motion information is required. Two rate allocation strategies are proposed and compared. The proposed MVC and rate allocation schemes are incorporated into an SDMCTF-based video codec and the benefits of scalable motion vector coding are experimentally demonstrated.
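
    The median predictor at the heart of the proposed MVC can be sketched in a few lines of Python (a simplified illustration under our own assumptions about the causal neighbourhood; the actual codec additionally entropy-codes the residuals in a scalable way):

        # A minimal sketch (not the authors' codec): predict each motion vector as the
        # component-wise median of its causal neighbours and return the residuals,
        # which a real codec would then entropy code layer by layer.
        import numpy as np

        def median_predict(mv_field):
            """mv_field: (H, W, 2) integer array of motion vectors -> prediction residuals."""
            H, W, _ = mv_field.shape
            residual = np.zeros_like(mv_field)
            for y in range(H):
                for x in range(W):
                    neighbours = []
                    if x > 0:
                        neighbours.append(mv_field[y, x - 1])      # left
                    if y > 0:
                        neighbours.append(mv_field[y - 1, x])      # above
                    if y > 0 and x < W - 1:
                        neighbours.append(mv_field[y - 1, x + 1])  # above-right
                    pred = (np.round(np.median(neighbours, axis=0)).astype(mv_field.dtype)
                            if neighbours else 0)
                    residual[y, x] = mv_field[y, x] - pred
            return residual

        field = np.zeros((4, 4, 2), dtype=int)       # toy field: constant motion (3, -1)
        field[..., 0], field[..., 1] = 3, -1
        print(np.abs(median_predict(field)).sum())   # residual is nonzero only at (0, 0)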

  16. Scale independence of décollement thrusting

    USGS Publications Warehouse

    McBride, John H.; Pugin, Andre J.M.; Hatcher, Robert D., Jr.

    2007-01-01

    Orogen-scale décollements (detachment surfaces) are an enduring subject of investigation by geoscientists. Uncertainties remain as to how crustal convergence processes maintain the stresses necessary for development of low-angle fault surfaces above which huge slabs of rock are transported horizontally for tens to hundreds of kilometers. Seismic reflection profiles from the southern Appalachian crystalline core and several foreland fold-and-thrust belts provide useful comparisons with high-resolution shallow-penetration seismic reflection profiles acquired over the frontal zone of the Michigan lobe of the Wisconsinan ice sheet northwest of Chicago, Illinois. These profiles provide images of subhorizontal and overlapping dipping reflections that reveal a ramp-and-flat thrust system developed in poorly consolidated glacial till. The system is rooted in a master décollement at the top of bedrock. These 2–3 km long images contain analogs of images observed in seismic reflection profiles from orogenic belts, except that the scale of observation in the profiles in glacial materials is two orders of magnitude less. Whereas the décollement beneath the ice lobe thrust belt lies ∼70 m below thrusted anticlines having wavelengths of tens of meters driven by an advancing ice sheet, seismic images from overthrust terranes are related to lithospheric convergence that produces décollements traceable for thousands of kilometers at depths ranging from a few to over 10 km. Dual vergence or reversals in vergence (retrocharriage) that developed over abrupt changes in depth to the décollement can be observed at all scales. The strikingly similar images, despite the contrast in scale and driving mechanism, suggest a scale- and driving mechanism–independent behavior for décollement thrust systems. All these systems initially had the mechanical properties needed to produce very similar geometries with a compressional driving mechanism directed subparallel to Earth's surface. Subduction-related accretionary complexes also produce thrust systems with similar geometries in semi- to unconsolidated materials.

  17. Parental Beliefs about Emotions Are Associated with Early Adolescents' Independent and Interdependent Self-Construals

    ERIC Educational Resources Information Center

    Her, Pa; Dunsmore, Julie C.

    2011-01-01

    We assessed linkages between parents' beliefs and their children's self-construals with 60 7th and 8th graders. Early adolescents completed an open-ended, Self-Guide Questionnaire and an independent and interdependent reaction-time measure. The self-guide responses were coded for independent and interdependent traits. Parents reported beliefs…

  18. Space-independent xenon oscillations revisited

    SciTech Connect

    Rizwan-uddin

    1989-01-01

    Recently, various branches of engineering and science have seen a rapid increase in the number of dynamical analyses undertaken. This modern phenomenon often obscures the fact that such analyses were sometimes carried out even before the current trend began. Moreover, these earlier analyses, which even now seem very ingenious, were carried out at a time when the available information about dynamical systems was not as well disseminated as it is today. One such analysis, carried out in the early 1960s, showed the existence of stable limit cycles in a simple model for space-independent xenon dynamics in nuclear reactors. The authors, apparently unaware of the now well-known bifurcation theorem by Hopf, could not numerically discover unstable limit cycles, though they did find regions in parameter space where the fixed points are stable for small perturbations but unstable for very large perturbations. The analysis was carried out both analytically and numerically. As a tribute to these early nonlinear dynamicists in the field of nuclear engineering, in this paper, the Hopf theorem and its conclusions are briefly described, and then the solution of the space-independent xenon oscillation problem, obtained using the bifurcation analysis code BIFDD, is presented along with a discussion of the earlier results.

  19. Coded aperture compressive temporal imaging.

    PubMed

    Llull, Patrick; Liao, Xuejun; Yuan, Xin; Yang, Jianbo; Kittle, David; Carin, Lawrence; Sapiro, Guillermo; Brady, David J

    2013-05-01

    We use mechanical translation of a coded aperture for code division multiple access compression of video. We discuss the compressed video's temporal resolution and present experimental results for reconstructions of > 10 frames of temporal data per coded snapshot. PMID:23669910

  20. Independent component analysis: recent advances

    PubMed Central

    Hyvärinen, Aapo

    2013-01-01

    Independent component analysis is a probabilistic method for learning a linear transform of a random vector. The goal is to find components that are maximally independent and non-Gaussian (non-normal). Its fundamental difference to classical multi-variate statistical methods is in the assumption of non-Gaussianity, which enables the identification of original, underlying components, in contrast to classical methods. The basic theory of independent component analysis was mainly developed in the 1990s and summarized, for example, in our monograph in 2001. Here, we provide an overview of some recent developments in the theory since the year 2000. The main topics are: analysis of causal relations, testing independent components, analysing multiple datasets (three-way data), modelling dependencies between the components and improved methods for estimating the basic model. PMID:23277597
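
    The basic linear model the review builds on can be demonstrated with scikit-learn's FastICA, one of the classical estimators summarized in the 2001 monograph (a minimal sketch, not code from the review; the sources and mixing matrix are synthetic):

        # Recover two non-Gaussian sources from their linear mixtures.
        import numpy as np
        from sklearn.decomposition import FastICA

        t = np.linspace(0, 8, 2000)
        S = np.c_[np.sin(2 * t), np.sign(np.cos(3 * t))]   # two non-Gaussian sources
        A = np.array([[1.0, 0.5], [0.4, 1.0]])             # "unknown" mixing matrix
        X = S @ A.T                                        # observed mixtures

        ica = FastICA(n_components=2, random_state=0)
        S_hat = ica.fit_transform(X)   # estimated components (up to order and scale)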

  1. Technology for Independent Living: Sourcebook.

    ERIC Educational Resources Information Center

    Enders, Alexandra, Ed.

    This sourcebook provides information for the practical implementation of independent living technology in the everyday rehabilitation process. "Information Services and Resources" lists databases, clearinghouses, networks, research and development programs, toll-free telephone numbers, consumer protection caveats, selected publications, and…

  2. Independent Schools: Landscape and Learnings.

    ERIC Educational Resources Information Center

    Oates, William A.

    1981-01-01

    Examines American independent schools (parochial, southern segregated, and private institutions) in terms of their funding, expenditures, changing enrollment patterns, teacher-student ratios, and societal functions. Journal available from Daedalus Subscription Department, 1172 Commonwealth Ave., Boston, MA 02132. (AM)

  3. Reversibility and efficiency in coding protein information.

    PubMed

    Tamir, Boaz; Priel, Avner

    2010-12-21

    Why does the genetic code have a fixed length? Protein information is transferred by coding each amino acid using codons whose length equals 3 for all amino acids. Hence the most probable and the least probable amino acids get codewords of equal length. Moreover, the distributions of amino acids found in nature are not uniform and therefore the efficiency of such codes is sub-optimal. The origins of these apparently non-efficient codes remain unclear. In this paper we propose an a priori argument for the energy efficiency of such codes resulting from their reversibility, in contrast to their time inefficiency. Such codes are reversible in the sense that a primitive processor, reading three letters in each step, can always reverse its operation, undoing its process. We examine the codes for the distributions of amino acids that exist in nature and show that they could not be both time efficient and reversible. We investigate a family of Zipf-type distributions and present their efficient (non-fixed length) prefix code, their graphs, and the condition for their reversibility. We prove that for a large family of such distributions, if the code is time efficient, it could not be reversible. In other words, if pre-biotic processes demand reversibility, the protein code could not be time efficient. The benefits of reversibility are clear: reversible processes are adiabatic, namely, they dissipate a very small amount of energy. Such processes must be done slowly enough; therefore time efficiency is unimportant. It is reasonable to assume that early biochemical complexes were more inclined towards energy efficiency, where forward and backward processes were almost symmetrical. PMID:20868696
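
    The time-efficiency side of the argument can be illustrated numerically (our sketch, with an assumed Zipf exponent of 1 over 20 amino acids, not the paper's calculation): an optimal prefix code over the four-letter nucleotide alphabet has expected length within one symbol of the base-4 entropy, which for a skewed distribution falls below the fixed codon length of 3.

        # Compare the fixed 3-letter codon code with the Shannon bound on an
        # optimal variable-length prefix code over a 4-letter alphabet.
        import numpy as np

        n = 20                                   # number of amino acids
        p = 1.0 / np.arange(1, n + 1)            # Zipf-type weights (assumed exponent 1)
        p /= p.sum()

        H4 = -(p * np.log(p) / np.log(4)).sum()  # entropy in base-4 symbols
        print(f"base-4 entropy: {H4:.3f} symbols "
              f"(optimal prefix code expected length < {H4 + 1:.3f})")
        print("fixed codon length: 3 symbols")   # sub-optimal whenever H4 + 1 < 3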

  4. Spherical hashing: binary code embedding with hyperspheres.

    PubMed

    Heo, Jae-Pil; Lee, Youngwoon; He, Junfeng; Chang, Shih-Fu; Yoon, Sung-Eui

    2015-11-01

    Many binary code embedding schemes have been actively studied recently, since they can provide efficient similarity search and compact data representations suitable for handling large scale image databases. Existing binary code embedding techniques encode high-dimensional data by using hyperplane-based hashing functions. In this paper we propose a novel hypersphere-based hashing function, spherical hashing, to map more spatially coherent data points into a binary code compared to hyperplane-based hashing functions. We also propose a new binary code distance function, spherical Hamming distance, tailored for our hypersphere-based binary coding scheme, and design an efficient iterative optimization process to achieve both balanced partitioning for each hash function and independence between hashing functions. Furthermore, we generalize spherical hashing to support various similarity measures defined by kernel functions. Our extensive experiments show that our spherical hashing technique significantly outperforms state-of-the-art techniques based on hyperplanes across various benchmarks with sizes ranging from one million to 75 million GIST, BoW, and VLAD descriptors. The performance gains are consistent and large, up to 100 percent improvement over the second-best tested method. These results confirm the unique merits of using hyperspheres to encode proximity regions in high-dimensional spaces. Finally, our method is intuitive and easy to implement. PMID:26440269
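
    A minimal sketch of the embedding (our reading of the abstract; the pivot-optimization step is omitted, and the distance formula, Hamming distance normalized by the number of common set bits, is our assumption about the published formulation):

        # Bit k of a code is 1 iff the point lies inside pivot sphere k.
        import numpy as np

        def spherical_hash(X, centers, radii):
            """X: (n, d) data; centers: (m, d); radii: (m,). Returns (n, m) 0/1 codes."""
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
            return (d2 <= radii ** 2).astype(np.uint8)

        def spherical_hamming(b1, b2):
            # Assumed form: differing bits divided by common 1-bits.
            common = np.logical_and(b1, b2).sum()
            diff = np.logical_xor(b1, b2).sum()
            return diff / common if common else np.inf

        rng = np.random.default_rng(1)
        X = rng.normal(size=(5, 8))          # toy data
        C = rng.normal(size=(4, 8))          # 4 pivots -> 4-bit codes
        r = np.full(4, 4.0)                  # placeholder radii (normally optimized)
        codes = spherical_hash(X, C, r)
        print(codes, spherical_hamming(codes[0], codes[1]))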

  5. Code Seal v 1.0

    Energy Science and Technology Software Center (ESTSC)

    2009-12-11

    CodeSeal is a Sandia National Laboratories developed technology that provides a means of securely obfuscating finite state machines in a mathematically provable way. The technology was developed in order to provide a solution for anti-reverse engineering, assured execution, and integrity of execution. CodeSeal accomplishes these goals with the addition of the concept of a trust anchor, a small piece of trust integrated into the system, to the model of code obfuscation. Code obfuscation is an active area of academic research, but most findings have merely demonstrated that general obfuscation is impossible. By modifying the security model such that we may rely on the presence of a small, tamper-protected device, however, Sandia has developed an effective method for obfuscating code. An open publication describing the technology in more detail can be found at http://eprint.iacr.org/2008/184.pdf. Keywords: Independent Software/Hardware monitors, Use control, Supervisory Control And Data Acquisition (SCADA), Algorithm obfuscation.

  6. Effective Practice in the Design of Directed Independent Learning Opportunities

    ERIC Educational Resources Information Center

    Thomas, Liz; Jones, Robert; Ottaway, James

    2015-01-01

    This study, commissioned by the HEA and the QAA focuses on directed independent learning practices in UK higher education. It investigates what stakeholders (including academic staff and students) have found to be the most effective practices in the inception, design, quality assurance and enhancement of directed independent learning and explores…

  8. Codes with multi-level error-correcting capabilities

    NASA Technical Reports Server (NTRS)

    Lin, Mao-Chao; Lin, Shu

    1990-01-01

    In conventional channel coding, all the information symbols of a message are regarded as equally significant, and hence codes are devised to provide equal protection for each information symbol against channel errors. However, in some circumstances, some information symbols in a message are more significant than others. As a result, it is desirable to devise codes with multilevel error-correcting capabilities. In this paper, block codes with multilevel error-correcting capabilities, which are also known as unequal error protection (UEP) codes, are investigated. Several classes of UEP codes are constructed. One class of codes satisfies the Hamming bound on the number of parity-check symbols for systematic linear UEP codes and hence is optimal.

  9. High-Speed Soft-Decision Decoding of Two Reed-Muller Codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Uehara, Gregory T.

    1996-01-01

    In this research, we have proposed the (64, 40, 8) subcode of the third-order Reed-Muller (RM) code to NASA for high-speed satellite communications. This RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth efficient coded modulation system to achieve reliable bandwidth efficient data transmission. This report will summarize the key progress we have made toward achieving our eventual goal of implementing a decoder system based upon this code. In the first phase of study, we investigated the complexities of various sectionalized trellis diagrams for the proposed (64, 40, 8) RM subcode. We found a specific 8-section trellis diagram for this code which requires the least decoding complexity with a high possibility of achieving a decoding speed of 600 Mbits per second (Mbps). The combination of a large number of states and a high data rate will be made possible due to the utilization of a high degree of parallelism throughout the architecture. This trellis diagram will be presented and briefly described. In the second phase of study, which was carried out through the past year, we investigated circuit architectures to determine the feasibility of VLSI implementation of a high-speed Viterbi decoder based on this 8-section trellis diagram. We began to examine specific design and implementation approaches to implement a fully custom integrated circuit (IC) which will be a key building block for a decoder system implementation. The key results will be presented in this report. This report will be divided into three primary sections. First, we will briefly describe the system block diagram in which the proposed decoder is assumed to be operating, and present some of the key architectural approaches being used to implement the system at high speed. Second, we will describe details of the 8-section trellis diagram we found to best meet the trade-offs between chip and overall system complexity. The chosen approach implements the trellis for the (64, 40, 8) RM subcode with 32 independent sub-trellises. And third, we will describe results of our feasibility study on the implementation of such an IC chip in CMOS technology to implement one of these sub-trellises.

  11. Prioritized LT Codes

    NASA Technical Reports Server (NTRS)

    Woo, Simon S.; Cheng, Michael K.

    2011-01-01

    The original Luby Transform (LT) coding scheme is extended to account for data transmissions where some information symbols in a message block are more important than others. Prioritized LT codes provide unequal error protection (UEP) of data on an erasure channel by modifying the original LT encoder. The prioritized algorithm improves high-priority data protection without penalizing low-priority data recovery. Moreover, low-latency decoding is also obtained for high-priority data due to fast encoding. Prioritized LT codes only require a slight change in the original encoding algorithm, and no changes at all at the decoder. Hence, with a small complexity increase in the LT encoder, an improved UEP and low-decoding latency performance for high-priority data can be achieved. LT encoding partitions a data stream into fixed-sized message blocks each with a constant number of information symbols. To generate a code symbol from the information symbols in a message, the Robust-Soliton probability distribution is first applied in order to determine the number of information symbols to be used to compute the code symbol. Then, the specific information symbols are chosen uniform randomly from the message block. Finally, the selected information symbols are XORed to form the code symbol. The Prioritized LT code construction includes an additional restriction that code symbols formed by a relatively small number of XORed information symbols select some of these information symbols from the pool of high-priority data. Once high-priority data are fully covered, encoding continues with the conventional LT approach where code symbols are generated by selecting information symbols from the entire message block including all different priorities. Therefore, if code symbols derived from high-priority data experience an unusual high number of erasures, Prioritized LT codes can still reliably recover both high- and low-priority data. This hybrid approach decides not only "how to encode" but also "what to encode" to achieve UEP. Another advantage of the priority encoding process is that the majority of high-priority data can be decoded sooner since only a small number of code symbols are required to reconstruct high-priority data. This approach increases the likelihood that high-priority data is decoded first over low-priority data. The Prioritized LT code scheme achieves an improvement in high-priority data decoding performance as well as overall information recovery without penalizing the decoding of low-priority data, assuming high-priority data is no more than half of a message block. The cost is in the additional complexity required in the encoder. If extra computation resource is available at the transmitter, image, voice, and video transmission quality in terrestrial and space communications can benefit from accurate use of redundancy in protecting data with varying priorities.
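
    The encoding procedure described above can be sketched as follows (a simplified illustration of the idea, not the reported design: the Robust Soliton parameters, the low-degree threshold for the priority restriction, and the hi_covered flag are our own assumptions):

        # LT-style encoding with a priority restriction on low-degree code symbols.
        import math
        import random

        def robust_soliton(k, c=0.1, delta=0.5):
            """Return the Robust Soliton degree distribution over degrees 0..k."""
            R = c * math.log(k / delta) * math.sqrt(k)
            spike = min(k, max(1, int(round(k / R))))
            rho = [0.0, 1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
            tau = [0.0] * (k + 1)
            for d in range(1, spike):
                tau[d] = R / (d * k)
            tau[spike] += R * math.log(R / delta) / k
            Z = sum(rho) + sum(tau)
            return [(rho[d] + tau[d]) / Z for d in range(k + 1)]

        def encode_symbol(msg, dist, n_hi, hi_covered, rng=random):
            """XOR a degree-d subset of msg; low-degree symbols favour the high-priority prefix."""
            d = rng.choices(range(len(dist)), weights=dist)[0]
            picks = set()
            if d <= 3 and not hi_covered:        # "relatively small degree": threshold assumed
                picks.add(rng.randrange(n_hi))   # force one high-priority information symbol
            while len(picks) < d:
                picks.add(rng.randrange(len(msg)))
            symbol = 0
            for i in picks:
                symbol ^= msg[i]
            return symbol, sorted(picks)

        msg = list(range(16))                    # 16 information symbols, first 8 high-priority
        dist = robust_soliton(len(msg))
        print(encode_symbol(msg, dist, n_hi=8, hi_covered=False))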

  12. Photonic security system using spatial codes and remote coded coherent optical communications

    NASA Astrophysics Data System (ADS)

    Riza, Nabeel A.; Howlader, Mohammad M.; Madamopoulos, Nicholas

    1996-09-01

    A novel photonic security system is described using 2D spatial codes based on both optical phase and amplitude information. This security system consists of an optical interferometric encoding subsystem that rapidly reads and encodes the 2D complex-valued spatial code, forming a wideband frequency-modulated optical beam and a collinear optical reference beam. After appropriate coherence coding of this beam pair, the light is launched into a low probability of intercept communication channel such as an optical fiber or a narrow beamwidth free-space optical link. At the remote code receiving and data processing site, the received light beam pair is first coherently decoded. Then, a high-speed photodetector, via optical heterodyne detection, generates an encoded wideband radio frequency signal that contains the original 2D code. Decoding is implemented in parallel via two independent systems. One decoder uses a Fourier transforming lens to reconstruct an electronic image interferogram of the complex-valued user code. This image interferogram is sent to a high speed electronic image processor for verification purposes. The other decoder is a high speed coherent acousto-optic time integrating correlator that optically determines match-mismatch between the received encoded signal and the code signal generated by the electronic database. Additional security for the overall communication network is provided by using various keycodes, such as a time-varying keycode that determines the exact spatial beam scanning sequence required for both proper encoding and decoding of the 2D code information. This paper describes preliminary experiments using a simple 1D amplitude modulated spatial code.

  13. Error coding simulations

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1993-01-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal to noise needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
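
    As one concrete element of such a simulation, the 16-bit CRC recommended by CCSDS for error detection is commonly cited as the CCITT polynomial 0x1021 with an all-ones preset; the following is a minimal sketch under that assumption (verify parameters against the standard before relying on it):

        # Bitwise CRC-16 (polynomial 0x1021, initial register 0xFFFF, no reflection).
        def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
            for byte in data:
                crc ^= byte << 8
                for _ in range(8):
                    crc = ((crc << 1) ^ 0x1021 if crc & 0x8000 else crc << 1) & 0xFFFF
            return crc

        frame = b"telemetry frame payload"
        fcs = crc16_ccitt(frame)
        # Appending the check sequence makes the CRC of the whole frame zero.
        assert crc16_ccitt(frame + fcs.to_bytes(2, "big")) == 0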

  14. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas to provide qualitative measures of performance with simulations of pinhole array objects followed by a quantitative measure through simulation of a tilted edge and calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.

  15. Identifying personal microbiomes using metagenomic codes.

    PubMed

    Franzosa, Eric A; Huang, Katherine; Meadow, James F; Gevers, Dirk; Lemon, Katherine P; Bohannan, Brendan J M; Huttenhower, Curtis

    2015-06-01

    Community composition within the human microbiome varies across individuals, but it remains unknown if this variation is sufficient to uniquely identify individuals within large populations or stable enough to identify them over time. We investigated this by developing a hitting set-based coding algorithm and applying it to the Human Microbiome Project population. Our approach defined body site-specific metagenomic codes: sets of microbial taxa or genes prioritized to uniquely and stably identify individuals. Codes capturing strain variation in clade-specific marker genes were able to distinguish among 100s of individuals at an initial sampling time point. In comparisons with follow-up samples collected 30-300 d later, ∼30% of individuals could still be uniquely pinpointed using metagenomic codes from a typical body site; coincidental (false positive) matches were rare. Codes based on the gut microbiome were exceptionally stable and pinpointed >80% of individuals. The failure of a code to match its owner at a later time point was largely explained by the loss of specific microbial strains (at current limits of detection) and was only weakly associated with the length of the sampling interval. In addition to highlighting patterns of temporal variation in the ecology of the human microbiome, this work demonstrates the feasibility of microbiome-based identifiability, a result with important ethical implications for microbiome study design. The datasets and code used in this work are available for download from huttenhower.sph.harvard.edu/idability. PMID:25964341
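
    The flavor of the hitting-set-based coding can be conveyed with a greedy sketch (our own illustration with toy data; the published algorithm and datasets are at huttenhower.sph.harvard.edu/idability):

        # Build a code for one subject by repeatedly picking a feature the subject
        # carries that rules out the most remaining confounders -- a classic
        # greedy hitting-set heuristic.
        def greedy_code(target, others):
            """target: set of features; others: dict of subject id -> set of features."""
            confounders = set(others)
            code = []
            while confounders:
                best = max(target,
                           key=lambda f: sum(f not in others[o] for o in confounders))
                hit = {o for o in confounders if best not in others[o]}
                if not hit:
                    raise ValueError("target cannot be uniquely distinguished")
                code.append(best)
                confounders -= hit
            return code

        others = {"s2": {"a", "b"}, "s3": {"b", "c"}, "s4": {"a", "c"}}
        print(greedy_code({"a", "b", "d"}, others))   # ['d'] hits all three confounders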

  17. Scalable L-infinite coding of meshes.

    PubMed

    Munteanu, Adrian; Cernea, Dan C; Alecu, Alin; Cornelis, Jan; Schelkens, Peter

    2010-01-01

    The paper investigates the novel concept of local-error control in mesh geometry encoding. In contrast to traditional mesh-coding systems that use the mean-square error as target distortion metric, this paper proposes a new L-infinite mesh-coding approach, for which the target distortion metric is the L-infinite distortion. In this context, a novel wavelet-based L-infinite-constrained coding approach for meshes is proposed, which ensures that the maximum error between the vertex positions in the original and decoded meshes is lower than a given upper bound. Furthermore, the proposed system achieves scalability in L-infinite sense, that is, any decoding of the input stream will correspond to a perfectly predictable L-infinite distortion upper bound. An instantiation of the proposed L-infinite-coding approach is demonstrated for MESHGRID, which is a scalable 3D object encoding system, part of MPEG-4 AFX. In this context, the advantages of scalable L-infinite coding over L-2-oriented coding are experimentally demonstrated. One concludes that the proposed L-infinite mesh-coding approach guarantees an upper bound on the local error in the decoded mesh, it enables a fast real-time implementation of the rate allocation, and it preserves all the scalability features and animation capabilities of the employed scalable mesh codec. PMID:20224144
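
    The distinction between the two target metrics can be stated in a few lines of Python (a minimal sketch with synthetic vertices, not the MESHGRID codec itself): the L-2-oriented approach controls the mean-square error, while the L-infinite approach bounds the worst-case error on any vertex coordinate at every decoded rate.

        # Mean-square error versus L-infinite (maximum) distortion over mesh vertices.
        import numpy as np

        def l2_distortion(V, V_hat):
            return np.mean((V - V_hat) ** 2)

        def linf_distortion(V, V_hat):
            return np.max(np.abs(V - V_hat))

        V = np.random.default_rng(0).normal(size=(1000, 3))            # original vertices
        V_hat = V + np.random.default_rng(1).uniform(-0.01, 0.01, V.shape)
        print(l2_distortion(V, V_hat), linf_distortion(V, V_hat))      # L-infinite <= 0.01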

  18. Phase-coded pulse aperiodic transmitter coding

    NASA Astrophysics Data System (ADS)

    Virtanen, I. I.; Vierinen, J.; Lehtinen, M. S.

    2009-07-01

    Both ionospheric and weather radar communities have already adopted the method of transmitting radar pulses in an aperiodic manner when measuring moderately overspread targets. Among the users of the ionospheric radars, this method is called Aperiodic Transmitter Coding (ATC), whereas the weather radar users have adopted the term Simultaneous Multiple Pulse-Repetition Frequency (SMPRF). When probing the ionosphere at the carrier frequencies of the EISCAT Incoherent Scatter Radar facilities, the range extent of the detectable target is typically of the order of one thousand kilometers - about seven milliseconds - whereas the characteristic correlation time of the scattered signal varies from a few milliseconds in the D-region to only tens of microseconds in the F-region. If one is interested in estimating the scattering autocorrelation function (ACF) at time lags shorter than the F-region correlation time, the D-region must be considered as a moderately overspread target, whereas the F-region is a severely overspread one. Given the technical restrictions of the radar hardware, a combination of ATC and phase-coded long pulses is advantageous for this kind of target. We evaluate such an experiment under infinitely low signal-to-noise ratio (SNR) conditions using lag profile inversion. In addition, a qualitative evaluation under high-SNR conditions is performed by analysing simulated data. The results show that an acceptable estimation accuracy and a very good lag resolution in the D-region can be achieved with a pulse length long enough for simultaneous E- and F-region measurements with a reasonable lag extent. The new experiment design is tested with the EISCAT Tromsø VHF (224 MHz) radar. An example of a full D/E/F-region ACF from the test run is shown at the end of the paper.

  19. New Optimal Asymmetric Quantum Codes Derived from Negacyclic Codes

    NASA Astrophysics Data System (ADS)

    Chen, Jian-Zhang; Li, Jian-Ping; Lin, Jie

    2014-01-01

    The construction of quantum maximum-distance-separable (MDS) codes has been studied by many researchers for many years. Here, by using negacyclic codes, we construct two families of asymmetric quantum codes. The first family is the asymmetric quantum codes with parameters satisfying 0 ≤ t ≤ k ≤ (q-1)/2, where k and t are positive integers. The second one is the asymmetric quantum codes with parameters satisfying 1 ≤ t ≤ k ≤ (q-1)/2, where k and t are positive integers. Moreover, the constructed asymmetric quantum codes are optimal and different from the codes available in the literature.

  20. FAA Smoke Transport Code

    Energy Science and Technology Software Center (ESTSC)

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  1. Adaptation and visual coding

    PubMed Central

    Webster, Michael A.

    2011-01-01

    Visual coding is a highly dynamic process, continuously adapting to the current viewing context. The perceptual changes that result from adaptation to recently viewed stimuli remain a powerful and popular tool for analyzing sensory mechanisms and plasticity. Over the last decade, the footprints of this adaptation have been tracked to both higher and lower levels of the visual pathway and over a wider range of timescales, revealing that visual processing is much more adaptable than previously thought. This work has also revealed that the pattern of aftereffects is similar across many stimulus dimensions, pointing to common coding principles in which adaptation plays a central role. However, why visual coding adapts has yet to be fully answered. PMID:21602298

  2. Modulation and coding

    NASA Astrophysics Data System (ADS)

    Farrell, P. G.; Clark, A. P.

    1984-12-01

    Analog FDM/FM and digital TDM/PCM/PSK modulation schemes are compared. In both single access and multiple access cases, FDM/FM is superior, thereby lessening interest in the use of a transmultiplexer to convert existing FDM voice traffic. In SCPC systems, though FM has been employed, digital modulation schemes appear to offer considerable advantages due to their ability to use both bandwidth compression techniques and error detection and correction (EDC) coding. QPSK has been widely used, and its performance compares well with other digital carrier schemes when suitably band-limited. The use of EDC coding in satellites effectively compensates for low transmitter powers, so that coding gains in the 2-7 range are envisioned. TDMA can efficiently combine a wide range of different traffics and services. The ultimate degree of voice, data and video integration is noted to be obtainable from a packet-switched satellite system.

  3. SAC: Sheffield Advanced Code

    NASA Astrophysics Data System (ADS)

    Griffiths, Mike; Fedun, Viktor; Mumford, Stuart; Gent, Frederick

    2013-06-01

    The Sheffield Advanced Code (SAC) is a fully non-linear MHD code designed for simulations of linear and non-linear wave propagation in gravitationally strongly stratified magnetized plasma. It was developed primarily for the forward modelling of helioseismological processes and for the coupling processes in the solar interior, photosphere, and corona; it is built on the well-known VAC platform that allows robust simulation of the macroscopic processes in gravitationally stratified (non-)magnetized plasmas. The code has no limitations of simulation length in time imposed by complications originating from the upper boundary, nor does it require implementation of special procedures to treat the upper boundaries. SAC inherited its modular structure from VAC, thereby allowing modification to easily add new physics.

  4. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option will produce the most effective results.

  5. Coding capacity of complementary DNA strands.

    PubMed Central

    Casino, A; Cipollaro, M; Guerrini, A M; Mastrocinque, G; Spena, A; Scarlato, V

    1981-01-01

    A Fortran computer algorithm has been used to analyze the nucleotide sequence of several structural genes. The analysis performed on both coding and complementary DNA strands shows that whereas open reading frames shorter than 100 codons are randomly distributed on both DNA strands, open reading frames longer than 100 codons ("virtual genes") are significantly more frequent on the complementary DNA strand than on the coding one. These "virtual genes" were further investigated by looking at intron sequences, splicing points, signal sequences and by analyzing gene mutations. On the basis of this analysis coding and complementary DNA strands of several eukaryotic structural genes cannot be distinguished. In particular we suggest that the complementary DNA strand of the human epsilon-globin gene might indeed code for a protein. PMID:7015290
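
    The kind of scan the Fortran algorithm performs can be sketched in Python (our illustration; the standard start and stop codon conventions are assumed, and the 100-codon threshold follows the abstract):

        # Scan both strands in all three frames for open reading frames
        # (ATG to in-frame stop) longer than 100 codons.
        COMPLEMENT = str.maketrans("ACGT", "TGCA")
        STOPS = {"TAA", "TAG", "TGA"}

        def long_orfs(seq, min_codons=100):
            hits = []
            for strand, s in (("+", seq), ("-", seq.translate(COMPLEMENT)[::-1])):
                for frame in range(3):
                    start = None
                    for i in range(frame, len(s) - 2, 3):
                        codon = s[i:i + 3]
                        if codon == "ATG" and start is None:
                            start = i
                        elif codon in STOPS and start is not None:
                            if (i - start) // 3 >= min_codons:
                                hits.append((strand, frame, start, i + 3))
                            start = None
            return hits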

  6. Numerical MHD codes for modeling astrophysical flows

    NASA Astrophysics Data System (ADS)

    Koldoba, A. V.; Ustyugova, G. V.; Lii, P. S.; Comins, M. L.; Dyda, S.; Romanova, M. M.; Lovelace, R. V. E.

    2016-05-01

    We describe a Godunov-type magnetohydrodynamic (MHD) code based on the Miyoshi and Kusano (2005) solver which can be used to solve various astrophysical hydrodynamic and MHD problems. The energy equation is in the form of entropy conservation. The code has been implemented on several different coordinate systems: 2.5D axisymmetric cylindrical coordinates, 2D Cartesian coordinates, 2D plane polar coordinates, and fully 3D cylindrical coordinates. Viscosity and diffusivity are implemented in the code to control the accretion rate in the disk and the rate of penetration of the disk matter through the magnetic field lines. The code has been utilized for the numerical investigations of a number of different astrophysical problems, several examples of which are shown.

  7. Perceptually-Based Adaptive JPEG Coding

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Rosenholtz, Ruth; Null, Cynthia H. (Technical Monitor)

    1996-01-01

    An extension to the JPEG standard (ISO/IEC DIS 10918-3) allows spatial adaptive coding of still images. As with baseline JPEG coding, one quantization matrix applies to an entire image channel, but in addition the user may specify a multiplier for each 8 x 8 block, which multiplies the quantization matrix, yielding the new matrix for the block. MPEG 1 and 2 use much the same scheme, except there the multiplier changes only on macroblock boundaries. We propose a method for perceptual optimization of the set of multipliers. We compute the perceptual error for each block based upon DCT quantization error adjusted according to contrast sensitivity, light adaptation, and contrast masking, and pick the set of multipliers which yield maximally flat perceptual error over the blocks of the image. We investigate the bitrate savings due to this adaptive coding scheme and the relative importance of the different sorts of masking on adaptive coding.
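
    The block-adaptive quantization being optimized can be sketched as follows (a minimal illustration: the flat base matrix and the multiplier values are placeholders, and the perceptual-error model used to choose the multipliers is not shown):

        # Quantize an 8x8 DCT block with a per-block multiplier scaling the base matrix.
        import numpy as np
        from scipy.fftpack import dct

        def quantize_block(block, Q, m):
            """block: 8x8 level-shifted pixels; Q: 8x8 base matrix; m: block multiplier."""
            coeffs = dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")
            return np.round(coeffs / (m * Q)).astype(int)

        Q = np.full((8, 8), 16.0)          # toy flat base matrix (not the JPEG table)
        block = np.random.default_rng(0).normal(128, 20, (8, 8)) - 128
        for m in (0.5, 1.0, 2.0):          # larger m -> coarser quantization, fewer nonzeros
            print(m, np.count_nonzero(quantize_block(block, Q, m)))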

  8. New opportunities seen for independents

    SciTech Connect

    Adams, G.A.

    1990-10-22

    The collapse of gas and oil prices in the mid-1980s significantly reduced the number of independent exploration companies. At the same time, a fundamental shift occurred among major oil companies as they allocated their exploration budgets toward international operations and made major production purchases. Several large independents also embraced a philosophy of budget supplementation through joint venture partnership arrangements. This has created a unique and unusual window of opportunity for the smaller independents (defined for this article as exploration and production companies with a market value of less than $1 billion) to access the extensive and high quality domestic prospect inventories of the major and large independent oil and gas companies and to participate in the search for large reserve targets on attractive joint venture terms. Participation in these types of joint ventures, in conjunction with internally generated plays selected through the use of today's advanced technology (computer-enhanced, high-resolution seismic; horizontal drilling; etc.) and increasing prices for oil and natural gas, presents the domestic exploration-oriented independent with an attractive money-making opportunity for the 1990s.

  9. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720, AoI 13, as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical/analytical and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor-noise correlation model was developed from engine acoustic test results. This work provided several insights on potential approaches to reducing aircraft engine noise. Code development is described in this report, and those insights are discussed.

  10. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  11. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  12. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  13. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  14. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  15. Refractoriness Enhances Temporal Coding by Auditory Nerve Fibers

    PubMed Central

    Avissar, Michael; Wittig, John H.; Saunders, James C.

    2013-01-01

    A universal property of spiking neurons is refractoriness, a transient decrease in discharge probability immediately following an action potential (spike). The refractory period lasts only one to a few milliseconds, but has the potential to affect temporal coding of acoustic stimuli by auditory neurons, which are capable of submillisecond spike-time precision. Here this possibility was investigated systematically by recording spike times from chicken auditory nerve fibers in vivo while stimulating with repeated pure tones at characteristic frequency. Refractory periods were tightly distributed, with a mean of 1.58 ms. A statistical model was developed to recapitulate each fiber's responses and then used to predict the effect of removing the refractory period on a cell-by-cell basis for two largely independent facets of temporal coding: faithful entrainment of interspike intervals to the stimulus frequency and precise synchronization of spike times to the stimulus phase. The ratio of the refractory period to the stimulus period predicted the impact of refractoriness on entrainment and synchronization. For ratios less than ∼0.9, refractoriness enhanced entrainment and this enhancement was often accompanied by an increase in spike-time precision. At higher ratios, little or no change in entrainment or synchronization was observed. Given the tight distribution of refractory periods, the ability of refractoriness to improve temporal coding is restricted to neurons responding to low-frequency stimuli. Enhanced encoding of low frequencies likely affects sound localization and pitch perception in the auditory system, as well as perception in nonauditory sensory modalities, because all spiking neurons exhibit refractoriness. PMID:23637161
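
    The two facets of temporal coding analyzed above can be quantified along these lines (our sketch; the paper's exact entrainment definition may differ, so the half-period tolerance here is an assumption):

        # Vector strength for phase synchronization, and the fraction of
        # interspike intervals near the stimulus period as an entrainment index.
        import numpy as np

        def vector_strength(spike_times, freq):
            phases = 2 * np.pi * freq * np.asarray(spike_times)
            return np.abs(np.exp(1j * phases).mean())   # 1 = perfect phase locking

        def entrainment(spike_times, freq, tol=0.5):
            period = 1.0 / freq
            isis = np.diff(np.sort(spike_times))
            return np.mean(np.abs(isis - period) < tol * period)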

  16. Grid-free tree-code simulations of the plasma-material interaction region

    NASA Astrophysics Data System (ADS)

    Salmagne, C.; Reiter, D.; Gibbon, P.

    2014-11-01

    A fully kinetic grid-free model based on a Barnes-Hut tree code is used to self-consistently simulate a collisionless plasma bounded by two floating walls. The workhorse for simulating such plasma-wall transition layers is currently the PIC method. However, the present grid-free formulation provides a powerful independent tool to test it [1] and to possibly extend particle simulations towards collisional regimes in a more internally consistent way. Here, we use the grid-free massively parallel Barnes-Hut tree-code PEPC - a well-established tool for simulations of laser plasmas and astrophysical applications - to develop a 3D ab initio plasma-target interaction model. With our approach an electrostatic sheath naturally builds up within the first couple of Debye lengths close to the wall rather than being imposed as a prescribed boundary condition. We verified the code using analytic results [2] as well as 1D PIC simulations [3]. The model was then used to investigate the influence of inclined magnetic fields on the plasma-material interface. We used the code to study the correlation between the magnetic field angle and the angular distribution of incident particles.

  17. Enforcing the International Code of Marketing of Breast-milk Substitutes for Better Promotion of Exclusive Breastfeeding: Can Lessons Be Learned?

    PubMed

    Barennes, Hubert; Slesak, Guenther; Goyet, Sophie; Aaron, Percy; Srour, Leila M

    2016-02-01

    Exclusive breastfeeding, one of the best natural resources, needs protection and promotion. The International Code of Marketing of Breast-milk Substitutes (the Code), which aims to prevent the undermining of breastfeeding by formula advertising, faces implementation challenges. We reviewed frequently overlooked challenges and obstacles that the Code is facing worldwide, but particularly in Southeast Asia. Drawing lessons from various countries where we work, and following the example of successful public health interventions, we discussed legislation, enforcement, and experiences that are needed to successfully implement the Code. Successful holistic approaches that have strengthened the Code need to be scaled up. Community-based actions and peer-to-peer promotions have proved successful. Legislation without stringent enforcement and sufficient penalties is ineffective. The public needs education about the benefits and ways and means to support breastfeeding. It is crucial to combine strong political commitment and leadership with strict national regulations, definitions, and enforcement. National breastfeeding committees, with the authority to improve regulations, investigate violations, and enforce the laws, must be established. Systematic monitoring and reporting are needed to identify companies, individuals, intermediaries, and practices that infringe on the Code. Penalizing violators is crucial. Managers of multinational companies must be held accountable for international violations, and international legislative enforcement needs to be established. Further measures should include improved regulations to protect the breastfeeding mother: large-scale education campaigns; strong penalties for Code violators; exclusion of the formula industry from nutrition, education, and policy roles; supportive legal networks; and independent research of interventions supporting breastfeeding. PMID:26416439

  18. Channel-independent and sensor-independent stimulus representations

    NASA Astrophysics Data System (ADS)

    Levin, David N.

    2005-11-01

    This paper shows how a machine, which observes stimuli through an uncharacterized, uncalibrated channel and sensor, can glean machine-independent information (i.e., channel- and sensor-independent information) about the stimuli. This is possible if the following two conditions are satisfied by the observed stimulus and by the observing device, respectively: (1) the stimulus' trajectory in the space of all possible configurations has a well-defined local velocity covariance matrix; (2) the observing device's sensor state is invertibly related to the stimulus state. The first condition guarantees that the statistical properties of the stimulus time series endow the stimulus configuration space with a differential geometric structure (a metric and parallel transfer procedure), which can then be used to represent relative stimulus configurations in a coordinate-system-independent manner. This requirement is satisfied by a large variety of physical systems, and, in general, it is expected to be satisfied by stimulus trajectories that densely cover stimulus state space and that have velocity distributions varying smoothly across that space. The second condition implies that the machine defines a specific coordinate system on the stimulus state space, with the nature of that coordinate system depending on the machine's channels and detectors. Thus, machines with different channels and sensors "see" the same stimulus trajectory through state space, but in different machine-specific coordinate systems. It is shown that this requirement is almost certainly satisfied by any device that measures more than 2d independent properties of the stimulus, where d is the number of stimulus degrees of freedom. Taken together, the two conditions guarantee that the observing device can record the stimulus time series in its machine-specific coordinate system and then derive coordinate-system-independent (and, therefore, machine-independent) representations of relative stimulus configurations. The resulting description is an "inner" property of the stimulus time series in the sense that it does not depend on extrinsic factors such as the observer's choice of a coordinate system in which the stimulus is viewed (i.e., the observer's choice of channels and sensors). In other words, the resulting description is an intrinsic property of the evolution of the "real" stimulus that is "out there" broadcasting energy to the observer. This methodology is illustrated with analytic examples and with a numerically simulated experiment. In an intelligent sensory device, this kind of representation "engine" could function as a "front end" that passes channel- and sensor-independent stimulus representations to a pattern recognition module. After a pattern recognizer has been trained in one of these devices, it could be used without a change in other devices having different channels and sensors.
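
    The core invariance can be checked numerically in the linearized case. In this sketch (an illustration only, with an arbitrary linear map standing in for a second device's sensor), the squared length of a small displacement, measured with the inverse velocity covariance as the metric, comes out the same in both coordinate systems.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    # stimulus velocities with some arbitrary covariance structure (device A coords)
    vel = rng.normal(size=(5000, 2)) @ np.array([[1.0, 0.4], [0.0, 0.7]])

    J = np.array([[2.0, 1.0], [0.5, 1.5]])   # another device's (linearized) sensor map
    vel_dev = vel @ J.T                      # the same velocities in device B coords

    C = np.cov(vel.T)                        # velocity covariance, device A
    C_dev = np.cov(vel_dev.T)                # velocity covariance, device B

    dx = np.array([0.01, -0.02])             # a small stimulus displacement (device A)
    dy = J @ dx                              # the same displacement (device B)

    # squared length with the inverse covariance as metric: identical in both frames
    len_a = dx @ np.linalg.inv(C) @ dx
    len_b = dy @ np.linalg.inv(C_dev) @ dy
    print(np.isclose(len_a, len_b))          # True: a machine-independent quantity
    ```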

  19. Deterministic and unambiguous dense coding

    SciTech Connect

    Wu Shengjun; Cohen, Scott M.; Sun Yuqing; Griffiths, Robert B.

    2006-04-15

    Optimal dense coding using a partially-entangled pure state of Schmidt rank D̄ and a noiseless quantum channel of dimension D is studied both in the deterministic case where at most L_d messages can be transmitted with perfect fidelity, and in the unambiguous case where when the protocol succeeds (probability τ_x) Bob knows for sure that Alice sent message x, and when it fails (probability 1-τ_x) he knows it has failed. Alice is allowed any single-shot (one use) encoding procedure, and Bob any single-shot measurement. For D̄ ≤ D a bound is obtained for L_d in terms of the largest Schmidt coefficient of the entangled state, and is compared with published results by Mozes et al. [Phys. Rev. A 71, 012311 (2005)]. For D̄ > D it is shown that L_d is strictly less than D² unless D̄ is an integer multiple of D, in which case uniform (maximal) entanglement is not needed to achieve the optimal protocol. The unambiguous case is studied for D̄ ≤ D, assuming τ_x > 0 for a set of D̄D messages, and a bound is obtained for the average ⟨1/τ⟩. A bound on the average ⟨τ⟩ requires an additional assumption of encoding by isometries (unitaries when D̄ = D) that are orthogonal for different messages. Both bounds are saturated when τ_x is a constant independent of x, by a protocol based on one-shot entanglement concentration. For D̄ > D it is shown that (at least) D² messages can be sent unambiguously. Whether unitary (isometric) encoding suffices for optimal protocols remains a major unanswered question, both for our work and for previous studies of dense coding using partially-entangled states, including noisy (mixed) states.
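
    For orientation, the D̄ = D = 2 maximally entangled special case is the textbook superdense-coding protocol, which a few lines of linear algebra can verify. This is background to, not the content of, the partially-entangled analysis above.

    ```python
    import numpy as np

    I2 = np.eye(2)
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    Z = np.diag([1.0, -1.0])
    bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)     # |00> + |11>

    encodings = {"00": I2, "01": X, "10": Z, "11": X @ Z}  # Alice's local operations
    bell_basis = [np.kron(U, I2) @ bell for U in encodings.values()]

    for i, (msg, U) in enumerate(encodings.items()):
        state = np.kron(U, I2) @ bell                      # Alice acts on her qubit only
        probs = [abs(b @ state) ** 2 for b in bell_basis]  # Bob's Bell-basis measurement
        assert np.argmax(probs) == i and max(probs) > 0.999
    print("2 classical bits recovered per transmitted qubit")
    ```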

  20. The Sign Rule and Beyond: Boundary Effects, Flexibility, and Noise Correlations in Neural Population Codes

    PubMed Central

    Hu, Yu; Zylberberg, Joel; Shea-Brown, Eric

    2014-01-01

    Over repeat presentations of the same stimulus, sensory neurons show variable responses. This “noise” is typically correlated between pairs of cells, and a question with rich history in neuroscience is how these noise correlations impact the population's ability to encode the stimulus. Here, we consider a very general setting for population coding, investigating how information varies as a function of noise correlations, with all other aspects of the problem – neural tuning curves, etc. – held fixed. This work yields unifying insights into the role of noise correlations. These are summarized in the form of theorems, and illustrated with numerical examples involving neurons with diverse tuning curves. Our main contributions are as follows. (1) We generalize previous results to prove a sign rule (SR) — if noise correlations between pairs of neurons have opposite signs vs. their signal correlations, then coding performance will improve compared to the independent case. This holds for three different metrics of coding performance, and for arbitrary tuning curves and levels of heterogeneity. This generality is true for our other results as well. (2) As also pointed out in the literature, the SR does not provide a necessary condition for good coding. We show that a diverse set of correlation structures can improve coding. Many of these violate the SR, as do experimentally observed correlations. There is structure to this diversity: we prove that the optimal correlation structures must lie on boundaries of the possible set of noise correlations. (3) We provide a novel set of necessary and sufficient conditions, under which the coding performance (in the presence of noise) will be as good as it would be if there were no noise present at all. PMID:24586128
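
    The sign rule itself is easy to reproduce for a two-neuron linear Gaussian code. In the sketch below (illustrative numbers only), both neurons' mean responses increase with the stimulus, i.e., positive signal correlation, so negative noise correlation raises the discriminability d^2 = dmu' inv(Sigma) dmu above the independent case, and positive noise correlation lowers it.

    ```python
    import numpy as np

    dmu = np.array([1.0, 1.0])        # both rates rise with the stimulus: signal corr > 0

    def d_squared(rho):
        """Linear discriminability for unit-variance noise with correlation rho."""
        Sigma = np.array([[1.0, rho], [rho, 1.0]])
        return dmu @ np.linalg.solve(Sigma, dmu)

    for rho in (-0.3, 0.0, +0.3):
        print(rho, round(d_squared(rho), 3))   # -0.3: 2.857 > 0.0: 2.0 > +0.3: 1.538
    ```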

  1. International efforts in independent living.

    PubMed

    Tate, D G; Jarvis, R L; Juhr, G D

    1979-10-01

    Several nations currently share a major concern for provision of services to severely disabled persons. The focus of this concern has been upon those services which promote maximum integration and independence of severely disabled individuals in the community. Recent efforts in this direction by governments, private citizens, and handicapped-consumer organizations in Sweden, The Netherlands, Denmark, England, Canada, and Australia are discussed. Developments in housing and environmental modification, transportation, and self-help training are reviewed for their potential interest to advocates and practitioners of rehabilitation for independent living. PMID:496600

  2. The Independent Technical Analysis Process

    SciTech Connect

    Duberstein, Corey A.; Ham, Kenneth D.; Dauble, Dennis D.; Johnson, Gary E.

    2007-04-13

    The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. In the past, regional parties have interacted with a single entity, the Fish Passage Center, to access the data, analyses, and coordination related to fish passage. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities.

  3. Regressive evolution of an eye pigment gene in independently evolved eyeless subterranean diving beetles.

    PubMed

    Leys, Remko; Cooper, Steven J B; Strecker, Ulrike; Wilkens, Horst

    2005-12-22

    Regressive evolution, the reduction or total loss of non-functional characters, is a fairly common evolutionary phenomenon in subterranean taxa. However, the genetic basis of regressive evolution is not well understood. Here we investigate the molecular evolution of the eye pigment gene cinnabar in several independently evolved lineages of subterranean water beetles using maximum likelihood analyses. We found that in eyeless lineages cinnabar has an increased rate of sequence evolution, as well as mutations leading to frame shifts and stop codons, indicative of pseudogenes. These results are consistent with the hypothesis that regressive evolution of eyes proceeds by random mutations, in the absence of selection, that ultimately lead to the loss of gene function in protein-coding genes specific to the eye pathway. PMID:17148242

  4. Environmental Fluid Dynamics Code

    EPA Science Inventory

    The Environmental Fluid Dynamics Code (EFDC)is a state-of-the-art hydrodynamic model that can be used to simulate aquatic systems in one, two, and three dimensions. It has evolved over the past two decades to become one of the most widely used and technically defensible hydrodyn...

  5. Code Optimization Techniques

    SciTech Connect

    MAGEE,GLEN I.

    2000-08-03

    Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data will be lost. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
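
    One workhorse optimization of the kind described here is replacing bitwise GF(2^8) multiplication, the inner loop of Reed-Solomon encoding, with exponent/log table lookups. The self-contained sketch below uses the common field polynomial 0x11d; it illustrates the technique and is not the AURA project's implementation.

    ```python
    # Galois-field GF(2^8) multiply: bitwise reference vs. table lookup.
    PRIM = 0x11D  # primitive polynomial x^8 + x^4 + x^3 + x^2 + 1

    def gf_mul_slow(a, b):
        """Carry-less multiply with modular reduction (shift-and-xor)."""
        result = 0
        while b:
            if b & 1:
                result ^= a
            a <<= 1
            if a & 0x100:
                a ^= PRIM
            b >>= 1
        return result

    # Precompute exp/log tables once; each multiply then costs two lookups.
    GF_EXP = [0] * 512
    GF_LOG = [0] * 256
    x = 1
    for i in range(255):
        GF_EXP[i] = x
        GF_LOG[x] = i
        x = gf_mul_slow(x, 2)          # generator alpha = 2 for this polynomial
    for i in range(255, 512):
        GF_EXP[i] = GF_EXP[i - 255]    # wraparound avoids a modulo per multiply

    def gf_mul_fast(a, b):
        if a == 0 or b == 0:
            return 0
        return GF_EXP[GF_LOG[a] + GF_LOG[b]]

    assert all(gf_mul_slow(a, b) == gf_mul_fast(a, b)
               for a in range(256) for b in range(256))
    ```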

  6. Multiple trellis coded modulation

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K. (Inventor); Divsalar, Dariush (Inventor)

    1990-01-01

    A technique for designing trellis codes to minimize bit error performance for a fading channel. The invention provides a criteria which may be used in the design of such codes which is significantly different from that used for average white Gaussian noise channels. The method of multiple trellis coded modulation of the present invention comprises the steps of: (a) coding b bits of input data into s intermediate outputs; (b) grouping said s intermediate outputs into k groups of s_i intermediate outputs each, where the summation of all s_i's is equal to s and k is equal to at least 2; (c) mapping each of said k groups of intermediate outputs into one of a plurality of symbols in accordance with a plurality of modulation schemes, one for each group such that the first group is mapped in accordance with a first modulation scheme and the second group is mapped in accordance with a second modulation scheme; and (d) outputting each of said symbols to provide k output symbols for each b bits of input data.
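
    Steps (a)-(d) can be made concrete with a toy walk-through. The parameters b=4, s=6, k=2 and the parity "code" below are invented for illustration and are not the patented design; the second 8-PSK constellation is offset to stand in for "a second modulation scheme".

    ```python
    import numpy as np

    def encode(bits):                      # (a) toy code: b=4 input bits -> s=6 outputs
        p1 = bits[0] ^ bits[1]             # two parity bits, purely illustrative
        p2 = bits[2] ^ bits[3]
        return bits[:2] + [p1] + bits[2:] + [p2]

    def to_symbols(coded):
        g1, g2 = coded[:3], coded[3:]      # (b) k=2 groups of s_i=3 outputs each
        m1 = int("".join(map(str, g1)), 2) # (c) map each group with its own scheme
        m2 = int("".join(map(str, g2)), 2)
        s1 = np.exp(1j * 2 * np.pi * m1 / 8)               # 8-PSK, scheme 1
        s2 = np.exp(1j * (2 * np.pi * m2 / 8 + np.pi / 8)) # offset 8-PSK, scheme 2
        return s1, s2                      # (d) k=2 output symbols per b input bits

    print(to_symbols(encode([1, 0, 1, 1])))
    ```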

  7. Sharing the Code.

    ERIC Educational Resources Information Center

    Olsen, Florence

    2003-01-01

    Colleges and universities are beginning to consider collaborating on open-source-code projects as a way to meet critical software and computing needs. Points out the attractive features of noncommercial open-source software and describes some examples in use now, especially for the creation of Web infrastructure. (SLD)

  8. Electrical Circuit Simulation Code

    Energy Science and Technology Software Center (ESTSC)

    2001-08-09

    Massively-Parallel Electrical Circuit Simulation Code. CHILESPICE is a massively-parallel distributed-memory electrical circuit simulation tool that contains many enhanced radiation, time-based, and thermal features and models. Large scale electronic circuit simulation. Shared memory, parallel processing, enhanced convergence. Sandia specific device models.

  9. Pharynx Surgery Codes

    Cancer.gov

    Pharynx Tonsil C090–C099, Oropharynx C100–C109, Nasopharynx C110–C119 Pyriform Sinus C129, Hypopharynx C130–C139, Pharynx C140 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery

  10. Colon Surgery Codes

    Cancer.gov

    Colon C180–C189 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Code removal/surgical ablation of single or multiple liver metastases under the data item Surgical Procedure/Other Site (NAACCR Item

  11. Lung Surgery Codes

    Cancer.gov

    Lung C340–C349 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 19 Local tumor destruction or excision, NOS Unknown whether a specimen was

  12. Kidney Surgery Codes

    Cancer.gov

    Kidney, Renal Pelvis, and Ureter Kidney C649, Renal Pelvis C659, Ureter C669 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor

  13. Colon Surgery Codes

    Cancer.gov

    Colon C180–C189 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Code removal/surgical ablation of single or multiple liver metastases under the data item Surgical Procedure/Other Site (NAACCR Item

  14. Bladder Surgery Codes

    Cancer.gov

    Bladder C670–C679 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor destruction, NOS 11 Photodynamic therapy (PDT) 12

  15. Rectum Surgery Codes

    Cancer.gov

    Rectum C209 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Code removal/surgical ablation of single or multiple liver metastases under the data item Surgical Procedure/Other Site (NAACCR Item #1294)

  16. Rectosigmoid Surgery Codes

    Cancer.gov

    Rectosigmoid C199 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Code removal/surgical ablation of single or multiple liver metastases under the data item Surgical Procedure/Other Site (NAACCR Item

  17. Stomach Surgery Codes

    Cancer.gov

    Stomach C160–C169 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor destruction, NOS 11 Photodynamic therapy (PDT) 12 Electrocautery;

  18. Dress Codes and Uniforms.

    ERIC Educational Resources Information Center

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices that adults agree with in their choice of school dress. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  19. Bladder Coding Guidelines

    Cancer.gov

    Coding Guidelines BLADDER C670–C679 Primary Site C670 Trigone of bladder Base of bladder Floor Below interureteric ridge (interureteric crest, or interureteric fold) C671 Dome of bladder Vertex Roof Vault C672 Lateral wall of bladder Right

  20. Larynx Surgery Codes

    Cancer.gov

    Larynx C320–C329 (Except for M9727, 9733, 9741-9742, 9764-9809, 9832, 9840-9931, 9945-9946, 9950-9967, 9975-9992) Codes 00 None; no surgery of primary site; autopsy ONLY 10 Local tumor destruction, NOS 11 Photodynamic therapy (PDT) 12 Electrocautery;

  1. The revised genetic code

    NASA Astrophysics Data System (ADS)

    Ninio, Jacques

    1990-03-01

    Recent findings on the genetic code are reviewed, including selenocysteine usage, deviations in the assignments of sense and nonsense codons, RNA editing, natural ribosomal frameshifts and non-orthodox codon-anticodon pairings. A multi-stage codon reading process is presented.

  2. Code of Ethics.

    ERIC Educational Resources Information Center

    Association of College Unions-International, Bloomington, IN.

    The code of ethics for the college union and student activities professional is presented by the Association of College Unions-International. The preamble identifies the objectives of the college union as providing campus community centers and social programs that enhance the quality of life for members of the academic community. Ethics for…

  3. The Redox Code

    PubMed Central

    Jones, Dean P.

    2015-01-01

    Abstract Significance: The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O2 and H2O2 contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Recent Advances: Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Critical Issues: Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Future Directions: Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine. Antioxid. Redox Signal. 23, 734–746. PMID:25891126

  4. Student Dress Codes.

    ERIC Educational Resources Information Center

    Uerling, Donald F.

    School officials see a need for regulations that prohibit disruptive and inappropriate forms of expression and attire; students see these regulations as unwanted restrictions on their freedom. This paper reviews court litigation involving constitutional limitations on school authority, dress and hair codes, state law constraints, and school…

  5. Dress Codes. Legal Brief.

    ERIC Educational Resources Information Center

    Zirkel, Perry A.

    2000-01-01

    As illustrated by two recent decisions, the courts in the past decade have demarcated wide boundaries for school officials considering dress codes, whether in the form of selective prohibitions or required uniforms. Administrators must warn the community, provide legitimate justification and reasonable clarity, and comply with state law. (MLH)

  6. Coarse coding and discourse comprehension in adults with right hemisphere brain damage

    PubMed Central

    Tompkins, Connie A.; Scharp, Victoria L.; Meigh, Kimberly M.; Fassbinder, Wiltrud

    2009-01-01

    Background Various investigators suggest that some discourse-level comprehension difficulties in adults with right hemisphere brain damage (RHD) have a lexical-semantic basis. As words are processed, the intact right hemisphere arouses and sustains activation of a wide-ranging network of secondary or peripheral meanings and features—a phenomenon dubbed “coarse coding”. Coarse coding impairment has been postulated to underpin some prototypical RHD comprehension deficits, such as difficulties with nonliteral language interpretation, discourse integration, some kinds of inference generation, and recovery when a reinterpretation is needed. To date, however, no studies have addressed the hypothesised link between coarse coding deficit and discourse comprehension in RHD. Aims The current investigation examined whether coarse coding was related to performance on two measures of narrative comprehension in adults with RHD. Methods & Procedures Participants were 32 adults with unilateral RHD from cerebrovascular accident, and 38 adults without brain damage. Coarse coding was operationalised as poor activation of peripheral/weakly related semantic features of words. For the coarse coding assessment, participants listened to spoken sentences that ended in a concrete noun. Each sentence was followed by a spoken target phoneme string. Targets were subordinate semantic features of the sentence-final nouns that were incompatible with their dominant mental representations (e.g., “rotten” for apple). Targets were presented at two post-noun intervals. A lexical decision task was used to gauge both early activation and maintenance of activation of these weakly related semantic features. One of the narrative tasks assessed comprehension of implied main ideas and details, while the other indexed high-level inferencing and integration. Both comprehension tasks were presented auditorily. For all tasks, accuracy of performance was the dependent measure. Correlations were computed within the RHD group between both the early and late coarse coding measures and the two discourse measures. Additionally, ANCOVA and independent t-tests were used to compare both early and sustained coarse coding in subgroups of good and poor RHD comprehenders. Outcomes & Results The group with RHD was less accurate than the control group on all measures. The finding of coarse coding impairment (difficulty activating/sustaining activation of a word’s peripheral features) may appear to contradict prior evidence of RHD suppression deficit (prolonged activation for context-inappropriate meanings of words). However, the sentence contexts in this study were unbiased and thus did not provide an appropriate test of suppression function. Correlations between coarse coding and the discourse measures were small and nonsignificant. There were no differences in coarse coding between RHD comprehension subgroups on the high-level inferencing task. There was also no distinction in early coarse coding for subgroups based on comprehension of implied main ideas and details. But for these same subgroups, there was a difference in sustained coarse coding. Poorer RHD comprehenders of implied information from discourse were also poorer at maintaining activation for semantically distant features of concrete nouns. Conclusions This study provides evidence of a variant of the postulated link between coarse coding and discourse comprehension in RHD. 
Specifically, adults with RHD who were particularly poor at sustaining activation for peripheral semantic features of nouns were also relatively poor comprehenders of implied information from narratives. PMID:20037670

  7. Haptic Tracking Permits Bimanual Independence

    ERIC Educational Resources Information Center

    Rosenbaum, David A.; Dawson, Amanda A.; Challis, John H.

    2006-01-01

    This study shows that in a novel task--bimanual haptic tracking--neurologically normal human adults can move their 2 hands independently for extended periods of time with little or no training. Participants lightly touched buttons whose positions were moved either quasi-randomly in the horizontal plane by 1 or 2 human drivers (Experiment 1), in…

  8. Strategic Planning for Independent Schools.

    ERIC Educational Resources Information Center

    Stone, Susan C.

    This manual is intended to serve independent schools beginning strategic planning methods. Chapter 1, "The Case for Strategic Planning," suggests replacing the term "long range planning" with the term "strategic planning," which emphasizes change. The strategic planning and policy development process begins with careful organization to ensure…

  10. Boston: Cradle of American Independence

    ERIC Educational Resources Information Center

    Community College Journal, 2004

    2004-01-01

    The 2005 American Association of Community Colleges Annual Convention will be held April 6-9 in Boston. While thoroughly modern, the iconic city's identity is firmly rooted in the past. As the cradle of American independence, Boston's long history is an integral part of the American fabric. Adams, Revere, Hancock are more than historical figures;…

  12. Selective Influence through Conditional Independence.

    ERIC Educational Resources Information Center

    Dzhafarov, Ehtibar N.

    2003-01-01

    Presents a generalization and improvement for the definition proposed by E. Dzhafarov (2001) for selectiveness in the dependence of several random variables on several (sets of) external factors. This generalization links the notion of selective influence with that of conditional independence. (SLD)

  13. 10 Questions about Independent Reading

    ERIC Educational Resources Information Center

    Truby, Dana

    2012-01-01

    Teachers know that establishing a robust independent reading program takes more than giving kids a little quiet time after lunch. But how do they set up a program that will maximize their students' gains? Teachers have to know their students' reading levels inside and out, help them find just-right books, and continue to guide them during…

  14. High rate concatenated coding systems using bandwidth efficient trellis inner codes

    NASA Technical Reports Server (NTRS)

    Deng, Robert H.; Costello, Daniel J., Jr.

    1989-01-01

    High-rate concatenated coding systems with bandwidth-efficient trellis inner codes and Reed-Solomon (RS) outer codes are investigated for application in high-speed satellite communication systems. Two concatenated coding schemes are proposed. In one the inner code is decoded with soft-decision Viterbi decoding, and the outer RS code performs error-correction-only decoding (decoding without side information). In the other, the inner code is decoded with a modified Viterbi algorithm, which produces reliability information along with the decoded output. In this algorithm, path metrics are used to estimate the entire information sequence, whereas branch metrics are used to provide reliability information on the decoded sequence. This information is used to erase unreliable bits in the decoded output. An errors-and-erasures RS decoder is then used for the outer code. The two schemes have been proposed for high-speed data communication on NASA satellite channels. The rates considered are at least double those used in current NASA systems, and the results indicate that high system reliability can still be achieved.
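
    The outer decoder's gain from erasures follows from the standard Reed-Solomon decoding condition: with r = n - k parity symbols, any combination of E errors and S erasures is correctable when 2E + S ≤ r, so flagging an unreliable symbol as an erasure costs half as much as letting it become an error. A one-line check, with example code parameters only:

    ```python
    # Errors-and-erasures decodability for an RS(n, k) outer code.
    def rs_decodable(n, k, errors, erasures):
        return 2 * errors + erasures <= n - k

    # e.g. RS(255, 223): 32 parity symbols
    print(rs_decodable(255, 223, errors=10, erasures=12))  # True  (2*10 + 12 = 32)
    print(rs_decodable(255, 223, errors=10, erasures=13))  # False (2*10 + 13 = 33)
    ```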

  15. High rate concatenated coding systems using bandwidth efficient trellis inner codes

    NASA Astrophysics Data System (ADS)

    Deng, Robert H.; Costello, Daniel J., Jr.

    1989-05-01

    High-rate concatenated coding systems with bandwidth-efficient trellis inner codes and Reed-Solomon (RS) outer codes are investigated for application in high-speed satellite communication systems. Two concatenated coding schemes are proposed. In one the inner code is decoded with soft-decision Viterbi decoding, and the outer RS code performs error-correction-only decoding (decoding without side information). In the other, the inner code is decoded with a modified Viterbi algorithm, which produces reliability information along with the decoded output. In this algorithm, path metrics are used to estimate the entire information sequence, whereas branch metrics are used to provide reliability information on the decoded sequence. This information is used to erase unreliable bits in the decoded output. An errors-and-erasures RS decoder is then used for the outer code. The two schemes have been proposed for high-speed data communication on NASA satellite channels. The rates considered are at least double those used in current NASA systems, and the results indicate that high system reliability can still be achieved.

  16. High-Fidelity Coding with Correlated Neurons

    PubMed Central

    da Silveira, Rava Azeredo; Berry, Michael J.

    2014-01-01

    Positive correlations in the activity of neurons are widely observed in the brain. Previous studies have shown these correlations to be detrimental to the fidelity of population codes, or at best marginally favorable compared to independent codes. Here, we show that positive correlations can enhance coding performance by astronomical factors. Specifically, the probability of discrimination error can be suppressed by many orders of magnitude. Likewise, the number of stimuli encoded—the capacity—can be enhanced more than tenfold. These effects do not necessitate unrealistic correlation values, and can occur for populations with a few tens of neurons. We further show that both effects benefit from heterogeneity commonly seen in population activity. Error suppression and capacity enhancement rest upon a pattern of correlation. Tuning of one or several effective parameters can yield a limit of perfect coding: the corresponding pattern of positive correlation leads to a ‘lock-in’ of response probabilities that eliminates variability in the subspace relevant for stimulus discrimination. We discuss the nature of this pattern and we suggest experimental tests to identify it. PMID:25412463

  17. TACO: a finite element heat transfer code

    SciTech Connect

    Mason, W.E. Jr.

    1980-02-01

    TACO is a two-dimensional implicit finite element code for heat transfer analysis. It can perform both linear and nonlinear analyses and can be used to solve either transient or steady state problems. Either plane or axisymmetric geometries can be analyzed. TACO has the capability to handle time or temperature dependent material properties and materials may be either isotropic or orthotropic. A variety of time and temperature dependent loadings and boundary conditions are available including temperature, flux, convection, and radiation boundary conditions and internal heat generation. Additionally, TACO has some specialized features such as internal surface conditions (e.g., contact resistance), bulk nodes, enclosure radiation with view factor calculations, and chemical reactive kinetics. A user subprogram feature allows for any type of functional representation of any independent variable. A bandwidth and profile minimization option is also available in the code. Graphical representation of data generated by TACO is provided by a companion post-processor named POSTACO. The theory on which TACO is based is outlined, the capabilities of the code are explained, the input data required to perform an analysis with TACO are described. Some simple examples are provided to illustrate the use of the code.
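
    As a scale model of the class of problem TACO solves, implicit transient heat conduction by finite elements, the sketch below assembles 1D linear elements and steps a backward-Euler system. It is not TACO, and every material property, mesh size, and boundary value is illustrative.

    ```python
    import numpy as np

    n = 21                        # nodes on a unit-length rod
    h = 1.0 / (n - 1)             # element length
    k_c, rho_c = 1.0, 1.0         # conductivity, volumetric heat capacity
    dt, steps = 1e-3, 200

    K = np.zeros((n, n))          # stiffness matrix
    M = np.zeros((n, n))          # consistent mass matrix
    for e in range(n - 1):        # assemble 1D linear elements
        K[e:e+2, e:e+2] += (k_c / h) * np.array([[1, -1], [-1, 1]])
        M[e:e+2, e:e+2] += (rho_c * h / 6) * np.array([[2, 1], [1, 2]])

    A = M + dt * K                # backward Euler: (M + dt K) T_new = M T_old
    A[0, :] = 0.0; A[0, 0] = 1.0      # Dirichlet rows: T(0) = 1
    A[-1, :] = 0.0; A[-1, -1] = 1.0   #                 T(1) = 0

    T = np.zeros(n)
    T[0] = 1.0
    for _ in range(steps):
        b = M @ T
        b[0], b[-1] = 1.0, 0.0    # enforce boundary temperatures
        T = np.linalg.solve(A, b)

    print(T.round(3))             # relaxes toward the linear steady-state profile
    ```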

  18. MPEG-4 coding of ultrasound sequences

    NASA Astrophysics Data System (ADS)

    Lau, Christopher; Cabral, James E., Jr.; Rambhia, Avni H.; Kim, Yongmin

    2000-04-01

    MPEG-4 is a new standard for compressing and presenting many types of multimedia content, such as video, audio, and synthetic 2D and 3D graphics. New features include support for user interaction and flexible display of multiple video bitstreams. The basis of these new capabilities is object- based video coding, in which a video image is represented as a set of regions of interest, or video objects, that are coded independently. At the decoder, users decode, compose and manipulate video objects from one or more bitstreams in a single display. In this work, we examine the feasibility of using MPEG-4 for coding ultrasound sequences. In preliminary results, the compression performance of MPEG-4 was comparable to H.263 and a bit savings of at least 15 percent was possible when coding static objects as sprites. The flexible compositing capability of MPEG-4 was demonstrated by dividing an ultrasound machine's display into video objects and encoding each video object as a separate bitstream. Video objects from different bitstreams were decoded and composited on a single display using an MPEG-4 decoder to demonstrate side-by-side comparisons of ultrasound scans. Until now, these compositing capabilities were only available using proprietary PACS display systems. Using MPEG-4 to deliver ultrasound allows any MPEG-4-compliant decoder to perform these functions.

  19. An Eye-Tracking Study of How Color Coding Affects Multimedia Learning

    ERIC Educational Resources Information Center

    Ozcelik, Erol; Karakus, Turkan; Kursun, Engin; Cagiltay, Kursat

    2009-01-01

    Color coding has been proposed to promote more effective learning. However, insufficient evidence currently exists to show how color coding leads to better learning. The goal of this study was to investigate the underlying cause of the color coding effect by utilizing eye movement data. Fifty-two participants studied either a color-coded or…

  20. An experimental investigation of clocking effects on turbine aerodynamics using a modern 3-D one and one-half stage high pressure turbine for code verification and flow model development

    NASA Astrophysics Data System (ADS)

    Haldeman, Charles Waldo, IV

    2003-10-01

    This research uses a modern 1 and 1/2 stage high-pressure (HP) turbine operating at the proper design corrected speed, pressure ratio, and gas to metal temperature ratio to generate a detailed data set containing aerodynamic, heat-transfer and aero-performance information. The data was generated using the Ohio State University Gas Turbine Laboratory Turbine Test Facility (TTF), which is a short-duration shock tunnel facility. The research program utilizes an uncooled turbine stage for which all three airfoils are heavily instrumented at multiple spans and on the HPV and LPV endwalls and HPB platform and tips. Heat-flux and pressure data are obtained using the traditional shock-tube and blowdown facility operational modes. Detailed examination shows that the aerodynamic (pressure) data obtained in the blowdown mode is the same as obtained in the shock-tube mode when the corrected conditions are matched. Various experimental conditions and configurations were performed, including LPV clocking positions, off-design corrected speed conditions, pressure ratio changes, and Reynolds number changes. The main research for this dissertation is concentrated on the LPV clocking experiments, where the LPV was clocked relative to the HPV at several different passage locations and at different Reynolds numbers. Various methods were used to evaluate the effect of clocking on both the aeroperformance (efficiency) and aerodynamics (pressure loading) on the LPV, including time-resolved measurements, time-averaged measurements and stage performance measurements. A general improvement in overall efficiency of approximately 2% is demonstrated and could be observed using a variety of independent methods. Maximum efficiency is obtained when the time-average pressures are highest on the LPV, and the time-resolved data both in the time domain and frequency domain show the least amount of variation. The gain in aeroperformance is obtained by integrating over the entire airfoil as the three-dimensional effects on the LPV surface are significant.

  1. Preliminary Results for Coded Aperture Plasma Diagnostic

    NASA Astrophysics Data System (ADS)

    Haw, Magnus; Bellan, Paul

    2014-10-01

    A 1D coded aperture camera has been developed as a prototype for a high speed, wavelength-independent, plasma imaging diagnostic. Images are obtained via a coded or masked aperture that modulates incoming light to produce an invertible linear transform of the image on a detector. The system requires no lenses or mirrors and can be thought of as a multiplexed pinhole camera (with comparable resolution and greater signal than a single pinhole). The inexpensive custom-built system has a 13 × 1 cm field of view, a vertical spatial resolution of 2 mm, and a temporal resolution of 1 μs. Visible light images of the Caltech MHD-driven jet experiment agree with simultaneous images obtained with a conventional camera. For the simple jet geometry, the system can also extract depth information from single images. Further work will revolve around improving shielding and acquiring X-ray and EUV scintillators for imaging in those wavelengths. Supported by DOE, NSF.
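
    The "invertible linear transform" at the heart of the instrument can be demonstrated in a few lines: a 1D scene observed through cyclic shifts of a binary mask is recovered by solving the resulting linear system. Sizes and the random mask below are illustrative, not the camera's design.

    ```python
    import numpy as np

    n = 16
    rng = np.random.default_rng(0)
    scene = rng.random(n)                          # unknown 1D source distribution

    # choose a binary mask whose circulant system matrix is well conditioned
    while True:
        mask = rng.integers(0, 2, n)
        A = np.stack([np.roll(mask, i) for i in range(n)])
        if np.linalg.cond(A) < 1e6:
            break

    detector = A @ scene                           # each pixel sees a shifted mask
    recovered = np.linalg.solve(A, detector)       # decoding = inverting the transform
    print(np.allclose(recovered, scene))           # True in this noise-free sketch
    ```

    In practice coded-aperture systems use specially designed mask patterns rather than random ones, so that the inverse is well conditioned in the presence of noise.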

  2. Multiphysics Code Demonstrated for Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Melis, Matthew E.

    1998-01-01

    The utility of multidisciplinary analysis tools for aeropropulsion applications is being investigated at the NASA Lewis Research Center. The goal of this project is to apply Spectrum, a multiphysics code developed by Centric Engineering Systems, Inc., to simulate multidisciplinary effects in turbomachinery components. Many engineering problems today involve detailed computer analyses to predict the thermal, aerodynamic, and structural response of a mechanical system as it undergoes service loading. Analysis of aerospace structures generally requires attention in all three disciplinary areas to adequately predict component service behavior, and in many cases, the results from one discipline substantially affect the outcome of the other two. There are numerous computer codes currently available in the engineering community to perform such analyses in each of these disciplines. Many of these codes are developed and used in-house by a given organization, and many are commercially available. However, few, if any, of these codes are designed specifically for multidisciplinary analyses. The Spectrum code has been developed for performing fully coupled fluid, thermal, and structural analyses on a mechanical system with a single simulation that accounts for all simultaneous interactions, thus eliminating the requirement for running a large number of sequential, separate, disciplinary analyses. The Spectrum code has a true multiphysics analysis capability, which improves analysis efficiency as well as accuracy. Centric Engineering, Inc., working with a team of Lewis and AlliedSignal Engines engineers, has been evaluating Spectrum for a variety of propulsion applications including disk quenching, drum cavity flow, aeromechanical simulations, and a centrifugal compressor flow simulation.

  3. Quantum error-correcting codes over mixed alphabets

    NASA Astrophysics Data System (ADS)

    Wang, Zhuo; Yu, Sixia; Fan, Heng; Oh, C. H.

    2013-08-01

    We study the quantum error-correcting codes over mixed alphabets to deal with a more complicated and practical situation in which the physical systems for encoding may have different numbers of energy levels. In particular we investigate their constructions and propose the theory of the quantum Singleton bound. Two kinds of code constructions are presented: a projection-based construction for the general case and a graphical construction based on a graph-theoretical object composite coding clique dealing with the case of reducible alphabets. We find some optimal one-error correcting or detecting codes over two alphabets. Our method of composite coding clique also sheds light on constructing standard quantum error-correcting codes, and other families of optimal codes are found.
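
    For the equal-alphabet case the bound reduces to the familiar statement that an [[n, k, d]] code must satisfy n - k ≥ 2(d - 1); the paper's mixed-alphabet generalization is more involved. A trivial checker:

    ```python
    def satisfies_quantum_singleton(n, k, d):
        """Quantum Singleton bound for an [[n, k, d]] code over equal alphabets."""
        return n - k >= 2 * (d - 1)

    print(satisfies_quantum_singleton(5, 1, 3))   # True: the 5-qubit code saturates it
    print(satisfies_quantum_singleton(4, 1, 3))   # False: no such code can exist
    ```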

  4. The TESS (Tandem Experiment Simulation Studies) computer code user's manual

    SciTech Connect

    Procassini, R.J. . Dept. of Nuclear Engineering); Cohen, B.I. )

    1990-06-01

    TESS (Tandem Experiment Simulation Studies) is a one-dimensional, bounded particle-in-cell (PIC) simulation code designed to investigate the confinement and transport of plasma in a magnetic mirror device, including tandem mirror configurations. Mirror plasmas may be modeled in a system which includes an applied magnetic field and/or a self-consistent or applied electrostatic potential. The PIC code TESS is similar to the PIC code DIPSI (Direct Implicit Plasma Surface Interactions) which is designed to study plasma transport to and interaction with a solid surface. The codes TESS and DIPSI are direct descendants of the PIC code ES1 that was created by A. B. Langdon. This document provides the user with a brief description of the methods used in the code and a tutorial on the use of the code. 10 refs., 2 tabs.
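
    For readers unfamiliar with the ES1 lineage, the following is a minimal periodic, explicit electrostatic PIC sketch in that family. TESS itself is bounded and the physics here is in arbitrary normalized units, so this illustrates the method class only.

    ```python
    import numpy as np

    ng, npart, L, dt, steps = 64, 10000, 2 * np.pi, 0.1, 100
    rng = np.random.default_rng(1)
    x = rng.uniform(0, L, npart)              # electron positions
    v = rng.normal(0, 1, npart)               # thermal velocities
    q = -L / npart                            # electron charge; fixed ion background
    qm = -1.0                                 # charge-to-mass ratio
    dx = L / ng
    k = 2 * np.pi * np.fft.fftfreq(ng, d=dx)  # wavenumbers for the field solve

    for _ in range(steps):
        cells = (x / dx).astype(int) % ng
        # nearest-grid-point charge deposition plus neutralizing ion background
        rho = np.bincount(cells, minlength=ng) * q / dx + 1.0
        rho_k = np.fft.fft(rho)
        E_k = np.zeros_like(rho_k)
        E_k[1:] = rho_k[1:] / (1j * k[1:])    # Gauss's law: ik E = rho (normalized)
        E = np.fft.ifft(E_k).real
        v += qm * E[cells] * dt               # leapfrog velocity update
        x = (x + v * dt) % L                  # periodic position update
    ```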

  5. Experimental Measurement-Device-Independent Entanglement Detection

    NASA Astrophysics Data System (ADS)

    Nawareg, Mohamed; Muhammad, Sadiq; Amselem, Elias; Bourennane, Mohamed

    2015-02-01

    Entanglement is one of the most puzzling features of quantum theory and of great importance for the new field of quantum information. The determination whether a given state is entangled or not is one of the most challenging open problems of the field. Here we report on the experimental demonstration of measurement-device-independent (MDI) entanglement detection using the witness method for general two-qubit photon polarization systems. In the MDI settings, there is no requirement to assume perfect implementations or to trust the measurement devices. This experimental demonstration can be generalized for the investigation of properties of quantum systems and for the realization of cryptography and communication protocols.
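
    The conventional witness step that the MDI scheme builds on is simple to state numerically: for the witness W = I/2 - |Φ⁺⟩⟨Φ⁺|, Tr(Wρ) is negative only for states entangled in the Φ⁺ direction. A sketch with illustrative states:

    ```python
    import numpy as np

    phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # |Phi+>
    W = 0.5 * np.eye(4) - np.outer(phi, phi)            # witness operator

    rho_ent = np.outer(phi, phi)                        # maximally entangled state
    rho_sep = np.diag([0.0, 1.0, 0.0, 0.0])             # product state |01><01|

    print(np.trace(W @ rho_ent))   # -0.5: entanglement detected
    print(np.trace(W @ rho_sep))   # +0.5: no detection (>= 0 for all separable states)
    ```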

  6. Experimental measurement-device-independent entanglement detection.

    PubMed

    Nawareg, Mohamed; Muhammad, Sadiq; Amselem, Elias; Bourennane, Mohamed

    2015-01-01

    Entanglement is one of the most puzzling features of quantum theory and of great importance for the new field of quantum information. The determination whether a given state is entangled or not is one of the most challenging open problems of the field. Here we report on the experimental demonstration of measurement-device-independent (MDI) entanglement detection using the witness method for general two-qubit photon polarization systems. In the MDI settings, there is no requirement to assume perfect implementations or to trust the measurement devices. This experimental demonstration can be generalized for the investigation of properties of quantum systems and for the realization of cryptography and communication protocols. PMID:25649664

  7. Experimental Measurement-Device-Independent Entanglement Detection

    PubMed Central

    Nawareg, Mohamed; Muhammad, Sadiq; Amselem, Elias; Bourennane, Mohamed

    2015-01-01

    Entanglement is one of the most puzzling features of quantum theory and of great importance for the new field of quantum information. The determination whether a given state is entangled or not is one of the most challenging open problems of the field. Here we report on the experimental demonstration of measurement-device-independent (MDI) entanglement detection using the witness method for general two-qubit photon polarization systems. In the MDI settings, there is no requirement to assume perfect implementations or to trust the measurement devices. This experimental demonstration can be generalized for the investigation of properties of quantum systems and for the realization of cryptography and communication protocols. PMID:25649664

  8. Group independent component analysis of MR spectra

    PubMed Central

    Kalyanam, Ravi; Boutte, David; Gasparovic, Chuck; Hutchison, Kent E; Calhoun, Vince D

    2013-01-01

    This study investigates the potential of independent component analysis (ICA) to provide a data-driven approach for group level analysis of magnetic resonance (MR) spectra. ICA collectively analyzes data to identify maximally independent components, each of which captures covarying resonances, including those from different metabolic sources. A comparative evaluation of the ICA approach with the more established LCModel method in analyzing two different noise-free, artifact-free, simulated data sets of known compositions is presented. The results from such ideal simulations demonstrate the ability of data-driven ICA to decompose data and accurately extract components resembling modeled basis spectra from both data sets, whereas the LCModel results suffer when the underlying model deviates from assumptions, thus highlighting the sensitivity of model-based approaches to modeling inaccuracies. Analyses with simulated data show that independent component weights are good estimates of concentrations, even of metabolites with low intensity singlet peaks, such as scyllo-inositol. ICA is also applied to single voxel spectra from 193 subjects, without correcting for baseline variations, line-width broadening or noise. The results provide evidence that, despite the presence of confounding artifacts, ICA can be used to analyze in vivo spectra and extract resonances of interest. ICA is a promising technique for decomposing MR spectral data into components resembling metabolite resonances, and therefore has the potential to provide a data-driven alternative to the use of metabolite concentrations derived from curve-fitting individual spectra in making group comparisons. PMID:23785655
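
    The decomposition step can be sketched with synthetic data: two invented Gaussian "resonances", random per-subject mixing weights, and scikit-learn's FastICA (assumed available). ICA recovers components only up to sign, scale, and order, which is why the check below uses absolute correlations.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA   # assumed available

    rng = np.random.default_rng(5)
    ppm = np.linspace(0, 10, 500)
    src = np.stack([np.exp(-(ppm - 3.0) ** 2),            # two toy "resonances"
                    np.exp(-(ppm - 7.0) ** 2 / 0.5)])
    conc = rng.random((40, 2)) + 0.2                      # per-subject mixing weights
    spectra = conc @ src + 0.01 * rng.normal(size=(40, 500))

    ica = FastICA(n_components=2, random_state=0)
    comp = ica.fit_transform(spectra.T)   # (500, 2): component "spectra"
    weights = ica.mixing_                 # (40, 2): per-subject weights ~ concentrations

    for c in range(2):
        best = max(abs(np.corrcoef(weights[:, c], conc[:, s])[0, 1]) for s in range(2))
        print(round(best, 3))             # typically close to 1 in this toy setting
    ```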

  9. Progress in cultivation-independent phyllosphere microbiology

    PubMed Central

    Müller, Thomas; Ruppel, Silke

    2014-01-01

    Most microorganisms of the phyllosphere are nonculturable in commonly used media and culture conditions, as are those in other natural environments. This review queries the reasons for their ‘noncultivability’ and assesses developments in phyllospere microbiology that have been achieved cultivation independently over the last 4 years. Analyses of total microbial communities have revealed a comprehensive microbial diversity. 16S rRNA gene amplicon sequencing and metagenomic sequencing were applied to investigate plant species, location and season as variables affecting the composition of these communities. In continuation to culture-based enzymatic and metabolic studies with individual isolates, metaproteogenomic approaches reveal a great potential to study the physiology of microbial communities in situ. Culture-independent microbiological technologies as well advances in plant genetics and biochemistry provide methodological preconditions for exploring the interactions between plants and their microbiome in the phyllosphere. Improving and combining cultivation and culture-independent techniques can contribute to a better understanding of the phyllosphere ecology. This is essential, for example, to avoid human–pathogenic bacteria in plant food. PMID:24003903

  10. Finite Element Analysis Code

    Energy Science and Technology Software Center (ESTSC)

    2006-03-08

    MAPVAR-KD is designed to transfer solution results from one finite element mesh to another. MAPVAR-KD draws heavily from the structure and coding of MERLIN II, but it employs a new finite element data base, EXODUS II, and offers enhanced speed and new capabilities not available in MERLIN II. In keeping with the MERLIN II documentation, the computational algorithms used in MAPVAR-KD are described. User instructions are presented. Example problems are included to demonstrate the operation of the code and the effects of various input options. MAPVAR-KD is a modification of MAPVAR in which the search algorithm was replaced by a kd-tree-based search for better performance on large problems.

  11. Wire Transport Code

    SciTech Connect

    Caporaso, G.J.; Cole, A.G.

    1983-03-01

    The Wire Transport Code was developed to study the dynamics of relativistic-electron-beam propagation in the transport tube in which a wire-conditioning zone is present. In order for the beam to propagate successfully in the transport section it must be matched onto the wire by focusing elements. The beam must then be controlled by strong lenses as it exits the wire zone. The wire transport code was developed to model this process in substantial detail. It is able to treat axially symmetric problems as well as those in which the beam is transversely displaced from the axis of the transport tube. The focusing effects of foils and various beamline lenses are included in the calculations.

  12. Confocal coded aperture imaging

    DOEpatents

    Tobin, Jr., Kenneth William; Thomas, Jr., Clarence E.

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.

  13. On quantum network coding

    NASA Astrophysics Data System (ADS)

    Jain, Avinash; Franceschetti, Massimo; Meyer, David A.

    2011-03-01

    We study the problem of error-free multiple unicast over directed acyclic networks in a quantum setting. We provide a new information-theoretic proof of the known result that network coding does not achieve a larger quantum information flow than what can be achieved by routing for two-pair communication on the butterfly network. We then consider a k-pair multiple unicast problem and for all k ≥ 2 we show that there exists a family of networks where quantum network coding achieves k-times larger quantum information flow than what can be achieved by routing. Finally, we specify a graph-theoretic sufficient condition for the quantum information flow of any multiple unicast problem to be bounded by the capacity of any sparsest multicut of the network.
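
    The classical result this contrasts with is the butterfly XOR trick, where coding at the bottleneck edge serves both unicast pairs at once:

    ```python
    # Classical butterfly network coding: one shared edge, two unicasts.
    def butterfly(b1, b2):
        m = b1 ^ b2        # the single bottleneck edge carries the XOR
        out1 = m ^ b1      # sink 1 sees b1 and m, recovers b2
        out2 = m ^ b2      # sink 2 sees b2 and m, recovers b1
        return out1, out2

    assert butterfly(1, 0) == (0, 1)
    assert butterfly(1, 1) == (1, 1)
    ```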

  14. Reading a neural code.

    PubMed

    Bialek, W; Rieke, F; de Ruyter van Steveninck, R R; Warland, D

    1991-06-28

    Traditional approaches to neural coding characterize the encoding of known stimuli in average neural responses. Organisms face nearly the opposite task--extracting information about an unknown time-dependent stimulus from short segments of a spike train. Here the neural code was characterized from the point of view of the organism, culminating in algorithms for real-time stimulus estimation based on a single example of the spike train. These methods were applied to an identified movement-sensitive neuron in the fly visual system. Such decoding experiments determined the effective noise level and fault tolerance of neural computation, and the structure of the decoding algorithms suggested a simple model for real-time analog signal processing with spiking neurons. PMID:2063199
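
    The decoding approach described, estimating the stimulus by placing a kernel at each spike time, can be sketched end to end with a toy encoder and a least-squares kernel fit. All signals below are synthetic and the encoder is invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    T, dt = 4000, 1.0                                   # time bins, arbitrary units
    s = np.convolve(rng.normal(size=T), np.ones(20) / 20, mode="same")  # slow stimulus

    rate = np.clip(s, 0.0, None)                        # toy rectified rate encoder
    spikes = (rng.random(T) < 0.5 * rate * dt).astype(float)

    # linear decoder: regress the stimulus on spike indicators at nearby lags
    lags = np.arange(-20, 21)
    X = np.stack([np.roll(spikes, l) for l in lags], axis=1)
    h, *_ = np.linalg.lstsq(X, s, rcond=None)           # least-squares kernel
    s_hat = X @ h                                       # kernel placed at each spike

    print(np.corrcoef(s, s_hat)[0, 1])                  # stimulus/estimate correlation
    ```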

  15. Status of MARS Code

    SciTech Connect

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models both in strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, histogramming, the MAD-MARS Beam Line Builder, and the graphical user interface.

  16. Reeds computer code

    NASA Technical Reports Server (NTRS)

    Bjork, C.

    1981-01-01

    The REEDS (rocket exhaust effluent diffusion single layer) computer code is used for the estimation of certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near real time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.

  17. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

    This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. This report will consider the following codes: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code will be described separately in the following section with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes. However, the codes have been further developed to extend their capabilities.

  18. MELCOR computer code manuals

    SciTech Connect

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  19. Orthopedics coding and funding.

    PubMed

    Baron, S; Duclos, C; Thoreux, P

    2014-02-01

    The French tarification à l'activité (T2A) prospective payment system is a financial system in which a health-care institution's resources are based on performed activity. Activity is described via the PMSI medical information system (programme de médicalisation du système d'information). The PMSI classifies hospital cases by clinical and economic categories known as diagnosis-related groups (DRG), each with an associated price tag. Coding a hospital case involves giving as realistic a description as possible so as to categorize it in the right DRG and thus ensure appropriate payment. For this, it is essential to understand what determines the pricing of inpatient stay: namely, the code for the surgical procedure, the patient's principal diagnosis (reason for admission), codes for comorbidities (everything that adds to management burden), and the management of the length of inpatient stay. The PMSI is used to analyze the institution's activity and dynamism: change on previous year, relation to target, and comparison with competing institutions based on indicators such as the mean length of stay performance indicator (MLS PI). The T2A system improves overall care efficiency. Quality of care, however, is not presently taken account of in the payment made to the institution, as there are no indicators for this; work needs to be done on this topic. PMID:24461230

  20. Bar coded retroreflective target

    DOEpatents

    Vann, Charles S.

    2000-01-01

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and in up to six degrees of position. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam are measured to calculate the three-dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of target position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position and senses when a retroreflective target enters the fixed field of view. An optically bar-coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.
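
    For context, recovering all six degrees of target position from the three measured retroreflector locations amounts to a rigid-body fit. The sketch below uses the standard Kabsch/SVD alignment; the reference geometry and the test pose are invented for illustration and are not taken from the patent.

        import numpy as np

        # Recover rotation R and translation t with measured ~= R @ ref + t
        # from three retroreflector locations, via the Kabsch/SVD fit.
        ref = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])

        def pose_from_points(measured, reference=ref):
            mc, rc = measured.mean(axis=0), reference.mean(axis=0)
            H = (reference - rc).T @ (measured - mc)
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
            R = Vt.T @ D @ U.T           # proper rotation (det = +1)
            return R, mc - R @ rc

        th = np.radians(20.0)            # synthetic check with a known pose
        R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                           [np.sin(th),  np.cos(th), 0.0],
                           [0.0, 0.0, 1.0]])
        measured = ref @ R_true.T + np.array([1.0, 2.0, 0.5])
        R, t = pose_from_points(measured)
        print(np.allclose(R, R_true), np.round(t, 3))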

  1. Seismic analysis of piping systems subjected to independent-support excitation

    SciTech Connect

    Subudhi, M.; Bezler, P.

    1983-01-01

    This paper presents a comparison of the dynamic responses of piping systems subjected to independent-support excitation using the response spectrum and time-history methods. The BNL finite-element computer code PSAFE2 has been used to perform all the analyses. The time-history method combines both the inertia and the static effects on the piping responses due to independent-support excitations at each time point, thus representing the actual responses. A sample problem subjected to two independent support excitations is analyzed, and the results are compared with those of the response spectrum methods with uniform or independent-support motion.
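
    As a minimal illustration of what the time-history method computes step by step, the sketch below integrates a single damped oscillator driven through its support and records the peak relative response. The frequency, damping, and excitation are arbitrary assumptions; the PSAFE2 analyses described above are, of course, multi-support and multi-degree-of-freedom.

        import numpy as np

        # One oscillator excited through its support; step-by-step
        # integration of the relative-motion equation of motion
        # u'' + 2*zeta*w*u' + w**2*u = -a_ground(t).
        f_n, zeta, dt = 5.0, 0.05, 0.002   # natural frequency (Hz), damping, step (s)
        w = 2.0 * np.pi * f_n
        t = np.arange(0.0, 10.0, dt)
        a_ground = 0.3 * 9.81 * np.sin(2.0 * np.pi * 4.0 * t)  # support acceleration

        u = v = 0.0
        peak = 0.0
        for ag in a_ground:                # semi-implicit Euler update
            a = -ag - 2.0 * zeta * w * v - w**2 * u
            v += a * dt
            u += v * dt
            peak = max(peak, abs(u))
        print(f"peak relative displacement: {peak:.4e} m")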

  2. Steps to Independent Living Series.

    ERIC Educational Resources Information Center

    Lobb, Nancy

    This set of six activity books and a teacher's guide is designed to help students from eighth grade to adulthood with special needs to learn independent living skills. The activity books have a reading level of 2.5 and address: (1) "How to Get Well When You're Sick or Hurt," including how to take a temperature, see a doctor, and use medicines…

  4. Neural Coding for Effective Rehabilitation

    PubMed Central

    2014-01-01

    Successful neurological rehabilitation depends on accurate diagnosis, effective treatment, and quantitative evaluation. Neural coding, a technology for interpretation of functional and structural information of the nervous system, has contributed to the advancements in neuroimaging, brain-machine interface (BMI), and design of training devices for rehabilitation purposes. In this review, we summarized the latest breakthroughs in neuroimaging from microscale to macroscale levels with potential diagnostic applications for rehabilitation. We also reviewed the achievements in electrocorticography (ECoG) coding with both animal models and human beings for BMI design, electromyography (EMG) interpretation for interaction with external robotic systems, and robot-assisted quantitative evaluation of the progress of rehabilitation programs. Future rehabilitation will be more home-based, automated, and self-administered by patients. Further investigations and breakthroughs are mainly needed to improve computational efficiency in neuroimaging and multichannel ECoG through selection of localized neuroinformatics, to validate the effectiveness of BMI-guided rehabilitation programs, and to simplify system operation in training devices. PMID:25258708

  5. Development of the Code RITRACKS

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Cucinotta, Francis A.

    2013-01-01

    A document discusses the code RITRACKS (Relativistic Ion Tracks), which was developed to simulate heavy ion track structure at the microscopic and nanoscopic scales. It is a Monte Carlo code that simulates the production of radiolytic species in water event by event; it may be used to simulate tracks and to calculate the dose deposited by the radiation in targets and voxels (nanovolumes) of different sizes. RITRACKS allows simulation of radiation tracks without extensive knowledge of computer programming or Monte Carlo simulation. It is installed as a regular application on Windows systems. The main input parameters entered by the user are the type and energy of the ion, the length and size of the irradiated volume, the number of ions impacting the volume, and the number of histories. The simulation can be started after the input parameters are entered in the GUI. The number of each kind of interaction for each track is shown in the results details window. The tracks can be visualized in 3D after the simulation is complete. It is also possible to see the time evolution of the tracks and to zoom in on specific parts of the tracks. The software RITRACKS can be very useful for radiation scientists investigating various problems in the fields of radiation physics, radiation chemistry, and radiation biology. For example, it can be used to simulate electron ejection experiments (radiation physics).

  6. A Comparative Study on Seismic Analysis of Bangladesh National Building Code (BNBC) with Other Building Codes

    NASA Astrophysics Data System (ADS)

    Bari, Md. S.; Das, T.

    2013-09-01

    The tectonic framework of Bangladesh and adjoining areas indicates that Bangladesh lies well within an active seismic zone. The aftereffects of an earthquake are more severe in an underdeveloped and densely populated country like Bangladesh than in developed countries. The Bangladesh National Building Code (BNBC) was first established in 1993 to provide guidelines for the design and construction of new structures subject to earthquake ground motions, in order to minimize the risk to life for all structures. A revision of BNBC 1993 is under way to bring it up to date with other international building codes. This paper compares the various seismic analysis provisions given in the building codes of different countries; the comparison gives an idea of where Bangladesh stands with respect to safety against earthquakes. Primarily, various seismic parameters in BNBC 2010 (draft) have been studied and compared with those of BNBC 1993. Both the 1993 and 2010 editions of BNBC have then been compared graphically with the building codes of other countries, namely the National Building Code of India 2005 (NBC-India 2005) and the American Society of Civil Engineers standard ASCE 7-05. The base shear/weight ratios have been plotted against the height of the building. The investigation reveals that BNBC 1993 has the least base shear among all the codes. Factored base shear values in BNBC 2010 are found to be significantly higher than those in BNBC 1993 for low-rise buildings (≤20 m). Despite the revision, BNBC 2010 (draft) still suggests lower base shear values than the Indian and American codes. Nevertheless, the increased factor of safety against earthquakes that the proposed BNBC 2010 code provides through higher base shear values is appreciable.
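
    The base shear/weight ratios compared in such studies come from equivalent static force provisions of the general form V = (Z·I·C/R)·W. The sketch below evaluates one such ratio over building height using a generic UBC-style coefficient; the formulas and every parameter value are illustrative assumptions, not the actual BNBC, NBC-India, or ASCE 7-05 provisions.

        # Generic equivalent-static base shear ratio V/W versus height.
        # All formulas and values are illustrative assumptions only.

        def base_shear_ratio(Z, I, R, S, T):
            """V/W = Z*I*C/R with C = 1.25*S/T**(2/3), capped at 2.75."""
            C = min(1.25 * S / T ** (2.0 / 3.0), 2.75)
            return Z * I * C / R

        for h in (10, 20, 40, 80):         # building heights in metres
            T = 0.0466 * h ** 0.9          # assumed empirical period (RC frame)
            print(f"{h:3d} m  V/W = {base_shear_ratio(0.15, 1.0, 5.0, 1.2, T):.3f}")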

  7. The APS SASE FEL : modeling and code comparison.

    SciTech Connect

    Biedron, S. G.

    1999-04-20

    A self-amplified spontaneous emission (SASE) free-electron laser (FEL) is under construction at the Advanced Photon Source (APS). Five FEL simulation codes were used in the design phase: GENESIS, GINGER, MEDUSA, RON, and TDA3D. Initial comparisons between each of these independent formulations show good agreement for the parameters of the APS SASE FEL.

  8. Independent evolution of four heme peroxidase superfamilies

    PubMed Central

    Zámocký, Marcel; Hofbauer, Stefan; Schaffner, Irene; Gasselhuber, Bernhard; Nicolussi, Andrea; Soudi, Monika; Pirker, Katharina F.; Furtmüller, Paul G.; Obinger, Christian

    2015-01-01

    Four heme peroxidase superfamilies (peroxidase–catalase, peroxidase–cyclooxygenase, peroxidase–chlorite dismutase and peroxidase–peroxygenase superfamily) arose independently during evolution, which differ in overall fold, active site architecture and enzymatic activities. The redox cofactor is heme b or posttranslationally modified heme that is ligated by either histidine or cysteine. Heme peroxidases are found in all kingdoms of life and typically catalyze the one- and two-electron oxidation of a myriad of organic and inorganic substrates. In addition to this peroxidatic activity distinct (sub)families show pronounced catalase, cyclooxygenase, chlorite dismutase or peroxygenase activities. Here we describe the phylogeny of these four superfamilies and present the most important sequence signatures and active site architectures. The classification of families is described as well as important turning points in evolution. We show that at least three heme peroxidase superfamilies have ancient prokaryotic roots with several alternative ways of divergent evolution. In later evolutionary steps, they almost always produced highly evolved and specialized clades of peroxidases in eukaryotic kingdoms with a significant portion of such genes involved in coding various fusion proteins with novel physiological functions. PMID:25575902

  9. Independent evolution of four heme peroxidase superfamilies.

    PubMed

    Zámocký, Marcel; Hofbauer, Stefan; Schaffner, Irene; Gasselhuber, Bernhard; Nicolussi, Andrea; Soudi, Monika; Pirker, Katharina F; Furtmüller, Paul G; Obinger, Christian

    2015-05-15

    Four heme peroxidase superfamilies (peroxidase-catalase, peroxidase-cyclooxygenase, peroxidase-chlorite dismutase and peroxidase-peroxygenase superfamily) arose independently during evolution, which differ in overall fold, active site architecture and enzymatic activities. The redox cofactor is heme b or posttranslationally modified heme that is ligated by either histidine or cysteine. Heme peroxidases are found in all kingdoms of life and typically catalyze the one- and two-electron oxidation of a myriad of organic and inorganic substrates. In addition to this peroxidatic activity distinct (sub)families show pronounced catalase, cyclooxygenase, chlorite dismutase or peroxygenase activities. Here we describe the phylogeny of these four superfamilies and present the most important sequence signatures and active site architectures. The classification of families is described as well as important turning points in evolution. We show that at least three heme peroxidase superfamilies have ancient prokaryotic roots with several alternative ways of divergent evolution. In later evolutionary steps, they almost always produced highly evolved and specialized clades of peroxidases in eukaryotic kingdoms with a significant portion of such genes involved in coding various fusion proteins with novel physiological functions. PMID:25575902

  10. Associations between children’s independent mobility and physical activity

    PubMed Central

    2014-01-01

    Background Independent mobility describes the freedom of children to travel and play in public spaces without adult supervision. The potential benefits for children are significant, such as social interactions with peers, spatial and traffic safety skills, and increased physical activity. Yet the health benefits of independent mobility, particularly on physical activity accumulation, are largely unexplored. This study aimed to investigate associations of children's independent mobility with light, moderate-to-vigorous, and total physical activity accumulation. Methods In 2011-2012, 375 Australian children aged 8-13 years (62% girls) were recruited into a cross-sectional study. Children's independent mobility (i.e. independent travel to school and non-school destinations, independent outdoor play) and socio-demographics were assessed through child and parent surveys. Physical activity intensity was measured objectively through an Actiheart monitor worn on four consecutive days. Associations between independent mobility and physical activity variables were analysed using generalized linear models, accounting for clustered sampling, Actiheart wear time, and socio-demographics, and assessing interactions by sex. Results Independent travel (walking, cycling, public transport) to school and non-school destinations was not associated with light, moderate-to-vigorous, or total physical activity. However, sub-analyses revealed a positive association between independent walking and cycling (excluding public transport) to school and total physical activity, but only in boys (b = 36.03). Further, independent outdoor play (three or more days per week) was positively associated with light and total physical activity (b = 29.76); no significant association was found between independent outdoor play and moderate-to-vigorous physical activity. When assessing differences by sex, the observed significant associations of independent outdoor play with light and total physical activity remained in girls but not in boys. All other associations showed no significant differences by sex. Conclusions Independent outdoor play may boost children's daily physical activity levels, predominantly at light intensity. Hence, facilitating independent outdoor play could be a viable intervention strategy to enhance physical activity in children, particularly in girls. Associations between independent travel and physical activity are inconsistent overall and require further investigation. PMID:24476363
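
    The analysis described (generalized linear models with cluster-robust inference, adjusted for wear time and socio-demographics) can be sketched as follows on invented data; the variable names, effect sizes, and clustering variable are all assumptions for illustration.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Invented data mimicking the design: children clustered in schools,
        # light physical activity regressed on independent outdoor play with
        # cluster-robust standard errors.
        rng = np.random.default_rng(2)
        n = 200
        df = pd.DataFrame({
            "school": rng.integers(0, 10, n),          # sampling cluster
            "indep_play": rng.integers(0, 2, n),       # outdoor play 3+ days/week
            "sex": rng.integers(0, 2, n),
            "wear_time": rng.normal(600.0, 50.0, n),   # monitor wear (min/day)
        })
        df["light_pa"] = 180.0 + 30.0 * df["indep_play"] + rng.normal(0.0, 40.0, n)

        fit = smf.ols("light_pa ~ indep_play + sex + wear_time", data=df).fit(
            cov_type="cluster", cov_kwds={"groups": df["school"]})
        print(fit.params["indep_play"], fit.pvalues["indep_play"])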

  11. 33 CFR 159.93 - Independent supporting.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) POLLUTION MARINE SANITATION DEVICES Design, Construction, and Testing § 159.93 Independent supporting. The device must have provisions for supporting that are independent from connecting pipes....

  12. New quantum MDS-convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Li, Fengwei; Yue, Qin

    2015-12-01

    In this paper, we utilize a family of Hermitian dual-containing constacyclic codes to construct classical and quantum MDS convolutional codes. Our classical and quantum convolutional codes are optimal in the sense that they attain the classical (quantum) generalized Singleton bound.
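
    For reference, the generalized Singleton bound that such optimal codes attain can be written, for an (n, k, δ) convolutional code, in the Rosenthal-Smarandache form below (stated here for context; the quantum analogue in the paper is defined correspondingly):

        d_{\mathrm{free}} \le (n - k)\left(\left\lfloor \tfrac{\delta}{k} \right\rfloor + 1\right) + \delta + 1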

  13. On decoding of multi-level MPSK modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Gupta, Alok Kumar

    1990-01-01

    The decoding problem of multi-level block modulation codes is investigated. The hardware design of a soft-decision Viterbi decoder for some short-length 8-PSK block modulation codes is presented. An effective way to reduce the hardware complexity of the decoder by reducing the branch and path metrics, using a non-uniform floating-point to integer mapping scheme, is proposed and discussed. The simulation results of the design are presented. The multi-stage decoding (MSD) of multi-level modulation codes is also investigated. The cases of soft-decision and hard-decision MSD are considered and their performance is evaluated for several codes of different lengths and different minimum squared Euclidean distances. It is shown that soft-decision MSD reduces the decoding complexity drastically while being suboptimum. Hard-decision MSD further simplifies the decoding while still maintaining a reasonable coding gain over the uncoded system, if the component codes are chosen properly. Finally, some basic 3-level 8-PSK modulation codes using BCH codes as component codes are constructed and their coding gains are found for hard-decision multistage decoding.
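
    The metric-reduction idea, i.e. mapping floating-point branch metrics to a few integers non-uniformly so that the Viterbi decoder can use narrow fixed-point arithmetic, can be sketched as follows. The square-root companding curve and the 8-PSK setup are illustrative assumptions, not the mapping actually proposed in the report.

        import numpy as np

        # Map floating-point branch metrics to a few integers, spending more
        # quantizer levels on small (more likely) metrics via square-root
        # companding.

        def quantize_metrics(metrics, levels=8):
            m = np.asarray(metrics, dtype=float)
            m = (m - m.min()) / (np.ptp(m) + 1e-12)    # normalize to [0, 1]
            return np.round((levels - 1) * np.sqrt(m)).astype(int)

        # Squared Euclidean branch metrics from one received sample to the
        # eight 8-PSK constellation points:
        phases = 2.0 * np.pi * np.arange(8) / 8.0
        received = 1.05 * np.exp(1j * 0.4)             # noisy received symbol
        branch = np.abs(received - np.exp(1j * phases)) ** 2
        print(quantize_metrics(branch))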

  14. Sibling and Independent Compound Chondrules

    NASA Astrophysics Data System (ADS)

    Wasson, J. T.; Krot, A. N.; Rubin, A. E.

    1993-07-01

    We studied compound chondrules in 79 cm² of ordinary chondrite (OC) thin sections. Compound chondrules consist of a primary that solidified first and one or more secondaries attached to the primary. Sibling compound chondrules have very similar textures and compositions; most, perhaps all, seem to consist of chondrules melted in the same heating event. About 1.4% of all chondrules are the primaries of sibling compound chondrules. A smaller fraction, 1.0%, of all chondrules are the primaries of independent chondrules, the members of which were melted in separate heating events. Independent chondrules show appreciable differences in texture and/or composition. We propose that sibling chondrules originated when numerous chondrules were created from one large, more-or-less homogeneous, precursor assemblage that was flash-melted to produce a large set (perhaps 100-1000) of chondrules; some of these collided while molten, probably within several centimeters of the production site. We envision that small radial velocities were imparted to the members of the set, with small differences in velocity causing collisions among those few in intersecting trajectories. If all chondrules were produced this way, the collision efficiency was 1.4%; if only 10% were produced in this fashion, the efficiency rises to 14%. The original Gooding-Keil model of independent compound chondrule formation calls for random collisions to occur while the secondaries were molten. This appears improbable because the mean period between collisions in the dusty midplane of the nebula is estimated to be hours (or days), orders of magnitude longer than the period during which chondrules could have retained low viscosities following a flash-heating event in a cool (<700 K) nebula. We suggest that most independent compound chondrules formed by the mechanism that accounts for chondrules with relict grains and for chondrules with coarse-grained rims: the primary chondrule was embedded in a porous dust assemblage at the time of the second heating event; it experienced minimal melting because melting efficiency increases with increasing surface/volume ratio. There is a minor tendency for the FeO/(FeO+MgO) ratio in independent secondaries to be higher than in primaries, as expected if this ratio increased with time in the nebular dust. However, Monte Carlo calculations confirm that the compositions of independent secondaries are not randomly distributed, but related to those of primaries. Some exchange probably occurred during the fusion of the two chondrules, but this mechanism seems unable to account for the general similarity of independent primary/secondary compositions. This suggests that, in the environment where, at any one time, chondrules were forming (perhaps the interface between the gaseous nebula and the dusty midplane), the dust composition was more uniform than it was in the central midplane at a later time when agglomeration occurred.

  15. Qudit color codes and gauge color codes in all spatial dimensions

    NASA Astrophysics Data System (ADS)

    Watson, Fern H. E.; Campbell, Earl T.; Anwar, Hussain; Browne, Dan E.

    2015-08-01

    Two-level quantum systems, qubits, are not the only basis for quantum computation. Advantages exist in using qudits, d -level quantum systems, as the basic carrier of quantum information. We show that color codes, a class of topological quantum codes with remarkable transversality properties, can be generalized to the qudit paradigm. In recent developments it was found that in three spatial dimensions a qubit color code can support a transversal non-Clifford gate and that in higher spatial dimensions additional non-Clifford gates can be found, saturating Bravyi and König's bound [S. Bravyi and R. König, Phys. Rev. Lett. 111, 170502 (2013), 10.1103/PhysRevLett.111.170502]. Furthermore, by using gauge fixing techniques, an effective set of Clifford gates can be achieved, removing the need for state distillation. We show that the qudit color code can support the qudit analogs of these gates and also show that in higher spatial dimensions a color code can support a phase gate from higher levels of the Clifford hierarchy that can be proven to saturate Bravyi and König's bound in all but a finite number of special cases. The methodology used is a generalization of Bravyi and Haah's method of triorthogonal matrices [S. Bravyi and J. Haah, Phys. Rev. A 86, 052329 (2012), 10.1103/PhysRevA.86.052329], which may be of independent interest. For completeness, we show explicitly that the qudit color codes generalize to gauge color codes and share many of the favorable properties of their qubit counterparts.

  16. Method of optical image coding by time integration

    NASA Astrophysics Data System (ADS)

    Evtikhiev, Nikolay N.; Starikov, Sergey N.; Cheryomkhin, Pavel A.; Krasnov, Vitaly V.; Rodin, Vladislav G.

    2012-06-01

    A method of optical image coding by time integration is proposed. Coding in the proposed method is accomplished by shifting the object image over the photosensor area of a digital camera during registration, which results in an optically calculated convolution of the original image with the shift trajectory. As opposed to optical coding methods based on diffractive optical elements, the described coding method is feasible in totally incoherent light. The method was first tested using an LC monitor for image display and shifting: the object image is shifted by displaying a video whose frames show the image to be encoded at different locations on the monitor screen while the camera registers it. Optical encoding and numerical decoding of test images were performed successfully. A more practical experimental implementation of the method, using an LCOS SLM (Holoeye PLUTO VIS), was also realized. The object images to be encoded were formed in monochromatic spatially incoherent light. Shifting of the object image over the camera photosensor area was accomplished by displaying a video consisting of frames with blazed gratings on the LCOS SLM; each blazed grating deflects the light reflected from the SLM at a different angle. Results of optical image coding and of numerical restoration of the encoded images are presented, and the experimental results are compared with results of numerical modeling. Optical image coding with time integration could be used for quality estimation of optical image coding with diffractive optical elements, or as an independent optical coding method that can be implemented in incoherent light.
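
    Numerically, the registered frame is the image convolved with the shift trajectory, and decoding reduces to inverse filtering. A minimal sketch, with an arbitrary diagonal trajectory and a Wiener-style regularized inverse, is given below; the image size, trajectory, and regularization constant are assumptions.

        import numpy as np

        # The camera integrates shifted copies of the image, so the encoded
        # frame equals the image convolved (circularly here) with the shift
        # trajectory; decoding is a regularized inverse filter.
        rng = np.random.default_rng(1)
        img = rng.random((64, 64))

        traj = np.zeros((64, 64))
        for i in range(12):                # a short diagonal trajectory
            traj[i, (2 * i) % 64] = 1.0

        IMG, TRAJ = np.fft.fft2(img), np.fft.fft2(traj)
        encoded = np.real(np.fft.ifft2(IMG * TRAJ))     # time-integrated frame

        WIENER = np.conj(TRAJ) / (np.abs(TRAJ) ** 2 + 1e-3)
        restored = np.real(np.fft.ifft2(np.fft.fft2(encoded) * WIENER))
        print("max reconstruction error:", float(np.abs(restored - img).max()))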

  17. ENSDF ANALYSIS AND UTILITY CODES.

    SciTech Connect

    BURROWS, T.

    2005-04-04

    The ENSDF analysis and checking codes are briefly described, along with their uses with various types of ENSDF datasets. For more information on the programs, see the "Read Me" entries and other documentation associated with each code.

  18. On lossless coding for HEVC

    NASA Astrophysics Data System (ADS)

    Gao, Wen; Jiang, Minqiang; Yu, Haoping

    2013-02-01

    In this paper, we first review the lossless coding mode in version 1 of the HEVC standard, which was recently finalized. We then provide a performance comparison between the lossless coding modes in the HEVC and MPEG-AVC/H.264 standards and show that HEVC lossless coding has limited coding efficiency. To improve its performance, several new coding tools that were contributed to JCT-VC but not adopted in version 1 of the HEVC standard are introduced. In particular, we discuss sample-based intra prediction and coding of residual coefficients in more detail. At the end, we briefly address a new class of coding tools, i.e., a dictionary-based coder, that is efficient in encoding screen content including graphics and text.
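
    Sample-based intra prediction, mentioned above, predicts each sample from an already reconstructed neighbour and transmits only the residual, which is trivially invertible and hence lossless. The sketch below shows the idea with a horizontal (left-neighbour) predictor on one row; it illustrates the concept only and is not actual HEVC syntax.

        import numpy as np

        # Horizontal sample-based prediction: each sample is predicted from
        # its left neighbour and only the residual is coded.

        def dpcm_encode(row):
            res = row.copy()
            res[1:] = row[1:] - row[:-1]
            return res

        def dpcm_decode(res):
            return np.cumsum(res)

        row = np.array([100, 101, 103, 103, 90, 91], dtype=np.int64)
        res = dpcm_encode(row)
        assert np.array_equal(dpcm_decode(res), row)    # exactly invertible
        print("residuals:", res)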

  19. Noiseless coding for the magnetometer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Lee, Jun-Ji

    1987-01-01

    Future unmanned space missions will continue to seek a full understanding of magnetic fields throughout the solar system. Severely constrained data rates during certain portions of these missions could limit the possible science return. This publication investigates the application of universal noiseless coding techniques to more efficiently represent magnetometer data without any loss in data integrity. Performance results indicated that compression factors of 2:1 to 6:1 can be expected. Feasibility for general deep space application was demonstrated by implementing a microprocessor breadboard coder/decoder using the Intel 8086 processor. The Comet Rendezvous Asteroid Flyby mission will incorporate these techniques in a buffer feedback, rate-controlled configuration. The characteristics of this system are discussed.
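
    Rice codes are the standard example of the universal noiseless coding techniques investigated here: a residual is split into a unary-coded quotient and a k-bit binary remainder. The sketch below is a textbook illustration; the zigzag mapping and the choice k = 2 are generic assumptions, not mission parameters.

        # Rice coding sketch: zigzag-map signed residuals to non-negative
        # integers, then code each as a unary quotient plus k binary bits.

        def zigzag(n):
            return 2 * n if n >= 0 else -2 * n - 1

        def rice_encode(value, k):
            q, r = value >> k, value & ((1 << k) - 1)
            return "1" * q + "0" + format(r, f"0{k}b")

        samples = [0, -1, 2, 3, -4, 1]     # e.g. first differences of readings
        bits = "".join(rice_encode(zigzag(s), k=2) for s in samples)
        print(bits, f"({len(bits)} bits for {len(samples)} samples)")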

  20. Rayleigh scatter in kilovoltage x-ray imaging: is the independent atom approximation good enough?

    PubMed

    Poludniowski, G; Evans, P M; Webb, S

    2009-11-21

    Monte Carlo simulation is the gold standard method for modelling scattering processes in medical x-ray imaging. General-purpose Monte Carlo codes, however, typically use the independent atom approximation (IAA). This is known to be inaccurate for Rayleigh scattering, for many materials, in the forward direction. This work addresses whether the IAA is sufficient for the typical modelling tasks in medical kilovoltage x-ray imaging. As a means of comparison, we incorporate a more realistic 'interference function' model into a custom-written Monte Carlo code. First, we conduct simulations of scatter from isolated voxels of soft tissue, adipose, cortical bone and spongiosa. Then, we simulate scatter profiles from a cylinder of water and from phantoms of a patient's head, thorax and pelvis, constructed from diagnostic-quality CT data sets. Lastly, we reconstruct CT numbers from simulated sets of projection images and investigate the quantitative effects of the approximation. We show that the IAA can produce errors of several per cent of the total scatter, across a projection image, for typical x-ray beams and patients. The errors in reconstructed CT number, however, for the phantoms simulated, were small (typically < 10 HU). The IAA can therefore be considered sufficient for the modelling of scatter correction in CT imaging. Where accurate quantitative estimates of scatter in individual projection images are required, however, the appropriate interference functions should be included. PMID:19887715
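
    For context, under the independent atom approximation the coherent (Rayleigh) differential cross-section per atom is the Thomson term weighted by the squared atomic form factor, and interference-function models multiply this by a material-specific factor s(x); schematically (notation assumed, with x the momentum-transfer variable):

        \frac{d\sigma_{\mathrm{IAA}}}{d\Omega} = \frac{r_e^{2}}{2}\,\bigl(1+\cos^{2}\theta\bigr)\,F^{2}(x,Z),
        \qquad
        \frac{d\sigma_{\mathrm{coh}}}{d\Omega} = \frac{d\sigma_{\mathrm{IAA}}}{d\Omega}\; s(x)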