Combinatorial neural codes from a mathematical coding theory perspective.
Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L
2013-07-01
Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli.
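For illustration, a minimal Python sketch (my own, not from the paper) of the error-correction notion under study: a noisy binary codeword is decoded to the nearest codeword in Hamming distance. The five-neuron codebook is hypothetical.

```python
def hamming(u, v):
    """Number of positions at which two binary words differ."""
    return sum(a != b for a, b in zip(u, v))

def decode(received, codebook):
    """Nearest-codeword decoding; ties resolve to the first codeword found."""
    return min(codebook, key=lambda c: hamming(received, c))

# Hypothetical codewords: each tuple records which of five neurons fire.
codebook = [(0, 0, 0, 0, 0), (1, 1, 1, 0, 0), (0, 0, 1, 1, 1), (1, 1, 0, 1, 1)]
noisy = (1, 0, 1, 0, 0)             # (1, 1, 1, 0, 0) with one bit flipped
print(decode(noisy, codebook))      # -> (1, 1, 1, 0, 0)
```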
Dual Coding, Reasoning and Fallacies.
ERIC Educational Resources Information Center
Hample, Dale
1982-01-01
Develops the theory that a fallacy is not a comparison of a rhetorical text to a set of definitions but a comparison of one person's cognition with another's. Reviews Paivio's dual coding theory, relates nonverbal coding to reasoning processes, and generates a limited fallacy theory based on dual coding theory. (PD)
Coding Issues in Grounded Theory
ERIC Educational Resources Information Center
Moghaddam, Alireza
2006-01-01
This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory is generated from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…
ERIC Educational Resources Information Center
Davis, Colin J.; Bowers, Jeffrey S.
2006-01-01
Five theories of how letter position is coded are contrasted: position-specific slot-coding, Wickelcoding, open-bigram coding (discrete and continuous), and spatial coding. These theories make different predictions regarding the relative similarity of three different types of pairs of letter strings: substitution neighbors,…
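To make one of the contrasted schemes concrete, here is a small sketch (an illustration of discrete open-bigram coding, not code from the article): a string is represented by its set of ordered letter pairs, and similarity is the Jaccard overlap of those sets.

```python
from itertools import combinations

def open_bigrams(word):
    """Set of ordered letter pairs (discrete open bigrams)."""
    return {a + b for a, b in combinations(word, 2)}

def similarity(w1, w2):
    b1, b2 = open_bigrams(w1), open_bigrams(w2)
    return len(b1 & b2) / len(b1 | b2)

# Transposition neighbours remain highly similar under open-bigram coding:
print(similarity("judge", "jugde"))   # ~0.82
print(similarity("judge", "jxdge"))   # substitution neighbour, ~0.43
```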
Di Giulio, Massimo
2018-05-19
A discriminative statistical test among the different theories proposed to explain the origin of the genetic code is presented. Gathering the amino acids into polarity classes and biosynthetic classes, the first an expression of the physicochemical theory of the origin of the genetic code and the second of the coevolution theory, these classes are used in Fisher's exact test to establish their significance within the genetic code table. By assigning to the rows and columns of the genetic code probabilities that express the statistical significance of these classes, I was able to calculate a χ value for both the physicochemical theory and the coevolution theory, expressing the level of corroboration of each. The comparison between these two χ values showed that, in this strictly empirical analysis, the coevolution theory explains the origin of the genetic code better than the physicochemical theory does. Copyright © 2018 Elsevier B.V. All rights reserved.
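As a sketch of how such a test can be set up (the 2x2 table below is hypothetical, not the paper's actual classes or counts), Fisher's exact test crosses membership in an amino-acid class against a region of the code table:

```python
from scipy.stats import fisher_exact

# Hypothetical counts over the 61 sense codons: rows split the table by a
# codon position (e.g. second base U vs. the rest), columns by membership
# in a polar amino-acid class.
table = [[12, 4],    # codons in the chosen region: in class / not in class
         [8, 37]]    # codons elsewhere in the table

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")
```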
Hoare, Karen J; Mills, Jane; Francis, Karen
2012-12-01
The terminology used to analyse data in a grounded theory study can be confusing. Different grounded theorists use a variety of terms which all have similar meanings. In the following study, we use terms adopted by Charmaz, including initial, focused and axial coding. Initial codes are used to analyse data with an emphasis on identifying gerunds (verbs acting as nouns). If initial codes are relevant to the developing theory, they are grouped with similar codes into categories. Categories become saturated when no new codes are identified in the data. Axial codes are used to link categories together into a grounded theory process. Memo writing accompanies this data sifting and sorting. The following article explains how one initial code became a category, providing a worked example of the grounded theory method of constant comparative analysis. The interplay between coding and categorization is facilitated by the constant comparative method. © 2012 Wiley Publishing Asia Pty Ltd.
Finite-block-length analysis in classical and quantum information theory.
Hayashi, Masahito
2017-01-01
Coding technology is used in several information processing tasks. In particular, when noise during transmission disturbs communications, coding technology is employed to protect the information. However, there are two types of coding technology: coding in classical information theory and coding in quantum information theory. Although the physical media used to transmit information ultimately obey quantum mechanics, we need to choose the type of coding depending on the kind of information device, classical or quantum, that is being used. In both branches of information theory, there are many elegant theoretical results under the ideal assumption that an infinitely large system is available. In a realistic situation, we need to account for finite size effects. The present paper reviews finite size effects in classical and quantum information theory with respect to various topics, including applied aspects.
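A central formula of this finite-blocklength literature is the second-order ("normal") approximation to the best achievable rate, R(n, eps) ≈ C − sqrt(V/n)·Q⁻¹(eps), with capacity C and channel dispersion V. A sketch evaluating it for a binary symmetric channel (the parameter values are arbitrary):

```python
from math import log2, sqrt
from statistics import NormalDist

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1 - (-p * log2(p) - (1 - p) * log2(1 - p))

def bsc_dispersion(p):
    """Channel dispersion V of the binary symmetric channel."""
    return p * (1 - p) * log2((1 - p) / p) ** 2

def rate_normal_approx(n, eps, p):
    qinv = NormalDist().inv_cdf(1 - eps)        # Q^{-1}(eps)
    return bsc_capacity(p) - sqrt(bsc_dispersion(p) / n) * qinv

print(bsc_capacity(0.11))                            # ~0.50 bits/use asymptotically
print(rate_normal_approx(n=1000, eps=1e-3, p=0.11))  # markedly lower at n = 1000
```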
Creating Semantic Waves: Using Legitimation Code Theory as a Tool to Aid the Teaching of Chemistry
ERIC Educational Resources Information Center
Blackie, Margaret A. L.
2014-01-01
This is a conceptual paper aimed at chemistry educators. The purpose of this paper is to illustrate the use of the semantic code of Legitimation Code Theory in chemistry teaching. Chemistry is an abstract subject which many students struggle to grasp. Legitimation Code Theory provides a way of separating out abstraction from complexity both of…
Mechanism on brain information processing: Energy coding
NASA Astrophysics Data System (ADS)
Wang, Rubin; Zhang, Zhikang; Jiao, Xianfa
2006-09-01
Motivated by experimental results showing that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, the authors present a new theory that offers a unique mechanism for brain information processing. They demonstrate that the neural coding produced by the activity of the brain is well described by the theory of energy coding. Because the energy coding model reveals mechanisms of brain information processing based upon known biophysical properties, they can not only reproduce various experimental results of neuroelectrophysiology but also quantitatively explain recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, they estimate that the theory has very important consequences for quantitative research of cognitive function.
Energy coding in biological neural networks
Zhang, Zhikang
2007-01-01
Motivated by experimental results showing that signal transmission and neuronal energetic demands are tightly coupled to information coding in the cerebral cortex, we present a new theory that offers a unique mechanism for brain information processing. We demonstrate that the neural coding produced by the activity of the brain is well described by our theory of energy coding. Because the energy coding model reveals mechanisms of brain information processing based upon known biophysical properties, we can not only reproduce various experimental results of neuro-electrophysiology, but also quantitatively explain recent experimental results from neuroscientists at Yale University by means of the principle of energy coding. Because the theory of energy coding bridges the gap between functional connections within a biological neural network and energetic consumption, we estimate that the theory has very important consequences for quantitative research of cognitive function. PMID:19003513
Question 6: coevolution theory of the genetic code: a proven theory.
Wong, Jeffrey Tze-Fei
2007-10-01
The coevolution theory proposes that primordial proteins consisted only of those amino acids readily obtainable from the prebiotic environment, representing about half the twenty encoded amino acids of today, and the missing amino acids entered the system as the code expanded along with pathways of amino acid biosynthesis. The isolation of genetic code mutants, and the antiquity of pretran synthesis revealed by the comparative genomics of tRNAs and aminoacyl-tRNA synthetases, have combined to provide a rigorous proof of the four fundamental tenets of the theory, thus solving the riddle of the structure of the universal genetic code.
Di Giulio, Massimo
2017-02-07
Whereas it is extremely easy to prove that "if the biosynthetic relationships between amino acids were fundamental in the structuring of the genetic code, then the physico-chemical properties of amino acids might also be revealed in the genetic code table", it is, on the contrary, impossible to prove that "if the physico-chemical properties of amino acids were fundamental in the structuring of the genetic code, then the biosynthetic relationships between amino acids should not be revealed in the genetic code". Given that both the biosynthetic relationships between amino acids and their physico-chemical properties are mirrored in the genetic code table, this constitutes a test that falsifies the physico-chemical theories of the origin of the genetic code. That is to say, if the physico-chemical properties of amino acids had played a fundamental role in organizing the genetic code, then we should not have been able to detect the presence of the biosynthetic relationships between amino acids in the genetic code, and yet this is precisely what is observed. This falsifies the physico-chemical theories of genetic code origin. The coevolution theory of the origin of the genetic code, by contrast, is corroborated by this analysis, because it gives a description of the evolution of the genetic code that is more coherent with the indisputable empirical observations linking both the biosynthetic relationships of amino acids and their physico-chemical properties to the evolutionary organization of the genetic code. Copyright © 2016 Elsevier Ltd. All rights reserved.
Recent advances in coding theory for near error-free communications
NASA Technical Reports Server (NTRS)
Cheung, K.-M.; Deutsch, L. J.; Dolinar, S. J.; Mceliece, R. J.; Pollara, F.; Shahshahani, M.; Swanson, L.
1991-01-01
Channel and source coding theories are discussed. The following subject areas are covered: large constraint length convolutional codes (the Galileo code); decoder design (the big Viterbi decoder); Voyager's and Galileo's data compression scheme; current research in data compression for images; neural networks for soft decoding; neural networks for source decoding; finite-state codes; and fractals for data compression.
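For illustration, the textbook constraint-length-3 convolutional encoder with octal generators (7, 5), a toy relative of the large-constraint-length Galileo code mentioned above:

```python
def conv_encode(bits, g1=0b111, g2=0b101, k=3):
    """Rate-1/2 feed-forward convolutional encoder, zero-flushed."""
    state, out = 0, []
    for b in bits + [0] * (k - 1):              # append k-1 zeros to flush
        state = ((state << 1) | b) & ((1 << k) - 1)
        out.append(bin(state & g1).count("1") % 2)   # first parity stream
        out.append(bin(state & g2).count("1") % 2)   # second parity stream
    return out

print(conv_encode([1, 0, 1, 1]))   # -> [1,1, 1,0, 0,0, 0,1, 0,1, 1,1]
```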
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berry, Ray Alden; Zou, Ling; Zhao, Haihua
This document summarizes the physical models and mathematical formulations used in the RELAP-7 code. The MOOSE-based RELAP-7 code development is an ongoing effort; the MOOSE framework enables rapid development of the RELAP-7 code, and the developmental efforts and results demonstrate that the RELAP-7 project is on a path to success. This theory manual documents the main features implemented in the RELAP-7 code. Because the code is under active development, this RELAP-7 Theory Manual will evolve with periodic updates to keep it current with the state of the development, implementation, and model additions/revisions.
Bowers, Jeffrey S
2009-01-01
A fundamental claim associated with parallel distributed processing (PDP) theories of cognition is that knowledge is coded in a distributed manner in mind and brain. This approach rejects the claim that knowledge is coded in a localist fashion, with words, objects, and simple concepts (e.g. "dog") each coded with their own dedicated representations. One of the putative advantages of this approach is that the theories are biologically plausible. Indeed, advocates of the PDP approach often highlight the close parallels between distributed representations learned in connectionist models and neural coding in the brain, and often dismiss localist (grandmother cell) theories as biologically implausible. The author reviews a range of data that strongly challenge this claim and shows that localist models provide a better account of single-cell recording studies. The author also contrasts localist and alternative distributed coding schemes (sparse and coarse coding) and argues that the common rejection of grandmother cell theories in neuroscience is due to a misunderstanding about how localist models behave. The author concludes that the localist representations embedded in theories of perception and cognition are consistent with neuroscience; biology only calls into question the distributed representations often learned in PDP models.
System Design Description for the TMAD Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Finfrock, S.H.
This document serves as the System Design Description (SDD) for the TMAD Code System, which includes the TMAD code and the LIBMAKR code. The SDD provides a detailed description of the theory behind the code, and the implementation of that theory. It is essential for anyone who is attempting to review or modify the code or who otherwise needs to understand the internal workings of the code. In addition, this document includes, in Appendix A, the System Requirements Specification for the TMAD System.
Di Giulio, Massimo
2017-11-07
The coevolution theory of the origin of the genetic code suggests that the organization of the genetic code coevolved with the biosynthetic relationships between amino acids. The mechanism that allowed this coevolution was based on tRNA-like molecules on which, according to this theory, the biosynthetic transformations between amino acids took place. This mechanism yields a prediction about the role that the aminoacyl-tRNA synthetases (ARSs) should have played in the origin of the genetic code. Indeed, if the biosynthetic transformations between amino acids occurred on tRNA-like molecules, then there was no need to link amino acids to these molecules, because the amino acids were already charged on them, as the coevolution theory suggests. Although ARSs are responsible for the first interaction between a component of nucleic acids and a component of proteins, under the coevolution theory the role of ARSs should have been entirely marginal in the origin of the genetic code. I have therefore conducted a further analysis of the distribution of the two classes of ARSs, and of their subclasses, in the genetic code table, as a falsification test of the coevolution theory: if the distribution of ARSs within the genetic code were highly significant, the coevolution theory would be falsified, since the mechanism on which it is based does not predict a fundamental role for ARSs in the origin of the genetic code. I found that the statistical significance of the distribution of the two classes of ARSs in the table of the genetic code is low or marginal, whereas that of the subclasses of ARSs is statistically significant. This is in perfect agreement with the postulates of the coevolution theory. Indeed, the only case of statistical significance regarding the classes of ARSs is appreciable for the CAG code, whereas for its complement, the UNN/NUN code, only a marginal significance is measurable. These two codes code roughly for the two ARS classes: the CAG code for class II and the UNN/NUN code for class I. Furthermore, the subclasses of ARSs show a statistically significant distribution in the genetic code table. Nevertheless, the most sensible explanation of these observations is the following: the association of the two classes of ARSs with the CAG and UNN/NUN codes, and the statistical significance of the distribution of the subclasses of ARSs in the genetic code table, would be only a secondary effect of the highly significant distribution of the polarity of amino acids and of their biosynthetic relationships in the genetic code. That is to say, the polarity of amino acids and their biosynthetic relationships would have conditioned the evolution of ARSs in such a way that their imprint on the genetic code becomes detectable, even though the ARSs did not themselves directly influence the evolutionary organization of the genetic code. In other words, the role that ARSs had in the origin of the genetic code would have been entirely marginal. This conclusion is in perfect accord with the predictions of the coevolution theory. Conversely, it is in contrast, at least partially, with the physicochemical theories of the origin of the genetic code, which foresee a much more active role for ARSs in the origin of the organization of the genetic code. Copyright © 2017 Elsevier Ltd. All rights reserved.
A Survey of Progress in Coding Theory in the Soviet Union. Final Report.
ERIC Educational Resources Information Center
Kautz, William H.; Levitt, Karl N.
The results of a comprehensive technical survey of all published Soviet literature in coding theory and its applications--over 400 papers and books appearing before March 1967--are described in this report. Noteworthy Soviet contributions are discussed, including codes for the noiseless channel, codes that correct asymmetric errors, decoding for…
Dual coding theory, word abstractness, and emotion: a critical review of Kousta et al. (2011).
Paivio, Allan
2013-02-01
Kousta, Vigliocco, Del Campo, Vinson, and Andrews (2011) questioned the adequacy of dual coding theory and the context availability model as explanations of representational and processing differences between concrete and abstract words. They proposed an alternative approach that focuses on the role of emotional content in the processing of abstract concepts. Their dual coding critique is, however, based on impoverished and, in some respects, incorrect interpretations of the theory and its implications. This response corrects those gaps and misinterpretations and summarizes research findings that show predicted variations in the effects of dual coding variables in different tasks and contexts. Especially emphasized is an empirically supported dual coding theory of emotion that goes beyond the Kousta et al. emphasis on emotion in abstract semantics. 2013 APA, all rights reserved
An optimization program based on the method of feasible directions: Theory and users guide
NASA Technical Reports Server (NTRS)
Belegundu, Ashok D.; Berke, Laszlo; Patnaik, Surya N.
1994-01-01
The theory and user instructions for an optimization code based on the method of feasible directions are presented. The code was written for wide distribution and ease of attachment to other simulation software. Although the theory of the method of feasible directions was developed in the 1960s, many considerations are involved in its actual implementation as a computer code. Included in the code are a number of features to improve robustness in optimization. The search direction is obtained by solving a quadratic program using an interior method based on Karmarkar's algorithm. The theory is discussed with a focus on the important and often overlooked role played by the various parameters guiding the iterations within the program. Also discussed is a robust approach for handling infeasible starting points. The code was validated by solving a variety of structural optimization test problems that have known solutions obtained by other optimization codes. It has been observed that this code is robust: it has solved a variety of problems from different starting points. However, the code is inefficient in that it takes considerable CPU time compared with certain other available codes. Further work is required to improve its efficiency while retaining its robustness.
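A sketch of the direction-finding idea in generic form (a Zoutendijk-style linear subproblem, shown for illustration; the code described above actually solves a quadratic program with a Karmarkar-type interior method): find a direction d that decreases both the objective and the active constraints by minimizing the worst directional derivative beta.

```python
import numpy as np
from scipy.optimize import linprog

grad_f = np.array([2.0, 1.0])        # objective gradient at the current point
grad_g = np.array([[1.0, 1.0]])      # gradients of active constraints (made up)

# Variables (d1, d2, beta): minimize beta subject to
#   grad_f . d <= beta,  grad_g . d <= beta,  -1 <= d_i <= 1.
c = np.array([0.0, 0.0, 1.0])
A_ub = np.vstack([np.append(grad_f, -1.0),
                  np.hstack([grad_g, -np.ones((1, 1))])])
b_ub = np.zeros(2)
res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(-1, 1), (-1, 1), (None, None)])
d, beta = res.x[:2], res.x[2]
print(d, beta)   # beta < 0 means d is a usable feasible descent direction
```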
ERIC Educational Resources Information Center
Rupley, William H.; Paige, David D.; Rasinski, Timothy V.; Slough, Scott W.
2015-01-01
Paivio's Dual-Coding Theory (1991) and Mayer's Multimedia Principle (2000) form the foundation for proposing a multi-coding theory centered around Multi-Touch Tablets and the newest generation of e-textbooks to scaffold struggling readers in reading and learning from science textbooks. Using E. O. Wilson's "Life on Earth: An Introduction"…
An analysis of the metabolic theory of the origin of the genetic code
NASA Technical Reports Server (NTRS)
Amirnovin, R.; Bada, J. L. (Principal Investigator)
1997-01-01
A computer program was used to test Wong's coevolution theory of the genetic code. The codon correlations between the codons of biosynthetically related amino acids in the universal genetic code and in randomly generated genetic codes were compared. It was determined that many codon correlations are also present within random genetic codes and that among the random codes there are always several which have many more correlations than that found in the universal code. Although the number of correlations depends on the choice of biosynthetically related amino acids, the probability of choosing a random genetic code with the same or greater number of codon correlations as the universal genetic code was found to vary from 0.1% to 34% (with respect to a fairly complete listing of related amino acids). Thus, Wong's theory that the genetic code arose by coevolution with the biosynthetic pathways of amino acids, based on codon correlations between biosynthetically related amino acids, is statistical in nature.
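The flavor of the randomization test can be sketched as follows; the codon blocks and related pairs below are a hypothetical toy subset, not the study's full listing.

```python
import random

# Toy assignment: amino acid -> a few of its codons (illustrative only).
code = {"Ser": ["UCU", "AGU"], "Gly": ["GGU", "GGC"], "Asp": ["GAU", "GAC"],
        "Glu": ["GAA", "GAG"], "Ala": ["GCU", "GCC"], "Val": ["GUU", "GUC"]}
related = [("Ser", "Gly"), ("Asp", "Glu"), ("Ala", "Val")]  # biosynthetic pairs

def one_base_apart(c1, c2):
    return sum(a != b for a, b in zip(c1, c2)) == 1

def correlations(assignment):
    """Related pairs owning at least one codon pair a single base apart."""
    return sum(any(one_base_apart(x, y)
                   for x in assignment[a] for y in assignment[b])
               for a, b in related)

observed = correlations(code)
names, blocks = list(code), list(code.values())
hits, trials = 0, 10_000
for _ in range(trials):
    random.shuffle(blocks)                  # a random code: permute the blocks
    if correlations(dict(zip(names, blocks))) >= observed:
        hits += 1
print(observed, hits / trials)              # observed count and empirical p
```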
The general theory of convolutional codes
NASA Technical Reports Server (NTRS)
Mceliece, R. J.; Stanley, R. P.
1993-01-01
This article presents a self-contained introduction to the algebraic theory of convolutional codes. This introduction is partly a tutorial, but at the same time contains a number of new results which will prove useful for designers of advanced telecommunication systems. Among the new concepts introduced here are the Hilbert series for a convolutional code and the class of compact codes.
The mathematical theory of signal processing and compression-designs
NASA Astrophysics Data System (ADS)
Feria, Erlan H.
2006-05-01
The mathematical theory of signal processing, named processor coding, will be shown to arise inherently as the computational-time dual of Shannon's mathematical theory of communication, which is also known as source coding. Source coding is concerned with compressing signal source memory space, while processor coding deals with compressing signal processor computational time. Their combination is named compression-designs, referred to as Conde for short. A compelling and pedagogically appealing diagram is discussed, highlighting Conde's remarkably successful application to real-world knowledge-aided (KA) airborne moving target indicator (AMTI) radar.
An extension of the coevolution theory of the origin of the genetic code
Di Giulio, Massimo
2008-01-01
Background: The coevolution theory of the origin of the genetic code suggests that the genetic code is an imprint of the biosynthetic relationships between amino acids. However, this theory does not seem to attribute a role to the biosynthetic relationships between the earliest amino acids that evolved along the pathways of energetic metabolism. As a result, the coevolution theory is unable to clearly define the very earliest phases of genetic code origin. In order to remove this difficulty, I here suggest an extension of the coevolution theory that attributes a crucial role to the first amino acids that evolved along these biosynthetic pathways and to their biosynthetic relationships, even when defined by the non-amino acid molecules that are their precursors. Results: It is re-observed that the first amino acids to evolve along these biosynthetic pathways are predominantly those codified by codons of the type GNN, and this observation is found to be statistically significant. Furthermore, the close biosynthetic relationships between the sibling amino acids Ala-Ser, Ser-Gly, Asp-Glu, and Ala-Val are not randomly placed in the genetic code table and reinforce the hypothesis that the biosynthetic relationships between these six amino acids played a crucial role in defining the very earliest phases of genetic code origin. Conclusion: All this leads to the hypothesis that there existed a code, GNS, reflecting the biosynthetic relationships between these six amino acids which, as it defines the very earliest phases of genetic code origin, removes the main difficulty of the coevolution theory. Furthermore, it is discussed here how this code might have naturally led to the code codifying only for the domains of the codons of precursor amino acids, as predicted by the coevolution theory. Finally, the hypothesis suggested here also removes other problems of the coevolution theory, such as the existence of certain pairs of amino acids with an unclear biosynthetic relationship between the precursor and product amino acids, and the collocation of Ala between the amino acids Val and Leu belonging to the pyruvate biosynthetic family, which the coevolution theory considered as belonging to different biosyntheses. Reviewers: This article was reviewed by Rob Knight, Paul Higgs (nominated by Laura Landweber), and Eugene Koonin. PMID:18775066
A Critique of Schema Theory in Reading and a Dual Coding Alternative (Commentary).
ERIC Educational Resources Information Center
Sadoski, Mark; And Others
1991-01-01
Evaluates schema theory and presents dual coding theory as a theoretical alternative. Argues that schema theory is encumbered by lack of a consistent definition, its roots in idealist epistemology, and mixed empirical support. Argues that results of many empirical studies used to demonstrate the existence of schemata are more consistently…
Evaluation of Persons of Varying Ages.
ERIC Educational Resources Information Center
Stolte, John F.
1996-01-01
Reviews two experiments that strongly support dual coding theory. Dual coding theory holds that communicating concretely (through tactile, auditory, or visual stimuli) affects evaluative thinking more strongly than communicating abstractly through words and numbers. The experiments applied this theory to the realm of age and evaluation. (MJP)
Computational Nuclear Physics and Post Hartree-Fock Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lietz, Justin; Sam, Novario; Hjorth-Jensen, M.
We present a computational approach to infinite nuclear matter employing Hartree-Fock theory, many-body perturbation theory and coupled cluster theory. These lectures are closely linked with those of chapters 9, 10 and 11 and serve as input for the correlation functions employed in Monte Carlo calculations in chapter 9, the in-medium similarity renormalization group theory of dense fermionic systems of chapter 10 and the Green's function approach in chapter 11. We provide extensive code examples and benchmark calculations, thereby allowing the reader to start writing her/his own codes. We start with an object-oriented serial code and end with discussions on strategies for porting the code to present and planned high-performance computing facilities.
NASA Technical Reports Server (NTRS)
Chambers, Lin Hartung
1994-01-01
The theory for radiation emission, absorption, and transfer in a thermochemical nonequilibrium flow is presented. The expressions developed reduce correctly to the limit at equilibrium. To implement the theory in a practical computer code, some approximations are used, particularly the smearing of molecular radiation. Details of these approximations are presented and helpful information is included concerning the use of the computer code. This user's manual should benefit both occasional users of the Langley Optimized Radiative Nonequilibrium (LORAN) code and those who wish to use it to experiment with improved models or properties.
Correcting quantum errors with entanglement.
Brun, Todd; Devetak, Igor; Hsieh, Min-Hsiu
2006-10-20
We show how entanglement shared between encoder and decoder can simplify the theory of quantum error correction. The entanglement-assisted quantum codes we describe do not require the dual-containing constraint necessary for standard quantum error-correcting codes, thus allowing us to "quantize" all of classical linear coding theory. In particular, efficient modern classical codes that attain the Shannon capacity can be made into entanglement-assisted quantum codes attaining the hashing bound (closely related to the quantum capacity). For systems without large amounts of shared entanglement, these codes can also be used as catalytic codes, in which a small amount of initial entanglement enables quantum communication.
Phenotypic Graphs and Evolution Unfold the Standard Genetic Code as the Optimal
NASA Astrophysics Data System (ADS)
Zamudio, Gabriel S.; José, Marco V.
2018-03-01
In this work, we explicitly consider the evolution of the Standard Genetic Code (SGC) by assuming two evolutionary stages, to wit, the primeval RNY code and two intermediate codes in between. We used network theory and graph theory to measure the connectivity of each phenotypic graph. The connectivity values are compared to the values of the codes under different randomization scenarios. An error-correcting optimal code is one in which the algebraic connectivity is minimized. We show that the SGC is optimal in regard to its robustness and error-tolerance when compared to all random codes under different assumptions.
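A rough sketch of the comparison described (the graph construction is my simplification of a phenotypic graph, and the shuffle is one of many possible randomizations): nodes are the encoded phenotypes, edges join phenotypes reachable by a single-base codon change, and the algebraic connectivity of the standard code is compared with that of a shuffled code.

```python
import random
from itertools import product
import networkx as nx

BASES = "UCAG"
# Standard code as a 64-letter string in UCAG order ('*' = stop).
AA = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODONS = ["".join(c) for c in product(BASES, repeat=3)]

def phenotypic_graph(table):
    g = nx.Graph()
    g.add_nodes_from(set(table.values()))
    for c1 in CODONS:
        for i in range(3):
            for b in BASES:
                c2 = c1[:i] + b + c1[i + 1:]
                if c2 != c1 and table[c1] != table[c2]:
                    g.add_edge(table[c1], table[c2])
    return g  # assumed connected, as it is for the standard code

sgc = dict(zip(CODONS, AA))
print("SGC:   ", nx.algebraic_connectivity(phenotypic_graph(sgc)))

letters = list(AA)
random.shuffle(letters)                 # one random reassignment of phenotypes
rand = dict(zip(CODONS, letters))
print("random:", nx.algebraic_connectivity(phenotypic_graph(rand)))
```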
n-Nucleotide circular codes in graph theory.
Fimmel, Elena; Michel, Christian J; Strüngmann, Lutz
2016-03-13
The circular code theory proposes that genes are constituted of two trinucleotide codes: the classical genetic code, with 61 trinucleotides for coding the 20 amino acids (except the three stop codons {TAA,TAG,TGA}), and a circular code based on 20 trinucleotides for retrieving, maintaining and synchronizing the reading frame. It relies on two main results: the identification of a maximal C(3) self-complementary trinucleotide circular code X in genes of bacteria, eukaryotes, plasmids and viruses (Michel 2015 J. Theor. Biol. 380, 156-177. (doi:10.1016/j.jtbi.2015.04.009); Arquès & Michel 1996 J. Theor. Biol. 182, 45-58. (doi:10.1006/jtbi.1996.0142)) and the finding of X circular code motifs in tRNAs and rRNAs, in particular in the ribosome decoding centre (Michel 2012 Comput. Biol. Chem. 37, 24-37. (doi:10.1016/j.compbiolchem.2011.10.002); El Soufi & Michel 2014 Comput. Biol. Chem. 52, 9-17. (doi:10.1016/j.compbiolchem.2014.08.001)). The universally conserved nucleotides A1492 and A1493 and the conserved nucleotide G530 are included in X circular code motifs. Recently, dinucleotide circular codes were also investigated (Michel & Pirillo 2013 ISRN Biomath. 2013, 538631. (doi:10.1155/2013/538631); Fimmel et al. 2015 J. Theor. Biol. 386, 159-165. (doi:10.1016/j.jtbi.2015.08.034)). As genetic motifs of different lengths are ubiquitous in genes and genomes, we introduce a new approach based on graph theory to study n-nucleotide circular codes X in full generality, i.e. of length 2 (dinucleotide), 3 (trinucleotide), 4 (tetranucleotide), etc. Indeed, we prove that an n-nucleotide code X is circular if and only if the corresponding graph G(X) is acyclic. Moreover, the maximal length of a path in G(X) corresponds to the window of nucleotides in a sequence required for detecting the correct reading frame. Finally, the graph theory of tournaments is applied to the study of dinucleotide circular codes. It gives full equivalence between the combinatorics theory (Michel & Pirillo 2013 ISRN Biomath. 2013, 538631. (doi:10.1155/2013/538631)) and the group theory (Fimmel et al. 2015 J. Theor. Biol. 386, 159-165. (doi:10.1016/j.jtbi.2015.08.034)) of dinucleotide circular codes, while its mathematical approach is simpler. © 2016 The Author(s).
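The paper's central criterion is easy to demonstrate for trinucleotides: each word N1N2N3 contributes the edges N1 → N2N3 and N1N2 → N3 to the graph G(X), and X is circular if and only if G(X) is acyclic. A minimal sketch:

```python
import networkx as nx

def code_graph(code):
    """Graph G(X) built from both proper prefix/suffix splits of each word."""
    g = nx.DiGraph()
    for t in code:
        g.add_edge(t[0], t[1:])    # split N1 | N2N3
        g.add_edge(t[:2], t[2])    # split N1N2 | N3
    return g

def is_circular(code):
    return nx.is_directed_acyclic_graph(code_graph(code))

print(is_circular({"ACG"}))           # True
print(is_circular({"ACG", "CGA"}))    # False: A -> CG -> A is a cycle
```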
Enhancing Undergraduate Mathematics Curriculum via Coding Theory and Cryptography
ERIC Educational Resources Information Center
Aydin, Nuh
2009-01-01
The theory of error-correcting codes and cryptography are two relatively recent applications of mathematics to information and communication systems. The mathematical tools used in these fields generally come from algebra, elementary number theory, and combinatorics, including concepts from computational complexity. It is possible to introduce the…
Factors Affecting Christian Parents' School Choice Decision Processes: A Grounded Theory Study
ERIC Educational Resources Information Center
Prichard, Tami G.; Swezey, James A.
2016-01-01
This study identifies factors affecting the decision processes for school choice by Christian parents. Grounded theory design incorporated interview transcripts, field notes, and a reflective journal to analyze themes. Comparative analysis, including open, axial, and selective coding, was used to reduce the coded statements to five code families:…
Dual coding: a cognitive model for psychoanalytic research.
Bucci, W
1985-01-01
Four theories of mental representation derived from current experimental work in cognitive psychology have been discussed in relation to psychoanalytic theory. These are: verbal mediation theory, in which language determines or mediates thought; perceptual dominance theory, in which imagistic structures are dominant; common code or propositional models, in which all information, perceptual or linguistic, is represented in an abstract, amodal code; and dual coding, in which nonverbal and verbal information are each encoded, in symbolic form, in separate systems specialized for such representation, and connected by a complex system of referential relations. The weight of current empirical evidence supports the dual code theory. However, psychoanalysis has implicitly accepted a mixed model-perceptual dominance theory applying to unconscious representation, and verbal mediation characterizing mature conscious waking thought. The characterization of psychoanalysis, by Schafer, Spence, and others, as a domain in which reality is constructed rather than discovered, reflects the application of this incomplete mixed model. The representations of experience in the patient's mind are seen as without structure of their own, needing to be organized by words, thus vulnerable to distortion or dissolution by the language of the analyst or the patient himself. In these terms, hypothesis testing becomes a meaningless pursuit; the propositions of the theory are no longer falsifiable; the analyst is always more or less "right." This paper suggests that the integrated dual code formulation provides a more coherent theoretical framework for psychoanalysis than the mixed model, with important implications for theory and technique. In terms of dual coding, the problem is not that the nonverbal representations are vulnerable to distortion by words, but that the words that pass back and forth between analyst and patient will not affect the nonverbal schemata at all. Using the dual code formulation, and applying an investigative methodology derived from experimental cognitive psychology, a new approach to the verification of interpretations is possible. Some constructions of a patient's story may be seen as more accurate than others, by virtue of their linkage to stored perceptual representations in long-term memory. We can demonstrate that such linking has occurred in functional or operational terms--through evaluating the representation of imagistic content in the patient's speech.
A Nonvolume Preserving Plasticity Theory with Applications to Powder Metallurgy
NASA Technical Reports Server (NTRS)
Cassenti, B. N.
1983-01-01
A plasticity theory has been developed to predict the mechanical response of powder metals during hot isostatic pressing. The theory parameters were obtained through an experimental program consisting of hydrostatic pressure tests, uniaxial compression tests, and uniaxial tension tests. A nonlinear finite element code was modified to include the theory, and the results of the modified code compared favorably with the results from a verification experiment.
Experimental evaluation of a flat wake theory for predicting rotor inflow-wake velocities
NASA Technical Reports Server (NTRS)
Wilson, John C.
1992-01-01
The theory for predicting helicopter inflow-wake velocities called flat wake theory was correlated with several sets of experimental data. The theory was developed by V. E. Baskin of the USSR, and a computer code known as DOWN was developed at Princeton University to implement the theory. The theory treats the wake geometry as rigid without interaction between induced velocities and wake structure. The wake structure is assumed to be a flat sheet of vorticity composed of trailing elements whose strength depends on the azimuthal and radial distributions of circulation on a rotor blade. The code predicts the three orthogonal components of flow velocity in the field surrounding the rotor. The predictions can be utilized in rotor performance and helicopter real-time flight-path simulation. The predictive capability of the coded version of flat wake theory provides vertical inflow patterns similar to experimental patterns.
Concreteness effects in semantic processing: ERP evidence supporting dual-coding theory.
Kounios, J; Holcomb, P J
1994-07-01
Dual-coding theory argues that processing advantages for concrete over abstract (verbal) stimuli result from the operation of 2 systems (i.e., imaginal and verbal) for concrete stimuli, rather than just 1 (for abstract stimuli). These verbal and imaginal systems have been linked with the left and right hemispheres of the brain, respectively. Context-availability theory argues that concreteness effects result from processing differences in a single system. The merits of these theories were investigated by examining the topographic distribution of event-related brain potentials in 2 experiments (lexical decision and concrete-abstract classification). The results were most consistent with dual-coding theory. In particular, different scalp distributions of an N400-like negativity were elicited by concrete and abstract words.
Giles, Tracey M; de Lacey, Sheryl; Muir-Cochrane, Eimear
2016-01-01
Grounded theory method has been described extensively in the literature. Yet, the varying processes portrayed can be confusing for novice grounded theorists. This article provides a worked example of the data analysis phase of a constructivist grounded theory study that examined family presence during resuscitation in acute health care settings. Core grounded theory methods are exemplified, including initial and focused coding, constant comparative analysis, memo writing, theoretical sampling, and theoretical saturation. The article traces the construction of the core category "Conditional Permission" from initial and focused codes, subcategories, and properties, through to its position in the final substantive grounded theory.
ERIC Educational Resources Information Center
Sadoski, Mark; And Others
1993-01-01
The comprehensibility, interestingness, familiarity, and memorability of concrete and abstract instructional texts were studied in 4 experiments involving 221 college students. Results indicate that concreteness (ease of imagery) is the variable overwhelmingly most related to comprehensibility and recall. Dual coding theory and schema theory are…
Android application for handwriting segmentation using PerTOHS theory
NASA Astrophysics Data System (ADS)
Akouaydi, Hanen; Njah, Sourour; Alimi, Adel M.
2017-03-01
The paper handles the problem of handwriting segmentation on mobile devices. Many applications have been developed to facilitate handwriting recognition, sidestepping the limited number of keys on keyboards by introducing a drawing space for writing instead. Here we present a mobile approach to handwriting segmentation that uses PerTOHS theory (Perceptual Theory of On-line Handwriting Segmentation), in which handwriting is defined as a sequence of elementary and perceptual codes. The theory analyzes the written script and tries to learn the visual code features of the handwriting in order to generate new ones via the generated perceptual sequences. To obtain this classification we apply the Beta-elliptic model, a fuzzy detector, and genetic algorithms in order to extract the EPCs (Elementary Perceptual Codes) and GPCs (Global Perceptual Codes) that compose the script. We then present our Android application M-PerTOHS for handwriting segmentation.
Information Theory, Inference and Learning Algorithms
NASA Astrophysics Data System (ADS)
Mackay, David J. C.
2003-10-01
Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, are developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.
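In the spirit of the book's error-correction chapters, a self-contained (7,4) Hamming-code sketch (a standard systematic construction, not necessarily the book's exact matrices): 4 data bits are encoded into 7, and any single flipped bit is located by the syndrome.

```python
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],    # generator matrix (systematic form)
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],    # parity-check matrix: H @ G.T = 0 mod 2
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(bits):
    return (bits @ G) % 2

def correct(word):
    syndrome = (H @ word) % 2
    if syndrome.any():           # a nonzero syndrome equals the column of H
        word = word.copy()       # indexing the flipped position
        col = np.where((H.T == syndrome).all(axis=1))[0][0]
        word[col] ^= 1
    return word

msg = np.array([1, 0, 1, 1])
sent = encode(msg)
received = sent.copy()
received[2] ^= 1                 # one-bit channel error
print((correct(received) == sent).all())   # True
```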
Qualitative Data Analysis for Health Services Research: Developing Taxonomy, Themes, and Theory
Bradley, Elizabeth H; Curry, Leslie A; Devers, Kelly J
2007-01-01
Objective: To provide practical strategies for conducting and evaluating analyses of qualitative data applicable for health services researchers. Data Sources and Design: We draw on extant qualitative methodological literature to describe practical approaches to qualitative data analysis. Approaches to data analysis vary by discipline and analytic tradition; however, we focus on qualitative data analysis that has as a goal the generation of taxonomy, themes, and theory germane to health services research. Principal Findings: We describe an approach to qualitative data analysis that applies the principles of inductive reasoning while also employing predetermined code types to guide data analysis and interpretation. These code types (conceptual, relationship, perspective, participant characteristics, and setting codes) define a structure that is appropriate for generation of taxonomy, themes, and theory. Conceptual codes and subcodes facilitate the development of taxonomies. Relationship and perspective codes facilitate the development of themes and theory. Intersectional analyses with data coded for participant characteristics and setting codes can facilitate comparative analyses. Conclusions: Qualitative inquiry can improve the description and explanation of complex, real-world phenomena pertinent to health services research. Greater understanding of the processes of qualitative data analysis can be helpful for health services researchers as they use these methods themselves or collaborate with qualitative researchers from a wide range of disciplines. PMID:17286625
Paivio, Allan; Sadoski, Mark
2011-01-01
Elman (2009) proposed that the traditional role of the mental lexicon in language processing can largely be replaced by a theoretical model of schematic event knowledge founded on dynamic context-dependent variables. We evaluate Elman's approach and propose an alternative view based on dual coding theory and on evidence that modality-specific cognitive representations contribute strongly to word meaning and language performance across diverse contexts, which also show effects predictable from dual coding theory. Copyright © 2010 Cognitive Science Society, Inc.
A Theoretical Analysis of Learning with Graphics--Implications for Computer Graphics Design.
ERIC Educational Resources Information Center
ChanLin, Lih-Juan
This paper reviews the literature pertinent to learning with graphics. Dual coding theory provides an explanation of how graphics are stored and processed in semantic memory. The level of processing theory suggests how graphics can be employed in learning to encourage deeper processing. In addition to dual coding theory and level of processing…
A new theory of development: the generation of complexity in ontogenesis.
Barbieri, Marcello
2016-03-13
Today there is a very wide consensus on the idea that embryonic development is the result of a genetic programme and of epigenetic processes. Many models have been proposed in this theoretical framework to account for the various aspects of development, and virtually all of them have one thing in common: they do not acknowledge the presence of organic codes (codes between organic molecules) in ontogenesis. Here it is argued instead that embryonic development is a convergent increase in complexity that necessarily requires organic codes and organic memories, and a few examples of such codes are described. This is the code theory of development, a theory originally inspired by an algorithm that is capable of reconstructing structures from incomplete information; the algorithm is briefly summarized here because it makes intuitive how a convergent increase in complexity can be achieved. The main thesis of the new theory is that the presence of organic codes in ontogenesis is not only a theoretical necessity but, first and foremost, an idea that can be tested and that has already been found to be in agreement with the evidence. © 2016 The Author(s).
ERIC Educational Resources Information Center
Green, Crystal D.
2010-01-01
This action research study investigated the perceptions that student participants had on the development of a career exploration model and a career exploration project. The Holland code theory was the primary assessment used for this research study, in addition to the Multiple Intelligences theory and the identification of a role model for the…
Using grounded theory to create a substantive theory of promoting schoolchildren's mental health.
Puolakka, Kristiina; Haapasalo-Pesu, Kirsi-Maria; Kiikkala, Irma; Astedt-Kurki, Päivi; Paavilainen, Eija
2013-01-01
To discuss the creation of a substantive theory using grounded theory. This article provides an example of generating theory from a study of mental health promotion at a high school in Finland. Grounded theory is a method for creating explanatory theory. It is a valuable tool for health professionals when studying phenomena that affect patients' health, offering a deeper understanding of nursing methods and knowledge. Interviews with school employees, students and parents, and verbal responses to the 'school wellbeing profile survey', as well as working group memos related to the development activities. Participating children were aged between 12 and 15. The analysis was conducted by applying the grounded theory method and involved open coding of the material, constant comparison, axial coding and selective coding after identifying the core category. The analysis produced concepts about mental health promotion in school and assumptions about relationships. Grounded theory proved to be an effective means of eliciting people's viewpoints on mental health promotion. The personal views of different parties make it easier to identify an action applicable to practice.
Elder, D
1984-06-07
The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code; the combinatorial to coding the identity of units, the non-combinatorial to coding the production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward.
Di Giulio, Massimo
2016-06-21
I analyze the mechanism on which the majority of theories that place the physico-chemical properties of amino acids at the center of the origin of the genetic code are based. As this mechanism requires an excessive number of mutational steps, I conclude that it could not have been operative, or, if operative, that it would not have allowed the predictions of these theories to be fully realized, because the mechanism evidently contained a high degree of indeterminacy. I do so by arguing against the four-column theory of the origin of the genetic code (Higgs, 2009) and by replying to the criticism that was directed towards the coevolution theory of the origin of the genetic code. In this context, I suggest a new hypothesis that clarifies the mechanism by which the codon domains of the precursor amino acids would have evolved, as predicted by the coevolution theory. This mechanism would have used particular elongation factors that constrained the evolution of all amino acids belonging to a given biosynthetic family to the progenitor pre-tRNA that first recognized the first codons evolved in the codon domain of a determined precursor amino acid. This happened because the elongation factors recognized two characteristics of the progenitor pre-tRNAs of precursor amino acids, which prevented the elongation factors from recognizing the pre-tRNAs belonging to biosynthetic families of different precursor amino acids. Finally, I analyze, by means of Fisher's exact test, the distribution within the genetic code of the biosynthetic classes of amino acids and of the classes of polarity values of amino acids. This analysis would seem to support the biosynthetic classes of amino acids over the polarity classes as the main factor that led to the structuring of the genetic code, with the physico-chemical properties of amino acids playing only a subsidiary role in this evolution. As a whole, the full analysis leads to the conclusion that the coevolution theory of the origin of the genetic code is a highly corroborated theory. Copyright © 2016 Elsevier Ltd. All rights reserved.
Applications of potential theory computations to transonic aeroelasticity
NASA Technical Reports Server (NTRS)
Edwards, J. W.
1986-01-01
Unsteady aerodynamic and aeroelastic stability calculations based upon transonic small disturbance (TSD) potential theory are presented. Results from the two-dimensional XTRAN2L code and the three-dimensional XTRAN3S code are compared with experiment to demonstrate the ability of TSD codes to treat transonic effects. The necessity of nonisentropic corrections to transonic potential theory is demonstrated. Dynamic computational effects resulting from the choice of grid and boundary conditions are illustrated. Unsteady airloads for a number of parameter variations including airfoil shape and thickness, Mach number, frequency, and amplitude are given. Finally, samples of transonic aeroelastic calculations are given. A key observation is the extent to which unsteady transonic airloads calculated by inviscid potential theory may be treated in a locally linear manner.
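For reference, the unsteady transonic small disturbance equation solved by codes of this family can be written schematically as follows (the coefficients and the retained nonlinear terms vary between implementations such as XTRAN2L and XTRAN3S, so this form is indicative only):

$$ A\,\phi_{tt} + B\,\phi_{xt} = \Big[(1 - M_\infty^2)\,\phi_x - \tfrac{\gamma+1}{2} M_\infty^2\,\phi_x^2\Big]_x + \big[\phi_y\big]_y + \big[\phi_z\big]_z , $$

where $\phi$ is the disturbance velocity potential, $M_\infty$ the free-stream Mach number, and $A$ and $B$ depend on $M_\infty$ and the reduced frequency.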
The analysis of convolutional codes via the extended Smith algorithm
NASA Technical Reports Server (NTRS)
Mceliece, R. J.; Onyszchuk, I.
1993-01-01
Convolutional codes have been the central part of most error-control systems in deep-space communication for many years. Almost all such applications, however, have used the restricted class of (n,1), also known as 'rate 1/n,' convolutional codes. The more general class of (n,k) convolutional codes contains many potentially useful codes, but their algebraic theory is difficult and has proved to be a stumbling block in the evolution of convolutional coding systems. In this article, the situation is improved by describing a set of practical algorithms for computing certain basic things about a convolutional code (among them the degree, the Forney indices, a minimal generator matrix, and a parity-check matrix), which are usually needed before a system using the code can be built. The approach is based on the classic Forney theory for convolutional codes, together with the extended Smith algorithm for polynomial matrices, which is introduced in this article.
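As a small concrete instance of such polynomial-matrix computations, one can check a parity-check matrix for a (2,1) code symbolically over GF(2); the generator below is the classic [1 + D + D^2, 1 + D^2] (my choice of example, not the article's):

```python
from sympy import symbols, Matrix, Poly, GF

D = symbols("D")
G = Matrix([[1 + D + D**2, 1 + D**2]])    # generator matrix G(D)
H = Matrix([[1 + D**2, 1 + D + D**2]])    # candidate parity-check matrix H(D)

# Over GF(2), G(D) H(D)^T must reduce to the zero polynomial.
check = Poly((G * H.T)[0, 0], D, domain=GF(2))
print(check)   # Poly(0, D, modulus=2) -> H(D) is a valid parity check
```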
From Theory to Practice: Measuring end-of-life communication quality using multiple goals theory.
Van Scoy, L J; Scott, A M; Reading, J M; Chuang, C H; Chinchilli, V M; Levi, B H; Green, M J
2017-05-01
To describe how multiple goals theory can be used as a reliable and valid measure (i.e., coding scheme) of the quality of conversations about end-of-life issues. We analyzed data from 17 conversations in which 68 participants (mean age = 51 years) played a game that prompted discussion in response to open-ended questions about end-of-life issues. Conversations (mean duration = 91 min) were audio-recorded and transcribed. Communication quality was assessed by three coders who assigned numeric scores rating how well individuals accomplished task, relational, and identity goals in the conversation. The coding measure, which results in a quantifiable outcome, yielded strong reliability (intra-class correlation range = 0.73-0.89 and Cronbach's alpha range = 0.69-0.89 for each of the coded domains) and validity (using multilevel nonlinear modeling, we detected significant variability in scores between games for each of the coded domains, all p-values <0.02). Our coding scheme provides a theory-based measure of end-of-life conversation quality that is superior to other methods of measuring communication quality. Our description of the coding method enables researchers to adapt and apply this measure to communication interventions in other clinical contexts. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
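The intra-class correlations and Cronbach's alpha reported above are standard reliability statistics; the sketch below computes alpha for a conversations-by-coders score matrix using the textbook formula. The ratings and the helper name cronbach_alpha are invented for illustration, and NumPy is an assumed dependency.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_conversations x n_coders) score matrix."""
    k = scores.shape[1]                       # number of coders (items)
    item_vars = scores.var(axis=0, ddof=1)    # variance of each coder's ratings
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative ratings from three coders on five conversations (not study data).
ratings = np.array([
    [4.0, 4.5, 4.0],
    [3.0, 3.5, 3.0],
    [5.0, 4.5, 5.0],
    [2.0, 2.5, 2.0],
    [4.0, 4.0, 4.5],
])
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```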
A Structural Theory for the Perception of Morse Code Signals and Related Rhythmic Patterns.
ERIC Educational Resources Information Center
Wish, Myron
The primary purpose of this dissertation is to develop a structural theory, along facet-theoretic lines, for the perception of Morse code signals and related rhythmic patterns. As steps in the development of this theory, models for two sets of signals are proposed and tested. The first model is for a set comprised of all signals of the…
NASA Technical Reports Server (NTRS)
Silva, Walter A.
1993-01-01
The presentation begins with a brief description of the motivation and approach that has been taken for this research. This will be followed by a description of the Volterra Theory of Nonlinear Systems and the CAP-TSD code which is an aeroelastic, transonic CFD (Computational Fluid Dynamics) code. The application of the Volterra theory to a CFD model and, more specifically, to a CAP-TSD model of a rectangular wing with a NACA 0012 airfoil section will be presented.
With or without you: predictive coding and Bayesian inference in the brain
Aitchison, Laurence; Lengyel, Máté
2018-01-01
Two theoretical ideas have emerged recently with the ambition to provide a unifying functional explanation of neural population coding and dynamics: predictive coding and Bayesian inference. Here, we describe the two theories and their combination into a single framework: Bayesian predictive coding. We clarify how the two theories can be distinguished, despite sharing core computational concepts and addressing an overlapping set of empirical phenomena. We argue that predictive coding is an algorithmic / representational motif that can serve several different computational goals of which Bayesian inference is but one. Conversely, while Bayesian inference can utilize predictive coding, it can also be realized by a variety of other representations. We critically evaluate the experimental evidence supporting Bayesian predictive coding and discuss how to test it more directly. PMID:28942084
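A toy rendering of the "Bayesian predictive coding" motif described above, with invented numbers: a single latent estimate is refined by gradient descent on precision-weighted prediction errors and converges to the Bayesian posterior mean combining prior and observation.

```python
# Predictive-coding update for one latent variable mu: descend on the
# precision-weighted prediction errors from the observation and the prior.
x, var_x = 2.0, 1.0          # observation and its noise variance
mu_p, var_p = 0.0, 4.0       # prior mean and variance
mu = mu_p                    # initial estimate

for _ in range(200):
    err_obs = (x - mu) / var_x      # sensory prediction error
    err_pri = (mu - mu_p) / var_p   # prior prediction error
    mu += 0.05 * (err_obs - err_pri)

# The fixed point is exactly the Bayesian posterior mean.
posterior_mean = (x / var_x + mu_p / var_p) / (1 / var_x + 1 / var_p)
print(round(mu, 3), round(posterior_mean, 3))   # both ~1.6
```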
Theory of Mind: A Neural Prediction Problem
Koster-Hale, Jorie; Saxe, Rebecca
2014-01-01
Predictive coding posits that neural systems make forward-looking predictions about incoming information. Neural signals contain information not about the currently perceived stimulus, but about the difference between the observed and the predicted stimulus. We propose to extend the predictive coding framework from high-level sensory processing to the more abstract domain of theory of mind; that is, to inferences about others’ goals, thoughts, and personalities. We review evidence that, across brain regions, neural responses to depictions of human behavior, from biological motion to trait descriptions, exhibit a key signature of predictive coding: reduced activity to predictable stimuli. We discuss how future experiments could distinguish predictive coding from alternative explanations of this response profile. This framework may provide an important new window on the neural computations underlying theory of mind. PMID:24012000
Code Switching and Sexual Orientation: A Test of Bernstein's Sociolinguistic Theory
ERIC Educational Resources Information Center
Lumby, Malcolm E.
1976-01-01
Bernstein's theory was tested in the homosexual's "closed" community to determine code-switching ability and its relationship to jargon. Subjects told a story based on homoerotic photographs where knowledge of sexual orientation was varied. Results suggest that homosexual homophyly encouraged elaboration. (Author)
2011-01-01
Background Electronic patient records are generally coded using extensive sets of codes but the significance of the utilisation of individual codes may be unclear. Item response theory (IRT) models are used to characterise the psychometric properties of items included in tests and questionnaires. This study asked whether the properties of medical codes in electronic patient records may be characterised through the application of item response theory models. Methods Data were provided by a cohort of 47,845 participants from 414 family practices in the UK General Practice Research Database (GPRD) with a first stroke between 1997 and 2006. Each eligible stroke code, out of a set of 202 OXMIS and Read codes, was coded as either recorded or not recorded for each participant. A two parameter IRT model was fitted using marginal maximum likelihood estimation. Estimated parameters from the model were considered to characterise each code with respect to the latent trait of stroke diagnosis. The location parameter is referred to as a calibration parameter, while the slope parameter is referred to as a discrimination parameter. Results There were 79,874 stroke code occurrences available for analysis. Utilisation of codes varied between family practices with intraclass correlation coefficients of up to 0.25 for the most frequently used codes. IRT analyses were restricted to 110 Read codes. Calibration and discrimination parameters were estimated for 77 (70%) codes that were endorsed for 1,942 stroke patients. Parameters were not estimated for the remaining more frequently used codes. Discrimination parameter values ranged from 0.67 to 2.78, while calibration parameter values ranged from 4.47 to 11.58. The two parameter model gave a better fit to the data than either the one- or three-parameter models. However, high chi-square values for about a fifth of the stroke codes were suggestive of poor item fit. Conclusion The application of item response theory models to coded electronic patient records might potentially contribute to identifying medical codes that offer poor discrimination or low calibration. This might indicate the need for improved coding sets or a requirement for improved clinical coding practice. However, in this study estimates were only obtained for a small proportion of participants and there was some evidence of poor model fit. There was also evidence of variation in the utilisation of codes between family practices raising the possibility that, in practice, properties of codes may vary for different coders. PMID:22176509
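The two-parameter model referred to above is the standard 2PL item response function; a minimal sketch follows, assuming NumPy, with discrimination and calibration values chosen for illustration rather than taken from the fitted results.

```python
import numpy as np

def two_pl(theta, a, b):
    """2PL item response function: probability that a code is recorded for
    a patient with latent trait theta, given discrimination a and
    calibration (location) b."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Illustrative parameters in the ranges reported above (not fitted values).
theta = np.linspace(-3, 12, 5)      # latent 'stroke diagnosis' trait
print(two_pl(theta, a=1.5, b=6.0))  # endorsement probabilities
```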
Topics in the Sequential Design of Experiments
1992-03-01
[Garbled report-documentation-page text removed. Recoverable subject terms: Design of Experiments, Renewal Theory, Sequential Testing, Limit Theory, Local… Cited works include "…distributions for one parameter exponential families," by Michael Woodroofe, 2 (1991), 91-112, and "A non linear renewal theory for a functional of…"]
NASA Technical Reports Server (NTRS)
Hartle, M.; McKnight, R. L.
2000-01-01
This manual is a combination of a user manual, theory manual, and programmer manual. The reader is assumed to have some previous exposure to the finite element method. This manual is written with the idea that the CSTEM (Coupled Structural Thermal Electromagnetic-Computer Code) user needs to have a basic understanding of what the code is actually doing in order to properly use the code. For that reason, the underlying theory and methods used in the code are described to a basic level of detail. The manual gives an overview of the CSTEM code: how the code came into existence, a basic description of what the code does, and the order in which it happens (a flowchart). Appendices provide a listing and very brief description of every file used by the CSTEM code, including the type of file it is, what routine regularly accesses the file, and what routine opens the file, as well as special features included in CSTEM.
NASA Technical Reports Server (NTRS)
Chandrasekaran, B.
1986-01-01
This document is the user's guide for the method developed earlier for predicting the slipstream/wing interaction at subsonic speeds. The analysis involves a subsonic panel code (HESS code) modified to handle the propeller onset flow. The propfan slipstream effects are superimposed on the normal flow boundary condition and are applied over the surface washed by the slipstream. The effects of the propeller slipstream are to increase the axial induced velocity and tangential velocity, and to produce a total pressure rise in the wake of the propeller. Principles based on blade performance theory, momentum theory, and vortex theory were used to evaluate the slipstream effects. The code can be applied to any arbitrary three dimensional geometry expressed in the form of HESS input format. The code can handle a propeller-alone configuration or a propeller/nacelle/airframe configuration, operating up to high subcritical Mach numbers over a range of angles of attack. Inclusion of viscous modelling is briefly outlined. Wind tunnel results/theory comparisons are included as examples of the application of the code to a generic supercritical wing/overwing nacelle with a powered propfan. A sample input/output listing is provided.
Bernstein's "Codes" and the Linguistics of "Deficit"
ERIC Educational Resources Information Center
Jones, Peter E.
2013-01-01
This paper examines the key linguistic arguments underpinning Basil Bernstein's theory of "elaborated" and "restricted" "codes". Building on a review of selected highlights from the collective critical response to Bernstein, the paper attempts to clarify the relationship of the theory to "deficit" views…
Great Expectations: Is there Evidence for Predictive Coding in Auditory Cortex?
Heilbron, Micha; Chait, Maria
2017-08-04
Predictive coding is possibly one of the most influential, comprehensive, and controversial theories of neural function. While proponents praise its explanatory potential, critics object that key tenets of the theory are untested or even untestable. The present article critically examines existing evidence for predictive coding in the auditory modality. Specifically, we identify five key assumptions of the theory and evaluate each in the light of animal, human and modeling studies of auditory pattern processing. For the first two assumptions - that neural responses are shaped by expectations and that these expectations are hierarchically organized - animal and human studies provide compelling evidence. The anticipatory, predictive nature of these expectations also enjoys empirical support, especially from studies on unexpected stimulus omission. However, for the existence of separate error and prediction neurons, a key assumption of the theory, evidence is lacking. More work exists on the proposed oscillatory signatures of predictive coding, and on the relation between attention and precision. However, results on these latter two assumptions are mixed or contradictory. Looking to the future, more collaboration between human and animal studies, aided by model-based analyses will be needed to test specific assumptions and implementations of predictive coding - and, as such, help determine whether this popular grand theory can fulfill its expectations. Copyright © 2017 The Author(s). Published by Elsevier Ltd.. All rights reserved.
Self-complementary circular codes in coding theory.
Fimmel, Elena; Michel, Christian J; Starman, Martin; Strüngmann, Lutz
2018-04-01
Self-complementary circular codes are involved in pairing genetic processes. A maximal C3 self-complementary circular code X of trinucleotides was identified in genes of bacteria, archaea, eukaryotes, plasmids and viruses (Michel in Life 7(20):1-16, 2017; J Theor Biol 380:156-177, 2015; Arquès and Michel in J Theor Biol 182:45-58, 1996). In this paper, self-complementary circular codes are investigated using the graph theory approach recently formulated in Fimmel et al. (Philos Trans R Soc A 374:20150058, 2016). A directed graph G(X) associated with any code X mirrors the properties of the code. In the present paper, we demonstrate a necessary condition for the self-complementarity of an arbitrary code X in terms of the graph theory. The same condition has been proven to be sufficient for codes which are circular and of large size ([Formula: see text] trinucleotides), in particular for maximal circular codes (20 trinucleotides). For codes of small size ([Formula: see text] trinucleotides), some very rare counterexamples have been constructed. Furthermore, the length and the structure of the longest paths in the graphs associated with the self-complementary circular codes are investigated. It has been proven that the longest paths in such graphs determine the reading frame for the self-complementary circular codes. By applying this result, the reading frame in any arbitrary sequence of trinucleotides is retrieved after at most 15 nucleotides, i.e., 5 consecutive trinucleotides, from the circular code X identified in genes. Thus, an X motif of a length of at least 15 nucleotides in an arbitrary sequence of trinucleotides (not necessarily all of them belonging to X) uniquely defines the reading (correct) frame, an important criterion for analyzing the X motifs in genes in the future.
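The graph construction can be sketched in a few lines. Following the cited Fimmel et al. (2016) formulation as I understand it, each trinucleotide b1b2b3 in X contributes the edges b1 -> b2b3 and b1b2 -> b3, and circularity corresponds to the associated graph being acyclic. The snippet assumes networkx and uses a tiny invented code, not the maximal code identified in genes.

```python
# Build the directed graph G(X) associated with a trinucleotide code X
# and test the acyclicity criterion for circularity.
import networkx as nx

def code_graph(X):
    G = nx.DiGraph()
    for codon in X:
        G.add_edge(codon[0], codon[1:])   # b1 -> b2b3
        G.add_edge(codon[:2], codon[2])   # b1b2 -> b3
    return G

X = {"AAC", "GCG", "TTC"}                 # illustrative trinucleotide code
print(nx.is_directed_acyclic_graph(code_graph(X)))  # True => circular
```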
The trellis complexity of convolutional codes
NASA Technical Reports Server (NTRS)
Mceliece, R. J.; Lin, W.
1995-01-01
It has long been known that convolutional codes have a natural, regular trellis structure that facilitates the implementation of Viterbi's algorithm. It has gradually become apparent that linear block codes also have a natural, though not in general a regular, 'minimal' trellis structure, which allows them to be decoded with a Viterbi-like algorithm. In both cases, the complexity of the Viterbi decoding algorithm can be accurately estimated by the number of trellis edges per encoded bit. It would, therefore, appear that we are in a good position to make a fair comparison of the Viterbi decoding complexity of block and convolutional codes. Unfortunately, however, this comparison is somewhat muddled by the fact that some convolutional codes, the punctured convolutional codes, are known to have trellis representations that are significantly less complex than the conventional trellis. In other words, the conventional trellis representation for a convolutional code may not be the minimal trellis representation. Thus, ironically, at present we seem to know more about the minimal trellis representation for block than for convolutional codes. In this article, we provide a remedy, by developing a theory of minimal trellises for convolutional codes. (A similar theory has recently been given by Sidorenko and Zyablov). This allows us to make a direct performance-complexity comparison for block and convolutional codes. A by-product of our work is an algorithm for choosing, from among all generator matrices for a given convolutional code, what we call a trellis-minimal generator matrix, from which the minimal trellis for the code can be directly constructed. Another by-product is that, in the new theory, punctured convolutional codes no longer appear as a special class, but simply as high-rate convolutional codes whose trellis complexity is unexpectedly small.
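For readers unfamiliar with trellis decoding, here is a toy hard-decision Viterbi decoder for the standard rate-1/2, memory-2 convolutional code with octal generators (7, 5); its regular trellis has 4 states and 8 edges per encoded bit, the complexity measure discussed above. This is a pedagogical sketch, not the article's minimal-trellis algorithms.

```python
# Rate-1/2, memory-2 convolutional code with generators (7, 5) octal.
G = [0b111, 0b101]  # generator taps

def encode(bits):
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state                       # [input, s1, s0]
        out.append([bin(reg & g).count("1") & 1 for g in G])
        state = reg >> 1                             # shift register
    return out

def viterbi(received):
    INF = float("inf")
    metric = [0] + [INF] * 3                         # path metrics, 4 states
    paths = [[] for _ in range(4)]
    for r in received:
        new_metric, new_paths = [INF] * 4, [None] * 4
        for s in range(4):
            if metric[s] == INF:
                continue
            for b in (0, 1):                         # 2 edges per state
                reg = (b << 2) | s
                ns = reg >> 1
                sym = [bin(reg & g).count("1") & 1 for g in G]
                m = metric[s] + sum(x != y for x, y in zip(sym, r))
                if m < new_metric[ns]:               # keep the survivor
                    new_metric[ns], new_paths[ns] = m, paths[s] + [b]
        metric, paths = new_metric, new_paths
    return paths[metric.index(min(metric))]

msg = [1, 0, 1, 1, 0, 0]                             # includes 2 tail zeros
rx = encode(msg)
rx[2] = [1 - rx[2][0], rx[2][1]]                     # inject one channel error
print(viterbi(rx) == msg)                            # True: error corrected
```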
An object oriented code for simulating supersymmetric Yang-Mills theories
NASA Astrophysics Data System (ADS)
Catterall, Simon; Joseph, Anosh
2012-06-01
We present SUSY_LATTICE, a C++ program that can be used to simulate certain classes of supersymmetric Yang-Mills (SYM) theories, including the well known N=4 SYM in four dimensions, on a flat Euclidean space-time lattice. Discretization of SYM theories is an old problem in lattice field theory. It has resisted solution until recently when new ideas drawn from orbifold constructions and topological field theories have been brought to bear on the question. The result has been the creation of a new class of lattice gauge theories in which the lattice action is invariant under one or more supersymmetries. The resultant theories are local, free of doublers and also possess exact gauge-invariance. In principle they form the basis for a truly non-perturbative definition of the continuum SYM theories. In the continuum limit they reproduce versions of the SYM theories formulated in terms of twisted fields, which on a flat space-time is just a change of the field variables. In this paper, we briefly review these ideas and then go on to provide the details of the C++ code. We sketch the design of the code, with particular emphasis being placed on SYM theories with N=(2,2) in two dimensions and N=4 in three and four dimensions, making one-to-one comparisons between the essential components of the SYM theories and their corresponding counterparts appearing in the simulation code. The code may be used to compute several quantities associated with the SYM theories such as the Polyakov loop, mean energy, and the width of the scalar eigenvalue distributions.
Program summary:
Program title: SUSY_LATTICE
Catalogue identifier: AELS_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AELS_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 9315
No. of bytes in distributed program, including test data, etc.: 95 371
Distribution format: tar.gz
Programming language: C++
Computer: PCs and Workstations
Operating system: Any, tested on Linux machines
Classification: 11.6
Nature of problem: To compute some of the observables of supersymmetric Yang-Mills theories such as supersymmetric action, Polyakov/Wilson loops, scalar eigenvalues and Pfaffian phases.
Solution method: We use the Rational Hybrid Monte Carlo algorithm followed by a Leapfrog evolution and a Metropolis test. The input parameters of the model are read in from a parameter file.
Restrictions: This code applies only to supersymmetric gauge theories with extended supersymmetry, which undergo the process of maximal twisting. (See Section 2 of the manuscript for details.)
Running time: From a few minutes to several hours depending on the amount of statistics needed.
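The solution method named in the summary (molecular-dynamics evolution with a leapfrog integrator followed by a Metropolis test) can be illustrated on a toy single-site action; the sketch below is pedagogical only, since the actual RHMC for lattice SYM involves pseudofermions and rational approximations. NumPy is assumed.

```python
# Hybrid Monte Carlo step for the toy action S(phi) = phi^2 / 2:
# refresh momentum, integrate Hamilton's equations with leapfrog,
# then accept/reject with a Metropolis test.
import numpy as np

rng = np.random.default_rng(0)
S = lambda phi: 0.5 * phi**2            # action
dS = lambda phi: phi                    # its derivative (the "force")

def hmc_step(phi, n_steps=10, dt=0.1):
    p = rng.normal()                    # refresh conjugate momentum
    H_old = 0.5 * p**2 + S(phi)
    phi_new = phi
    p -= 0.5 * dt * dS(phi_new)         # leapfrog: half kick,
    for _ in range(n_steps - 1):
        phi_new += dt * p               # ... alternating drifts and kicks ...
        p -= dt * dS(phi_new)
    phi_new += dt * p
    p -= 0.5 * dt * dS(phi_new)         # final half kick
    H_new = 0.5 * p**2 + S(phi_new)
    if rng.random() < np.exp(H_old - H_new):   # Metropolis test
        return phi_new
    return phi

phi, samples = 0.0, []
for _ in range(5000):
    phi = hmc_step(phi)
    samples.append(phi)
print(f"<phi^2> = {np.mean(np.square(samples)):.3f}  (exact: 1.0)")
```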
Revisiting Code-Switching Practice in TESOL: A Critical Perspective
ERIC Educational Resources Information Center
Wang, Hao; Mansouri, Behzad
2017-01-01
In academic circles, the "English Only" view and "Balanced view" have established their grounds after volumes of work on the topic of code-switching in TESOL. With recent development in Critical Applied Linguistics, poststructural theory, postmodern theory, and the emergence of multilingualism, scholars have begun to view ELT…
Personalisation: The Emerging "Revised" Code of Education?
ERIC Educational Resources Information Center
Hartley, David
2007-01-01
In England, a "revised" educational code appears to be emerging. It centres upon the concept of "personalisation". Its basis is less in educational theory, more in contemporary marketing theory. Personalisation can be regarded in two ways. First, it provides the rationale for a new mode of public-service delivery, one which…
Secondary School Students' Reasoning about Evolution
ERIC Educational Resources Information Center
To, Cheryl; Tenenbaum, Harriet R.; Hogh, Henriette
2017-01-01
This study examined age differences in young people's understanding of evolution theory in secondary school. A second aim of this study was to propose a new coding scheme that more accurately described students' conceptual understanding about evolutionary theory. We argue that coding schemes adopted in previous research may have overestimated…
A Dual Coding View of Vocabulary Learning
ERIC Educational Resources Information Center
Sadoski, Mark
2005-01-01
A theoretical perspective on acquiring sight vocabulary and developing meaningful vocabulary is presented. Dual Coding Theory assumes that cognition occurs in two independent but connected codes: a verbal code for language and a nonverbal code for mental imagery. The mixed research literature on using pictures in teaching sight vocabulary is…
Advanced Small Perturbation Potential Flow Theory for Unsteady Aerodynamic and Aeroelastic Analyses
NASA Technical Reports Server (NTRS)
Batina, John T.
2005-01-01
An advanced small perturbation (ASP) potential flow theory has been developed to improve upon the classical transonic small perturbation (TSP) theories that have been used in various computer codes. These computer codes are typically used for unsteady aerodynamic and aeroelastic analyses in the nonlinear transonic flight regime. The codes exploit the simplicity of stationary Cartesian meshes with the movement or deformation of the configuration under consideration incorporated into the solution algorithm through a planar surface boundary condition. The new ASP theory was developed methodically by first determining the essential elements required to produce full-potential-like solutions with a small perturbation approach on the requisite Cartesian grid. This level of accuracy required a higher-order streamwise mass flux and a mass conserving surface boundary condition. The ASP theory was further developed by determining the essential elements required to produce results that agreed well with Euler solutions. This level of accuracy required mass conserving entropy and vorticity effects, and second-order terms in the trailing wake boundary condition. Finally, an integral boundary layer procedure, applicable to both attached and shock-induced separated flows, was incorporated for viscous effects. The resulting ASP potential flow theory, including entropy, vorticity, and viscous effects, is shown to be mathematically more appropriate and computationally more accurate than the classical TSP theories. The formulaic details of the ASP theory are described fully and the improvements are demonstrated through careful comparisons with accepted alternative results and experimental data. The new theory has been used as the basis for a new computer code called ASP3D (Advanced Small Perturbation - 3D), which also is briefly described with representative results.
ERIC Educational Resources Information Center
Mayer, Richard E.; Sims, Valerie K.
1994-01-01
In 2 experiments, 162 high- and low-spatial ability students viewed a computer-generated animation and heard a concurrent or successive explanation. The concurrent group generated more creative solutions to transfer problems and demonstrated a contiguity effect consistent with dual-coding theory. (SLD)
English-Afrikaans Intrasentential Code Switching: Testing a Feature Checking Account
ERIC Educational Resources Information Center
van Dulm, Ondene
2009-01-01
The work presented here aims to account for the structure of intrasentential code switching between English and Afrikaans within the framework of feature checking theory, a theory associated with minimalist syntax. Six constructions in which verb position differs between English and Afrikaans were analysed in terms of differences in the strength…
JDFTx: Software for joint density-functional theory
Sundararaman, Ravishankar; Letchworth-Weaver, Kendra; Schwarz, Kathleen A.; ...
2017-11-14
Density-functional theory (DFT) has revolutionized computational prediction of atomic-scale properties from first principles in physics, chemistry and materials science. Continuing development of new methods is necessary for accurate predictions of new classes of materials and properties, and for connecting to nano- and mesoscale properties using coarse-grained theories. JDFTx is a fully-featured open-source electronic DFT software designed specifically to facilitate rapid development of new theories, models and algorithms. Using an algebraic formulation as an abstraction layer, compact C++11 code automatically performs well on diverse hardware including GPUs (Graphics Processing Units). This code hosts the development of joint density-functional theory (JDFT) that combines electronic DFT with classical DFT and continuum models of liquids for first-principles calculations of solvated and electrochemical systems. In addition, the modular nature of the code makes it easy to extend and interface with, facilitating the development of multi-scale toolkits that connect to ab initio calculations, e.g. photo-excited carrier dynamics combining electron and phonon calculations with electromagnetic simulations.
Surveying multidisciplinary aspects in real-time distributed coding for Wireless Sensor Networks.
Braccini, Carlo; Davoli, Franco; Marchese, Mario; Mongelli, Maurizio
2015-01-27
Wireless Sensor Networks (WSNs), where a multiplicity of sensors observe a physical phenomenon and transmit their measurements to one or more sinks, pertain to the class of multi-terminal source and channel coding problems of Information Theory. In this category, "real-time" coding is often encountered for WSNs, referring to the problem of finding the minimum distortion (according to a given measure), under transmission power constraints, attainable by encoding and decoding functions, with stringent limits on delay and complexity. On the other hand, the Decision Theory approach seeks to determine the optimal coding/decoding strategies or some of their structural properties. Since encoder(s) and decoder(s) possess different information, though sharing a common goal, the setting here is that of Team Decision Theory. A more pragmatic vision rooted in Signal Processing consists of fixing the form of the coding strategies (e.g., to linear functions) and, consequently, finding the corresponding optimal decoding strategies and the achievable distortion, generally by applying parametric optimization techniques. All approaches have a long history of past investigations and recent results. The goal of the present paper is to provide the taxonomy of the various formulations, a survey of the vast related literature, examples from the authors' own research, and some highlights on the inter-play of the different theories.
Quantum error-correcting codes from algebraic geometry codes of Castle type
NASA Astrophysics Data System (ADS)
Munuera, Carlos; Tenório, Wanderson; Torres, Fernando
2016-10-01
We study algebraic geometry codes producing quantum error-correcting codes by the CSS construction. We pay particular attention to the family of Castle codes. We show that many of the examples known in the literature in fact belong to this family of codes. We systematize these constructions by showing the common theory that underlies all of them.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Takeoka, Masahiro; Fujiwara, Mikio; Mizuno, Jun
2004-05-01
Quantum-information theory predicts that when the transmission resource is doubled in quantum channels, the amount of information transmitted can be increased more than twice by quantum-channel coding technique, whereas the increase is at most twice in classical information theory. This remarkable feature, the superadditive quantum-coding gain, can be implemented by appropriate choices of code words and corresponding quantum decoding which requires a collective quantum measurement. Recently, an experimental demonstration was reported [M. Fujiwara et al., Phys. Rev. Lett. 90, 167906 (2003)]. The purpose of this paper is to describe our experiment in detail. Particularly, a design strategy of quantum-collective decoding in physical quantum circuits is emphasized. We also address the practical implication of the gain on communication performance by introducing the quantum-classical hybrid coding scheme. We show how the superadditive quantum-coding gain, even in a small code length, can boost the communication performance of conventional coding techniques.
Parallel software for lattice N = 4 supersymmetric Yang-Mills theory
NASA Astrophysics Data System (ADS)
Schaich, David; DeGrand, Thomas
2015-05-01
We present new parallel software, SUSY LATTICE, for lattice studies of four-dimensional N = 4 supersymmetric Yang-Mills theory with gauge group SU(N). The lattice action is constructed to exactly preserve a single supersymmetry charge at non-zero lattice spacing, up to additional potential terms included to stabilize numerical simulations. The software evolved from the MILC code for lattice QCD, and retains a similar large-scale framework despite the different target theory. Many routines are adapted from an existing serial code (Catterall and Joseph, 2012), which SUSY LATTICE supersedes. This paper provides an overview of the new parallel software, summarizing the lattice system, describing the applications that are currently provided and explaining their basic workflow for non-experts in lattice gauge theory. We discuss the parallel performance of the code, and highlight some notable aspects of the documentation for those interested in contributing to its future development.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bose, Benjamin; Koyama, Kazuya, E-mail: benjamin.bose@port.ac.uk, E-mail: kazuya.koyama@port.ac.uk
We develop a code to produce the power spectrum in redshift space based on standard perturbation theory (SPT) at 1-loop order. The code can be applied to a wide range of modified gravity and dark energy models using a recently proposed numerical method by A. Taruya to find the SPT kernels. This includes Horndeski's theory with a general potential, which accommodates both chameleon and Vainshtein screening mechanisms and provides a non-linear extension of the effective theory of dark energy up to the third order. Focus is on a recent non-linear model of the redshift space power spectrum which has been shown to model the anisotropy very well at relevant scales for the SPT framework, as well as capturing relevant non-linear effects typical of modified gravity theories. We provide consistency checks of the code against established results and elucidate its application within the light of upcoming high precision RSD data.
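In the linear, tree-level limit any such redshift-space code must reduce to the Kaiser result P_s(k, mu) = (b + f mu^2)^2 P_lin(k), which makes a convenient consistency check of the kind mentioned above. The sketch below computes the corresponding monopole and quadrupole with illustrative inputs; it is not part of the authors' code.

```python
# Kaiser multipoles of the linear redshift-space power spectrum:
# P0 = (b^2 + 2bf/3 + f^2/5) P_lin,  P2 = (4bf/3 + 4f^2/7) P_lin.
import numpy as np

def kaiser_multipoles(P_lin, b=2.0, f=0.5):
    P0 = (b**2 + 2 * b * f / 3 + f**2 / 5) * P_lin   # monopole
    P2 = (4 * b * f / 3 + 4 * f**2 / 7) * P_lin      # quadrupole
    return P0, P2

k = np.logspace(-2, 0, 5)
P_lin = 1e4 * (k / 0.05) ** -1.5    # toy power-law spectrum, illustrative
print(kaiser_multipoles(P_lin))
```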
Schultz, Wolfram
2004-04-01
Neurons in a small number of brain structures detect rewards and reward-predicting stimuli and are active during the expectation of predictable food and liquid rewards. These neurons code the reward information according to basic terms of various behavioural theories that seek to explain reward-directed learning, approach behaviour and decision-making. The involved brain structures include groups of dopamine neurons, the striatum including the nucleus accumbens, the orbitofrontal cortex and the amygdala. The reward information is fed to brain structures involved in decision-making and organisation of behaviour, such as the dorsolateral prefrontal cortex and possibly the parietal cortex. The neural coding of basic reward terms derived from formal theories puts the neurophysiological investigation of reward mechanisms on firm conceptual grounds and provides neural correlates for the function of rewards in learning, approach behaviour and decision-making.
Explorations in Policy Enactment: Feminist Thought Experiments with Basil Bernstein's Code Theory
ERIC Educational Resources Information Center
Singh, Parlo; Pini, Barbara; Glasswell, Kathryn
2018-01-01
This paper builds on feminist elaborations of Bernstein's code theory to engage in a series of thought experiments with interview data produced during a co-inquiry design-based research intervention project. It presents three accounts of thinking/writing with data. Our purpose in presenting three different accounts of interview data is to…
The Impact of Causality on Information-Theoretic Source and Channel Coding Problems
ERIC Educational Resources Information Center
Palaiyanur, Harikrishna R.
2011-01-01
This thesis studies several problems in information theory where the notion of causality comes into play. Causality in information theory refers to the timing of when information is available to parties in a coding system. The first part of the thesis studies the error exponent (or reliability function) for several communication problems over…
Dual Coding Theory, Word Abstractness, and Emotion: A Critical Review of Kousta et al. (2011)
ERIC Educational Resources Information Center
Paivio, Allan
2013-01-01
Kousta, Vigliocco, Del Campo, Vinson, and Andrews (2011) questioned the adequacy of dual coding theory and the context availability model as explanations of representational and processing differences between concrete and abstract words. They proposed an alternative approach that focuses on the role of emotional content in the processing of…
One Speaker, Two Languages. Cross-Disciplinary Perspectives on Code-Switching.
ERIC Educational Resources Information Center
Milroy, Lesley, Ed.; Muysken, Pieter, Ed.
Fifteen articles review code-switching in the four major areas: policy implications in specific institutional and community settings; perspectives of social theory of code-switching as a form of speech behavior in particular social contexts; the grammatical analysis of code-switching, including factors that constrain switching even within a…
NP-hardness of decoding quantum error-correction codes
NASA Astrophysics Data System (ADS)
Hsieh, Min-Hsiu; Le Gall, François
2011-05-01
Although the theory of quantum error correction is intimately related to classical coding theory and, in particular, one can construct quantum error-correction codes (QECCs) from classical codes with the dual-containing property, this does not necessarily imply that the computational complexity of decoding QECCs is the same as that of their classical counterparts. Instead, decoding QECCs can be very different from decoding classical codes due to the degeneracy property. Intuitively, one expects degeneracy to simplify decoding, since two different errors might not, and need not, be distinguished in order to correct them. However, we show that the general quantum decoding problem is NP-hard regardless of whether the quantum codes are degenerate or nondegenerate. This finding implies that no considerably fast decoding algorithm exists for the general quantum decoding problem and suggests the existence of a quantum cryptosystem based on the hardness of decoding QECCs.
Coevolution Theory of the Genetic Code at Age Forty: Pathway to Translation and Synthetic Life
Wong, J. Tze-Fei; Ng, Siu-Kin; Mat, Wai-Kin; Hu, Taobo; Xue, Hong
2016-01-01
The origins of the components of genetic coding are examined in the present study. Genetic information arose from replicator induction by metabolite in accordance with the metabolic expansion law. Messenger RNA and transfer RNA stemmed from a template for binding the aminoacyl-RNA synthetase ribozymes employed to synthesize peptide prosthetic groups on RNAs in the Peptidated RNA World. Coevolution of the genetic code with amino acid biosynthesis generated tRNA paralogs that identify a last universal common ancestor (LUCA) of extant life close to Methanopyrus, which in turn points to archaeal tRNA introns as the most primitive introns and the anticodon usage of Methanopyrus as an ancient mode of wobble. The prediction of the coevolution theory of the genetic code that the code should be a mutable code has led to the isolation of optional and mandatory synthetic life forms with altered protein alphabets. PMID:26999216
Autosophy information theory provides lossless data and video compression based on the data content
NASA Astrophysics Data System (ADS)
Holtz, Klaus E.; Holtz, Eric S.; Holtz, Diana
1996-09-01
A new autosophy information theory provides an alternative to the classical Shannon information theory. Using the new theory in communication networks provides both a high degree of lossless compression and virtually unbreakable encryption codes for network security. The bandwidth in a conventional Shannon communication is determined only by the data volume and the hardware parameters, such as image size, resolution, or frame rates in television. The data content, or what is shown on the screen, is irrelevant. In contrast, the bandwidth in autosophy communication is determined only by data content, such as novelty and movement in television images. It is the data volume and hardware parameters that become irrelevant. Basically, the new communication methods use prior 'knowledge' of the data, stored in a library, to encode subsequent transmissions. The more 'knowledge' stored in the libraries, the higher the potential compression ratio. 'Information' is redefined as that which is not already known by the receiver. Everything already known is redundant and need not be re-transmitted. In a perfect communication each transmission code, called a 'tip,' creates a new 'engram' of knowledge in the library, in which each tip transmission can represent any amount of data. Autosophy theories provide six separate learning modes, or omni dimensional networks, all of which can be used for data compression. The new information theory reveals the theoretical flaws of other data compression methods, including the Huffman, Ziv-Lempel, and LZW codes and commercial compression codes such as V.42bis and MPEG-2.
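The "library" mechanism sketched above, encoding new input by reference to knowledge the receiver already shares, is easiest to see in a dictionary coder. Below is a minimal LZW compressor (one of the methods the abstract compares against), in which each emitted code adds a new entry, the abstract's "engram", to the growing library. Illustrative sketch only.

```python
# Minimal LZW: transmit codes for known phrases; each transmission also
# teaches both sides a new, longer phrase for future use.
def lzw_encode(data: str):
    library = {chr(i): i for i in range(256)}   # shared prior knowledge
    phrase, out = "", []
    for ch in data:
        if phrase + ch in library:
            phrase += ch                        # still known: keep growing
        else:
            out.append(library[phrase])         # transmit known prefix
            library[phrase + ch] = len(library) # learn a new phrase
            phrase = ch
    if phrase:
        out.append(library[phrase])
    return out

print(lzw_encode("abababab"))   # repeated content compresses away
```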
Dynamic state estimation based on Poisson spike trains—towards a theory of optimal encoding
NASA Astrophysics Data System (ADS)
Susemihl, Alex; Meir, Ron; Opper, Manfred
2013-03-01
Neurons in the nervous system convey information to higher brain regions by the generation of spike trains. An important question in the field of computational neuroscience is how these sensory neurons encode environmental information in a way which may be simply analyzed by subsequent systems. Many aspects of the form and function of the nervous system have been understood using the concepts of optimal population coding. Most studies, however, have neglected the aspect of temporal coding. Here we address this shortcoming through a filtering theory of inhomogeneous Poisson processes. We derive exact relations for the minimal mean squared error of the optimal Bayesian filter and, by optimizing the encoder, obtain optimal codes for populations of neurons. We also show that a class of non-Markovian, smooth stimuli are amenable to the same treatment, and provide results for the filtering and prediction error which hold for a general class of stochastic processes. This sets a sound mathematical framework for a population coding theory that takes temporal aspects into account. It also formalizes a number of studies which discussed temporal aspects of coding using time-window paradigms, by stating them in terms of correlation times and firing rates. We propose that this kind of analysis allows for a systematic study of temporal coding and will bring further insights into the nature of the neural code.
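The encoding model underlying this framework, an inhomogeneous Poisson spike train with rate lambda(t), is easy to simulate by Lewis-Shedler thinning; the sketch below uses an invented sinusoidal rate and assumes NumPy.

```python
# Sample spike times from an inhomogeneous Poisson process by thinning:
# draw candidates at the maximal rate, accept with prob lambda(t)/rate_max.
import numpy as np

rng = np.random.default_rng(1)

def thinning(rate_fn, rate_max, T):
    t, spikes = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)   # candidate event time
        if t > T:
            return np.array(spikes)
        if rng.random() < rate_fn(t) / rate_max:
            spikes.append(t)                   # accepted spike

rate = lambda t: 20.0 * (1 + np.sin(2 * np.pi * t))   # Hz, illustrative
spikes = thinning(rate, rate_max=40.0, T=5.0)
print(f"{len(spikes)} spikes, mean rate ~ {len(spikes) / 5.0:.1f} Hz")
```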
Psychosocial factors and theory in physical activity studies in minorities.
Mama, Scherezade K; McNeill, Lorna H; McCurdy, Sheryl A; Evans, Alexandra E; Diamond, Pamela M; Adamus-Leach, Heather J; Lee, Rebecca E
2015-01-01
To summarize the effectiveness of interventions targeting psychosocial factors to increase physical activity (PA) among ethnic minority adults and explore theory use in PA interventions. Studies (N = 11) were identified through a systematic review and targeted African American/Hispanic adults, specific psychosocial factors, and PA. Data were extracted using a standard code sheet and the Theory Coding Scheme. Social support was the most common psychosocial factor reported, followed by motivational readiness, and self-efficacy, as being associated with increased PA. Only 7 studies explicitly reported using a theoretical framework. Future efforts should explore theory use in PA interventions and how integration of theoretical constructs, including psychosocial factors, increases PA.
ERIC Educational Resources Information Center
Alty, James L.
Dual Coding Theory has quite specific predictions about how information in different media is stored, manipulated and recalled. Different combinations of media are expected to have significant effects upon the recall and retention of information. This obviously may have important consequences in the design of computer-based programs. The paper…
ERIC Educational Resources Information Center
Asarta, Carlos J.
2016-01-01
Carlos Asarta comments here that Arbaugh, Fornaciari, and Hwang (2016) are to be commended for their work ("Identifying Research Topic Development in Business and Management Education Research Using Legitimation Code Theory" "Journal of Management Education," Dec 2016, see EJ1118407). Asarta says that they make several…
L1 and L2 Picture Naming in Mandarin-English Bilinguals: A Test of Bilingual Dual Coding Theory
ERIC Educational Resources Information Center
Jared, Debra; Poh, Rebecca Pei Yun; Paivio, Allan
2013-01-01
This study examined the nature of bilinguals' conceptual representations and the links from these representations to words in L1 and L2. Specifically, we tested an assumption of the Bilingual Dual Coding Theory that conceptual representations include image representations, and that learning two languages in separate contexts can result in…
ERIC Educational Resources Information Center
Cole, Charles; Mandelblatt, Bertie
2000-01-01
Uses Kintsch's proposition-based construction-integration theory of discourse comprehension to detail the user coding operations that occur in each of the three subsystems (Perception, Comprehension, Application) in which users process an information retrieval systems (IRS) message. Describes an IRS device made up of two separate parts that enable…
Convolutional coding combined with continuous phase modulation
NASA Technical Reports Server (NTRS)
Pizzi, S. V.; Wilson, S. G.
1985-01-01
Background theory and specific coding designs for combined coding/modulation schemes utilizing convolutional codes and continuous-phase modulation (CPM) are presented. In this paper the case of r = 1/2 coding onto a 4-ary CPM is emphasized, with short-constraint length codes presented for continuous-phase FSK, double-raised-cosine, and triple-raised-cosine modulation. Coding buys several decibels of coding gain over the Gaussian channel, with an attendant increase of bandwidth. Performance comparisons in the power-bandwidth tradeoff with other approaches are made.
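A sketch of the combined scheme described above, under illustrative parameters that are not the paper's designs: a rate-1/2 convolutional encoder whose output bit pairs select 4-ary symbols driving a full-response continuous-phase modulator, so the phase ramps linearly within each symbol and never jumps.

```python
# Rate-1/2 convolutional encoding followed by 4-ary full-response CPM.
import numpy as np

def encode_rate_half(bits, G=(0b111, 0b101)):   # memory-2 example code
    state, pairs = 0, []
    for b in bits:
        reg = (b << 2) | state
        pairs.append(tuple(bin(reg & g).count("1") & 1 for g in G))
        state = reg >> 1
    return pairs

def cpm_phase(pairs, h=0.25, samples_per_sym=8):
    level = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}  # Gray map
    phase, out = 0.0, []
    for p in pairs:
        ramp = np.linspace(0, np.pi * h * level[p], samples_per_sym,
                           endpoint=False)
        out.append(phase + ramp)                # linear ramp within symbol
        phase += np.pi * h * level[p]           # phase continues, never jumps
    return np.concatenate(out)

phi = cpm_phase(encode_rate_half([1, 0, 1, 1, 0, 0]))
print(np.round(np.exp(1j * phi[:4]), 3))        # complex baseband samples
```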
Mathematical fundamentals for the noise immunity of the genetic code.
Fimmel, Elena; Strüngmann, Lutz
2018-02-01
Symmetry is one of the essential and most visible patterns that can be seen in nature. Starting from the left-right symmetry of the human body, all types of symmetry can be found in crystals, plants, animals and nature as a whole. Similarly, principles of symmetry are some of the most fundamental and useful tools in modern mathematical natural science, playing a major role in theory and applications. As a consequence, it is not surprising that the desire to understand the origin of life, based on the genetic code, forces us to involve symmetry as a mathematical concept. The genetic code can be seen as a key to biological self-organisation. All living organisms have the same molecular basis: an alphabet consisting of four letters (nitrogenous bases): adenine, cytosine, guanine, and thymine. Linearly ordered sequences of these bases contain the genetic information for synthesis of proteins in all forms of life. Thus, one of the most fascinating riddles of nature is to explain why the genetic code is as it is. Genetic coding possesses noise immunity, the fundamental feature that allows genetic information to be passed on from parents to their descendants. Hence, since the time of the discovery of the genetic code, scientists have tried to explain the noise immunity of the genetic information. In this chapter we will discuss recent results in mathematical modelling of the genetic code with respect to noise immunity, in particular error-detection and error-correction. We will focus on two central properties: degeneracy and frameshift correction. Different amino acids are encoded by different quantities of codons, and a connection between this degeneracy and the noise immunity of genetic information is a long-standing hypothesis. Biological implications of the degeneracy have been intensively studied, and whether the natural code is a frozen accident or a highly optimised product of evolution is still controversially discussed. Symmetries in the structure of degeneracy of the genetic code are essential and give evidence of substantial advantages of the natural code over other possible ones. In the present chapter we will present a recent approach to explain the degeneracy of the genetic code by algorithmic methods from bioinformatics, and discuss its biological consequences. Biologists recognised this problem immediately after the discovery of the non-overlapping structure of the genetic code, i.e., that coding sequences are to be read in a unique way determined by their reading frame. But how does the reading head of the ribosome recognise an error in the grouping of codons, caused by, e.g., insertion or deletion of a base, which can be fatal during the translation process and may result in nonfunctional proteins? In this chapter we will discuss possible solutions to the frameshift problem with a focus on the theory of so-called circular codes that were discovered in large gene populations of prokaryotes and eukaryotes in the early 90s. Circular codes allow to detect a frameshift of one or two positions, and recently a beautiful theory of such codes has been developed using statistics, group theory and graph theory. Copyright © 2017 Elsevier B.V. All rights reserved.
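The degeneracy discussed above is easy to tabulate: the sketch below counts how many codons encode each amino acid in the standard genetic code, assuming Biopython's codon tables are available.

```python
# Count codon degeneracy per amino acid from the standard genetic code.
from collections import Counter
from Bio.Data import CodonTable

table = CodonTable.unambiguous_dna_by_id[1]      # the standard code
degeneracy = Counter(table.forward_table.values())
for aa, n in sorted(degeneracy.items(), key=lambda kv: -kv[1]):
    print(aa, n)          # e.g. L (leucine) 6 ... M (methionine) 1
```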
Aids to Computer-Based Multimedia Learning.
ERIC Educational Resources Information Center
Mayer, Richard E.; Moreno, Roxana
2002-01-01
Presents a cognitive theory of multimedia learning that draws on dual coding theory, cognitive load theory, and constructivist learning theory and derives some principles of instructional design for fostering multimedia learning. These include principles of multiple representation, contiguity, coherence, modality, and redundancy. (SLD)
ERIC Educational Resources Information Center
Antonacopoulou, Elena P.
2016-01-01
In "Identifying Research Topic Development in Business and Management Education Research Using Legitimation Code Theory," authors J.B. Arbaugh, Charles J. Fornaciari, and Alvin Hwang ("Journal of Management Education," December 2016 vol. 40 no. 6 p654-691, see EJ1118407) used citation analysis to track the development of…
NASA Technical Reports Server (NTRS)
1972-01-01
Here, the 7400 line of transistor-transistor logic (TTL) devices is emphasized almost exclusively where hardware is concerned. However, it should be pointed out that the logic theory contained herein applies to all hardware. Binary numbers, simplification of logic circuits, code conversion circuits, basic flip-flop theory, details about series 54/7400, and asynchronous circuits are discussed.
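As a flavor of the code-conversion material such manuals cover, here is the classic binary-to-Gray conversion, g = b XOR (b >> 1), and its inverse, written in Python rather than as a logic circuit.

```python
# Binary <-> Gray code conversion, a standard code-conversion example.
def bin_to_gray(b: int) -> int:
    return b ^ (b >> 1)

def gray_to_bin(g: int) -> int:
    b = 0
    while g:              # fold in successively shifted copies
        b ^= g
        g >>= 1
    return b

assert all(gray_to_bin(bin_to_gray(n)) == n for n in range(16))
print([format(bin_to_gray(n), "04b") for n in range(4)])  # 0000 0001 0011 0010
```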
ERIC Educational Resources Information Center
Bacon, Donald R.
2016-01-01
In this rejoinder to "Identifying Research Topic Development in Business and Management Education Research Using Legitimation Code Theory," published in the "Journal of Management Education," Dec 2016 (see EJ1118407), Donald R. Bacon discusses the similarities between Arbaugh et al.'s (2016) findings and the scholarship…
Degenerate quantum codes and the quantum Hamming bound
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarvepalli, Pradeep; Klappenecker, Andreas
2010-03-15
The parameters of a nondegenerate quantum code must obey the Hamming bound. An important open problem in quantum coding theory is whether the parameters of a degenerate quantum code can violate this bound for nondegenerate quantum codes. In this article we show that Calderbank-Shor-Steane (CSS) codes, over a prime power alphabet q ≥ 5, cannot beat the quantum Hamming bound. We prove a quantum version of the Griesmer bound for the CSS codes, which allows us to strengthen Rains' bound that an [[n,k,d]]…
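The (nondegenerate) quantum Hamming bound referred to above is straightforward to check numerically: an [[n, k, d]] code correcting t = floor((d-1)/2) errors must satisfy sum_{j=0}^{t} C(n, j) * 3^j * 2^k <= 2^n. A small checker, with illustrative parameter sets:

```python
# Check the quantum Hamming bound for nondegenerate [[n, k, d]] codes.
from math import comb

def satisfies_quantum_hamming(n: int, k: int, d: int) -> bool:
    t = (d - 1) // 2
    return sum(comb(n, j) * 3**j for j in range(t + 1)) * 2**k <= 2**n

print(satisfies_quantum_hamming(5, 1, 3))   # True: the perfect [[5,1,3]] code
print(satisfies_quantum_hamming(4, 2, 3))   # False: parameters too ambitious
```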
Processing Motion: Using Code to Teach Newtonian Physics
NASA Astrophysics Data System (ADS)
Massey, M. Ryan
Prior to instruction, students often possess a common-sense view of motion, which is inconsistent with Newtonian physics. Effective physics lessons therefore involve conceptual change. To provide a theoretical explanation for concepts and how they change, the triangulation model brings together key attributes of prototypes, exemplars, theories, Bayesian learning, ontological categories, and the causal model theory. The triangulation model provides a theoretical rationale for why coding is a viable method for physics instruction. As an experiment, thirty-two adolescent students participated in summer coding academies to learn how to design Newtonian simulations. Conceptual and attitudinal data was collected using the Force Concept Inventory and the Colorado Learning Attitudes about Science Survey. Results suggest that coding is an effective means for teaching Newtonian physics.
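A minimal example of the kind of simulation the students built: Euler integration of Newton's second law for a projectile, the core loop of a Processing-style sketch written here in Python with invented initial conditions.

```python
# Projectile motion by Euler integration of F = m*a.
import numpy as np

dt, m, g = 0.01, 1.0, np.array([0.0, -9.81])   # timestep, mass, gravity
pos, vel = np.array([0.0, 0.0]), np.array([12.0, 12.0])

while pos[1] >= 0.0:          # step until the projectile lands
    acc = (m * g) / m         # a = F/m; the only force here is gravity
    vel = vel + acc * dt
    pos = pos + vel * dt

print(f"range ~ {pos[0]:.1f} m")   # analytic answer: 2*vx*vy/g ~ 29.4 m
```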
Signal-processing theory for the TurboRogue receiver
NASA Technical Reports Server (NTRS)
Thomas, J. B.
1995-01-01
Signal-processing theory for the TurboRogue receiver is presented. The signal form is traced from its formation at the GPS satellite, to the receiver antenna, and then through the various stages of the receiver, including extraction of phase and delay. The analysis treats the effects of ionosphere, troposphere, signal quantization, receiver components, and system noise, covering processing in both the 'code mode' when the P code is not encrypted and in the 'P-codeless mode' when the P code is encrypted. As a possible future improvement to the current analog front end, an example of a highly digital front end is analyzed.
Jiang, Zhehan; Skorupski, William
2017-12-12
In many behavioral research areas, multivariate generalizability theory (mG theory) has typically been used to investigate the reliability of certain multidimensional assessments. However, traditional mG-theory estimation, namely via frequentist approaches, has limits, leading researchers to fail to take full advantage of the information that mG theory can offer regarding the reliability of measurements. Alternatively, Bayesian methods provide more information than frequentist approaches can offer. This article presents instructional guidelines on how to implement mG-theory analyses in a Bayesian framework; in particular, BUGS code is presented to fit commonly seen designs from mG theory, including single-facet designs, two-facet crossed designs, and two-facet nested designs. In addition to concrete examples that are closely related to the selected designs and the corresponding BUGS code, a simulated dataset is provided to demonstrate the utility and advantages of the Bayesian approach. This article is intended to serve as a tutorial reference for applied researchers and methodologists conducting mG-theory studies.
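For contrast with the Bayesian/BUGS route the article advocates, the sketch below runs a classical single-facet (persons x items) G-study the frequentist way, estimating variance components from expected mean squares; the scores are invented and NumPy is assumed.

```python
# Single-facet G-study: variance components from expected mean squares.
import numpy as np

scores = np.array([[7, 8, 6],      # illustrative: 4 persons x 3 items
                   [5, 5, 4],
                   [9, 9, 8],
                   [4, 6, 5]], dtype=float)
n_p, n_i = scores.shape

ms_p = n_i * scores.mean(axis=1).var(ddof=1)          # persons mean square
ms_i = n_p * scores.mean(axis=0).var(ddof=1)          # items mean square
resid = scores - scores.mean(axis=1, keepdims=True) \
               - scores.mean(axis=0, keepdims=True) + scores.mean()
ms_res = (resid**2).sum() / ((n_p - 1) * (n_i - 1))   # residual (pi,e)

var_p = (ms_p - ms_res) / n_i                         # person variance
var_i = (ms_i - ms_res) / n_p                         # item variance
g_coef = var_p / (var_p + ms_res / n_i)               # generalizability Ep^2
print(f"sigma2_p={var_p:.2f}, sigma2_i={var_i:.2f}, Ep2={g_coef:.2f}")
```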
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brier, Soren; Joslyn, Cliff A.
2013-04-01
This paper presents a critical analysis of code semiotics, which we see as the latest attempt to create a paradigmatic foundation for solving the question of the emergence of life and consciousness. We view code semiotics as an attempt to revise the empirical scientific Darwinian paradigm, and to go beyond the complex systems, emergence, self-organization, and informational paradigms, as well as the selfish gene theory of Dawkins and the Peircean pragmaticist semiotic theory built on the simultaneous types of evolution. As such, it is a new and bold attempt to use semiotics to solve the problems created by the evolutionary paradigm's commitment to produce a theory of how to connect the two sides of the Cartesian dualistic view of physical reality and consciousness in a consistent way.
The evolution of the genetic code: Impasses and challenges.
Kun, Ádám; Radványi, Ádám
2018-02-01
The origin of the genetic code and translation is a "notoriously difficult problem". In this survey we present a list of questions that a full theory of the genetic code needs to answer. We assess the leading hypotheses according to these criteria. The stereochemical, the coding coenzyme handle, the coevolution, the four-column theory, the error minimization and the frozen accident hypotheses are discussed. The integration of these hypotheses can account for the origin of the genetic code. But experiments are badly needed. Thus we suggest a host of experiments that could (in)validate some of the models. We focus especially on the coding coenzyme handle hypothesis (CCH). The CCH suggests that amino acids attached to RNA handles enhanced catalytic activities of ribozymes. Alternatively, amino acids without handles or with a handle consisting of a single adenine, like in contemporary coenzymes could have been employed. All three scenarios can be tested in in vitro compartmentalized systems. Copyright © 2017 Elsevier B.V. All rights reserved.
Transport and equilibrium in field-reversed mirrors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyd, J.K.
Two plasma models relevant to compact torus research have been developed to study transport and equilibrium in field-reversed mirrors. In the first model, for small Larmor radius and large collision frequency, the plasma is described as an adiabatic hydromagnetic fluid. In the second model, for large Larmor radius and small collision frequency, a kinetic theory description has been developed. Various aspects of the two models have been studied in five computer codes: ADB, AV, NEO, OHK, and RES. The ADB code computes two-dimensional equilibrium and one-dimensional transport in a flux coordinate. The AV code calculates orbit average integrals in a harmonic oscillator potential. The NEO code follows particle trajectories in a Hill's vortex magnetic field to study stochasticity, invariants of the motion, and orbit average formulas. The OHK code displays analytic psi(r), B_z(r), phi(r), E_r(r) formulas developed for the kinetic theory description. The RES code calculates resonance curves to consider overlap regions relevant to stochastic orbit behavior.
Toward a New Theory for Selecting Instructional Visuals.
ERIC Educational Resources Information Center
Croft, Richard S.; Burton, John K.
This paper provides a rationale for the selection of illustrations and visual aids for the classroom. The theories that describe the processing of visuals are dual coding theory and cue summation theory. Concept attainment theory offers a basis for selecting which cues are relevant for any learning task which includes a component of identification…
Using Program Theory-Driven Evaluation Science to Crack the Da Vinci Code
ERIC Educational Resources Information Center
Donaldson, Stewart I.
2005-01-01
Program theory-driven evaluation science uses substantive knowledge, as opposed to method proclivities, to guide program evaluations. It aspires to update, clarify, simplify, and make more accessible the evolving theory of evaluation practice commonly referred to as theory-driven or theory-based evaluation. The evaluator in this chapter provides a…
Dissociation between awareness and spatial coding: evidence from unilateral neglect.
Treccani, Barbara; Cubelli, Roberto; Sellaro, Roberta; Umiltà, Carlo; Della Sala, Sergio
2012-04-01
Prevalent theories about consciousness propose a causal relation between lack of spatial coding and absence of conscious experience: The failure to code the position of an object is assumed to prevent this object from entering consciousness. This is consistent with influential theories of unilateral neglect following brain damage, according to which spatial coding of neglected stimuli is defective, and this would keep their processing at the nonconscious level. Contrary to this view, we report evidence showing that spatial coding and consciousness can dissociate. A patient with left neglect, who was not aware of contralesional stimuli, was able to process their color and position. However, in contrast to (ipsilesional) consciously perceived stimuli, color and position of neglected stimuli were processed separately. We propose that individual object features, including position, can be processed without attention and consciousness and that conscious perception of an object depends on the binding of its features into an integrated percept.
NASA Technical Reports Server (NTRS)
Moncada, Albert M.; Chattopadhyay, Aditi; Bednarcyk, Brett A.; Arnold, Steven M.
2008-01-01
Predicting failure in a composite can be done with ply-level mechanisms and/or micro-level mechanisms. This paper uses the Generalized Method of Cells and High-Fidelity Generalized Method of Cells micromechanics theories, coupled with classical lamination theory, as implemented within NASA's Micromechanics Analysis Code with Generalized Method of Cells. The code is able to implement different failure theories on the level of both the fiber and the matrix constituents within a laminate. A comparison is made among maximum stress, maximum strain, Tsai-Hill, and Tsai-Wu failure theories. To verify the failure theories, the Worldwide Failure Exercise (WWFE) experiments have been used. The WWFE is a comprehensive study that covers a wide range of polymer matrix composite laminates. The numerical results indicate good correlation with the experimental results for most of the composite layups, but also point to the need for more accurate resin damage progression models.
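As a concrete illustration of one of the criteria compared above, here is a minimal sketch of the Tsai-Wu failure index for a single unidirectional ply under in-plane stress. The strength values in the example are generic carbon/epoxy-like numbers, not WWFE data, and the F12 interaction term uses a common default; this is not the MAC/GMC implementation.

```python
# A minimal sketch of the Tsai-Wu criterion for one ply (illustrative
# strengths; not NASA's MAC/GMC code or WWFE data).
def tsai_wu_index(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Return the Tsai-Wu failure index (>= 1 implies predicted failure).

    s1, s2, t12 : fiber-direction, transverse, and shear stresses (MPa)
    Xt, Xc      : longitudinal tensile/compressive strengths (magnitudes, MPa)
    Yt, Yc      : transverse tensile/compressive strengths (magnitudes, MPa)
    S           : in-plane shear strength (MPa)
    """
    F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
    F11, F22, F66 = 1/(Xt*Xc), 1/(Yt*Yc), 1/S**2
    F12 = -0.5 * (F11 * F22) ** 0.5   # common default interaction term
    return (F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2
            + F66*t12**2 + 2*F12*s1*s2)

# Example: a carbon/epoxy-like ply loaded well below its strengths
print(tsai_wu_index(s1=500.0, s2=24.0, t12=30.0,
                    Xt=1500.0, Xc=1200.0, Yt=40.0, Yc=250.0, S=70.0))
```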
Nonisentropic unsteady three dimensional small disturbance potential theory
NASA Technical Reports Server (NTRS)
Gibbons, M. D.; Whitlow, W., Jr.; Williams, M. H.
1986-01-01
Modifications that allow more accurate modeling of flow fields with strong shocks were made to three-dimensional transonic small disturbance (TSD) potential theory. The Engquist-Osher type-dependent differencing was incorporated into the solution algorithm. The modified theory was implemented in the XTRAN3S computer code. Steady flows over a rectangular wing with a constant NACA 0012 airfoil section and an aspect ratio of 12 were calculated for freestream Mach numbers (M) of 0.82, 0.84, and 0.86. Results obtained using the modified and unmodified TSD theories are compared with results from a three-dimensional Euler code. Nonunique solutions in three dimensions are shown to appear for the rectangular wing as aspect ratio increases. Steady and unsteady results are shown for the RAE tailplane model at M = 0.90. Calculations using unmodified theory and modified theory are compared with experimental data.
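The Engquist-Osher type-dependent differencing is easiest to see on a model problem. The Python toy below assumes the 1D inviscid Burgers equation as a stand-in for the TSD operator and shows the upwind-by-characteristic-sign flux splitting; it is not the XTRAN3S algorithm.

```python
# A minimal sketch of Engquist-Osher type-dependent flux splitting on
# u_t + (u^2/2)_x = 0 (a toy stand-in for the TSD potential equation).
import numpy as np

def burgers_eo_step(u, dx, dt):
    """One conservative explicit step with the Engquist-Osher flux."""
    f_plus = 0.5 * np.maximum(u, 0.0) ** 2   # right-going part of the flux
    f_minus = 0.5 * np.minimum(u, 0.0) ** 2  # left-going part of the flux
    # Interface flux i+1/2: upwind each part by its characteristic direction
    flux = f_plus[:-1] + f_minus[1:]
    unew = u.copy()
    unew[1:-1] -= dt / dx * (flux[1:] - flux[:-1])
    return unew

x = np.linspace(0.0, 2.0, 201)
u = np.where(x < 1.0, 1.0, 0.0)   # step data: a right-moving shock
for _ in range(100):               # CFL = 0.5 with max|u| = 1
    u = burgers_eo_step(u, dx=x[1] - x[0], dt=0.005)
print(u[95:105])                   # shock captured without oscillations
```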
An Introduction to Quantum Theory
NASA Astrophysics Data System (ADS)
Greensite, Jeff
2017-02-01
Written in a lucid and engaging style, the author takes readers from an overview of classical mechanics and the historical development of quantum theory through to advanced topics. The mathematical aspects of quantum theory necessary for a firm grasp of the subject are developed in the early chapters, but an effort is made to motivate that formalism on physical grounds. Including animated figures and their respective Mathematica® codes, this book provides a complete and comprehensive text for students in physics, maths, chemistry and engineering needing an accessible introduction to quantum mechanics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, J.; Peebles, W. A.; Crocker, N. A.
Mueller-Stokes theory can be used to calculate the polarization evolution of an electromagnetic (EM) wave as it propagates through a magnetized plasma. Historically, the theory has been used to interpret polarimeter signals from systems operating on fusion plasmas. These interpretations have mostly employed approximations of Mueller-Stokes theory in regimes where either the Faraday rotation (FR) or the Cotton-Mouton (CM) effect is dominant. The current paper presents the first systematic comparison of polarimeter measurements with the predictions of full Mueller-Stokes theory where conditions transition smoothly from a FR-dominant (i.e., weak CM effect) plasma to one where the CM effect plays a significant role. A synthetic diagnostic code based on Mueller-Stokes theory accurately reproduces the trends evident in the experimentally measured polarimeter phase over this entire operating range, thereby validating Mueller-Stokes theory. The synthetic diagnostic code is then used to investigate the influence of the CM effect on polarimetry measurements. As expected, the measurements are well approximated by the FR effect when the CM effect is predicted to be weak. However, the code shows that as the CM effect increases, it can compete with the FR effect in rotating the polarization of the EM wave. This results in a reduced polarimeter response to the FR effect, just as observed in the experiment. The code also shows that, if sufficiently large, the CM effect can even reverse the handedness of a wave launched with circular polarization. This helps to understand the surprising experimental observation that the sensitivity to the FR effect can be nearly eliminated at high enough B_T (2.0 T). The results also suggest that the CM effect on the plasma midplane can be exploited to potentially measure magnetic shear in tokamak plasmas. These results establish increased confidence in the use of such a synthetic diagnostic code to guide future polarimetry design and interpret the resultant experimental data.
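The polarization evolution underlying such a synthetic diagnostic can be sketched compactly: in the reduced Stokes description, the polarization vector s precesses about a vector Omega(z) whose third component represents Faraday rotation and whose first component the Cotton-Mouton effect. The Python toy below integrates ds/dz = Omega x s with classical RK4; the profiles are illustrative assumptions, not the experiment's parameters.

```python
# A minimal sketch of Mueller-Stokes polarization evolution:
# ds/dz = Omega(z) x s, Omega_3 ~ Faraday rotation, Omega_1 ~ Cotton-Mouton.
# Profile shapes and magnitudes below are illustrative only.
import numpy as np

def propagate_stokes(s0, omega_profile, dz):
    """Integrate ds/dz = Omega(z) x s with classical RK4."""
    s = np.asarray(s0, dtype=float)
    for omega in omega_profile:
        rhs = lambda v: np.cross(omega, v)
        k1 = rhs(s)
        k2 = rhs(s + 0.5 * dz * k1)
        k3 = rhs(s + 0.5 * dz * k2)
        k4 = rhs(s + dz * k3)
        s = s + dz / 6.0 * (k1 + 2*k2 + 2*k3 + k4)
    return s

z = np.linspace(0.0, 1.0, 500)
dz = z[1] - z[0]
faraday = 2.0 * np.exp(-((z - 0.5) / 0.2) ** 2)        # Omega_3(z)
cotton_mouton = 1.0 * np.exp(-((z - 0.5) / 0.2) ** 2)  # Omega_1(z)
omega_profile = np.stack([cotton_mouton, np.zeros_like(z), faraday], axis=1)

s0 = [1.0, 0.0, 0.0]   # linearly polarized launch
print(propagate_stokes(s0, omega_profile, dz))
```

Turning the Cotton-Mouton profile on and off in this sketch reproduces, qualitatively, the competition between the two effects that the paper describes.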
Error Correcting Codes and Related Designs
1990-09-30
Publications and talks listed include: Theory, IT-37 (1991), 1222-1224; "Codes and designs, existence and uniqueness," Discrete Math., to appear; (with R. Brualdi and N. Cai) "Orphan structure of the first order Reed-Muller codes," Discrete Math., to appear; (with J. H. Conway and N. J. A. Sloane) "The binary self-dual codes of length up ..." 18, 1988; "Codes and Designs," Mathematics Colloquium, Technion, Haifa, Israel, March 6, 1989; "On the Covering Radius of Codes," Discrete Math. Group ...
The design of dual-mode complex signal processors based on quadratic modular number codes
NASA Astrophysics Data System (ADS)
Jenkins, W. K.; Krogmeier, J. V.
1987-04-01
It has been known for a long time that quadratic modular number codes admit an unusual representation of complex numbers which leads to complete decoupling of the real and imaginary channels, thereby simplifying complex multiplication and providing error isolation between the real and imaginary channels. This paper first presents a tutorial review of the theory behind the different types of complex modular rings (fields) that result from particular parameter selections, and then presents a theory for a 'dual-mode' complex signal processor based on the choice of augmented power-of-2 moduli. It is shown how a diminished-1 binary code, used by previous designers for the realization of Fermat number transforms, also leads to efficient realizations for dual-mode complex arithmetic for certain augmented power-of-2 moduli. Then a design is presented for a recursive complex filter based on a ROM/ACCUMULATOR architecture and realized in an augmented power-of-2 quadratic code, and a computer-generated example of a complex recursive filter is shown to illustrate the principles of the theory.
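The decoupling property described above can be demonstrated in a few lines. The sketch below uses the small prime modulus m = 13 with j = 5 (j² ≡ −1 mod 13) to show complex multiplication splitting into two independent channels; the paper's augmented power-of-2 moduli and diminished-1 coding are not reproduced here.

```python
# A minimal sketch of decoupled complex multiplication in a quadratic
# residue number system (QRNS). Illustrates the principle only.
M, J = 13, 5
assert (J * J) % M == M - 1  # j^2 == -1 (mod m)

def to_qrns(a, b):
    """Map the complex residue a + bi to the channel pair (z_plus, z_minus)."""
    return ((a + J * b) % M, (a - J * b) % M)

def from_qrns(zp, zm):
    """Invert the mapping using modular inverses of 2 and 2j."""
    inv2 = pow(2, -1, M)
    inv2j = pow(2 * J, -1, M)
    return ((zp + zm) * inv2 % M, (zp - zm) * inv2j % M)

# Complex multiplication becomes two independent modular multiplies:
a, b, c, d = 3, 7, 2, 11
zp1, zm1 = to_qrns(a, b)
zp2, zm2 = to_qrns(c, d)
prod = from_qrns(zp1 * zp2 % M, zm1 * zm2 % M)
# Direct computation of (a+bi)(c+di) mod m for comparison
direct = ((a * c - b * d) % M, (a * d + b * c) % M)
print(prod, direct)  # the two pairs agree
```

An error in one channel never contaminates the other, which is the error-isolation property the abstract highlights.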
Puzzling the Picture Using Grounded Theory
ERIC Educational Resources Information Center
Bennett, Elisabeth E.
2016-01-01
Since the first publication by Glaser and Strauss in 1967, Grounded Theory has become a highly influential research approach in the social sciences. The approach provides techniques and coding strategies for building theory inductively from the "ground up" as concepts within the data earn relevance into an evolving substantive theory.…
NASA Technical Reports Server (NTRS)
Hanson, Donald B.
1994-01-01
A two dimensional linear aeroacoustic theory for rotor/stator interaction with unsteady coupling was derived and explored in Volume 1 of this report. Computer program CUP2D has been written in FORTRAN embodying the theoretical equations. This volume (Volume 2) describes the structure of the code, installation and running, preparation of the input file, and interpretation of the output. A sample case is provided with printouts of the input and output. The source code is included with comments linking it closely to the theoretical equations in Volume 1.
Processing of visually presented clock times.
Goolkasian, P; Park, D C
1980-11-01
The encoding and representation of visually presented clock times was investigated in three experiments utilizing a comparative judgment task. Experiment 1 explored the effects of comparing times presented in different formats (clock face, digit, or word), and Experiment 2 examined angular distance effects created by varying positions of the hands on clock faces. In Experiment 3, encoding and processing differences between clock faces and digitally presented times were directly measured. Same/different reactions to digitally presented times were faster than to times presented on a clock face, and this format effect was found to be a result of differences in processing that occurred after encoding. Angular separation also had a limited effect on processing. The findings are interpreted within the framework of theories that refer to the importance of representational codes. The applicability to the data of Banks's semantic-coding theory, Paivio's dual-coding theory, and the levels-of-processing view of memory is discussed.
Life is physics and chemistry and communication.
Witzany, Guenther
2015-04-01
Manfred Eigen extended Erwin Schroedinger's concept of "life is physics and chemistry" through the introduction of information theory and cybernetic systems theory into "life is physics and chemistry and information." Based on this assumption, Eigen developed the concepts of quasispecies and hypercycles, which have been dominant in molecular biology and virology ever since. He insisted that the genetic code is not just used metaphorically: it represents a real natural language. However, the basics of scientific knowledge changed dramatically within the second half of the 20th century. Unfortunately, Eigen ignored the results of the philosophy of science discourse on essential features of natural languages and codes: a natural language or code emerges from populations of living agents that communicate. This contribution will look at some of the highlights of this historical development and the results relevant for biological theories about life. © 2014 New York Academy of Sciences.
Entropy-Based Bounds On Redundancies Of Huffman Codes
NASA Technical Reports Server (NTRS)
Smyth, Padhraic J.
1992-01-01
Report presents extension of theory of redundancy of binary prefix code of Huffman type which includes derivation of variety of bounds expressed in terms of entropy of source and size of alphabet. Recent developments yielded bounds on redundancy of Huffman code in terms of probabilities of various components in source alphabet. In practice, redundancies of optimal prefix codes are often closer to 0 than to 1.
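The redundancy being bounded here is simply expected codeword length minus source entropy. A minimal Python sketch with illustrative probabilities:

```python
# A minimal sketch: redundancy of a binary Huffman code, i.e. average
# codeword length minus source entropy. Probabilities are illustrative.
import heapq
from math import log2

def huffman_lengths(probs):
    """Return codeword lengths of an optimal binary prefix (Huffman) code."""
    # Each heap node: (probability, unique tie-breaker id, member symbols)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    next_id = len(probs)
    while len(heap) > 1:
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1        # every merge adds one bit to its members
        heapq.heappush(heap, (p1 + p2, next_id, syms1 + syms2))
        next_id += 1
    return lengths

probs = [0.4, 0.2, 0.2, 0.1, 0.1]
lengths = huffman_lengths(probs)
avg_len = sum(p * n for p, n in zip(probs, lengths))
entropy = -sum(p * log2(p) for p in probs)
print(lengths, avg_len, entropy, avg_len - entropy)  # redundancy < 1 bit
```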
Santos, José; Monteagudo, Ángel
2017-03-27
The canonical code, although prevailing in complex genomes, is not universal. The canonical genetic code has been shown to be more robust than random codes, but how it evolved towards its current form has not been clearly determined. The error minimization theory considers the minimization of the adverse effects of point mutations as the main selection factor in the evolution of the code. We have used simulated evolution in a computer to search for optimized codes, which helps to obtain information about the optimization level of the canonical code in its evolution. A genetic algorithm searches for efficient codes in a fitness landscape that corresponds with the adaptability of possible hypothetical genetic codes. The lower the effects of errors or mutations in the codon bases of a hypothetical code, the more efficient or optimal is that code. The inclusion of the fitness sharing technique in the evolutionary algorithm allows the extent to which the canonical genetic code is in an area corresponding to a deep local minimum to be easily determined, even in the high dimensional spaces considered. The analyses show that the canonical code is not in a deep local minimum and that the fitness landscape is not a multimodal fitness landscape with deep and separated peaks. Moreover, the canonical code is clearly far away from the areas of higher fitness in the landscape. Given the non-presence of deep local minima in the landscape, although the code could evolve and different forces could shape its structure, the fitness landscape nature considered in the error minimization theory does not explain why the canonical code ended its evolution in a location which is not an area of a localized deep minimum of the huge fitness landscape.
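The fitness sharing technique mentioned above deflates each individual's raw fitness by a niche count, so crowded peaks stop attracting the whole population. A minimal sketch on a toy one-dimensional landscape (a Goldberg-Richardson style sharing function; the parameters are illustrative, and this is not the paper's code-fitness landscape):

```python
# A minimal sketch of fitness sharing on a toy 1-D landscape.
import numpy as np

def shared_fitness(pop, raw, sigma_share=0.5, alpha=1.0):
    """Return raw fitness deflated by each individual's niche count."""
    d = np.abs(pop[:, None] - pop[None, :])  # pairwise distances
    sh = np.where(d < sigma_share, 1.0 - (d / sigma_share) ** alpha, 0.0)
    return raw / sh.sum(axis=1)              # niche counts are >= 1

rng = np.random.default_rng(0)
pop = rng.uniform(-2, 2, 20)                  # 20 candidate solutions
raw = np.exp(-(pop - 1.0) ** 2) + 0.8 * np.exp(-(pop + 1.0) ** 2)
print(shared_fitness(pop, raw))
```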
Implications of Information Theory for Computational Modeling of Schizophrenia.
Silverstein, Steven M; Wibral, Michael; Phillips, William A
2017-10-01
Information theory provides a formal framework within which information processing and its disorders can be described. However, information theory has rarely been applied to modeling aspects of the cognitive neuroscience of schizophrenia. The goal of this article is to highlight the benefits of an approach based on information theory, including its recent extensions, for understanding several disrupted neural goal functions as well as related cognitive and symptomatic phenomena in schizophrenia. We begin by demonstrating that foundational concepts from information theory-such as Shannon information, entropy, data compression, block coding, and strategies to increase the signal-to-noise ratio-can be used to provide novel understandings of cognitive impairments in schizophrenia and metrics to evaluate their integrity. We then describe more recent developments in information theory, including the concepts of infomax, coherent infomax, and coding with synergy, to demonstrate how these can be used to develop computational models of schizophrenia-related failures in the tuning of sensory neurons, gain control, perceptual organization, thought organization, selective attention, context processing, predictive coding, and cognitive control. Throughout, we demonstrate how disordered mechanisms may explain both perceptual/cognitive changes and symptom emergence in schizophrenia. Finally, we demonstrate that there is consistency between some information-theoretic concepts and recent discoveries in neurobiology, especially involving the existence of distinct sites for the accumulation of driving input and contextual information prior to their interaction. This convergence can be used to guide future theory, experiment, and treatment development.
Some partial-unit-memory convolutional codes
NASA Technical Reports Server (NTRS)
Abdel-Ghaffar, K.; Mceliece, R. J.; Solomon, G.
1991-01-01
The results of a study on a class of error correcting codes called partial unit memory (PUM) codes are presented. This class of codes, though not entirely new, has until now remained relatively unexplored. The possibility of using the well developed theory of block codes to construct a large family of promising PUM codes is shown. The performance of several specific PUM codes are compared with that of the Voyager standard (2, 1, 6) convolutional code. It was found that these codes can outperform the Voyager code with little or no increase in decoder complexity. This suggests that there may very well be PUM codes that can be used for deep space telemetry that offer both increased performance and decreased implementational complexity over current coding systems.
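For reference, the comparison baseline can be sketched in a few lines: a rate-1/2, memory-6 convolutional encoder. The generator polynomials 171/133 (octal) used below are those commonly cited for the NASA-standard constraint-length-7 code; treat them as an assumption rather than as taken from this report.

```python
# A minimal sketch of a (2,1,6) convolutional encoder, with the 171/133
# octal generators commonly cited for the Voyager/NASA standard code.
G1, G2 = 0o171, 0o133   # taps over the 7-bit encoder state (K = 7)

def conv_encode(bits):
    """Encode a bit list; emits two coded bits per input bit."""
    state = 0
    out = []
    for b in bits:
        state = ((state << 1) | b) & 0b1111111       # shift in the new bit
        out.append(bin(state & G1).count("1") % 2)   # parity of G1 taps
        out.append(bin(state & G2).count("1") % 2)   # parity of G2 taps
    return out

# Six trailing zeros flush the memory back to the all-zero state
print(conv_encode([1, 0, 1, 1, 0, 0, 0, 0, 0, 0]))
```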
hi-class: Horndeski in the Cosmic Linear Anisotropy Solving System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zumalacárregui, Miguel; Bellini, Emilio; Sawicki, Ignacy
We present the public version of hi-class (www.hiclass-code.net), an extension of the Boltzmann code CLASS to a broad ensemble of modifications to general relativity. In particular, hi-class can calculate predictions for models based on Horndeski's theory, which is the most general scalar-tensor theory described by second-order equations of motion and encompasses any perfect-fluid dark energy, quintessence, Brans-Dicke, f(R) and covariant Galileon models. hi-class has been thoroughly tested and can be readily used to understand the impact of alternative theories of gravity on linear structure formation as well as for cosmological parameter extraction.
Ross, Jaclyn M; Girard, Jeffrey M; Wright, Aidan G C; Beeney, Joseph E; Scott, Lori N; Hallquist, Michael N; Lazarus, Sophie A; Stepp, Stephanie D; Pilkonis, Paul A
2017-02-01
Relationships are among the most salient factors affecting happiness and wellbeing for individuals and families. Relationship science has identified the study of dyadic behavioral patterns between couple members during conflict as an important window into relational functioning with both short-term and long-term consequences. Several methods have been developed for the momentary assessment of behavior during interpersonal transactions. Among these, the most popular is the Specific Affect Coding System (SPAFF), which organizes social behavior into a set of discrete behavioral constructs. This study examines the interpersonal meaning of the SPAFF codes through the lens of interpersonal theory, which uses the fundamental dimensions of Dominance and Affiliation to organize interpersonal behavior. A sample of 67 couples completed a conflict task, which was video recorded and coded using SPAFF and a method for rating momentary interpersonal behavior, the Continuous Assessment of Interpersonal Dynamics (CAID). Actor partner interdependence models in a multilevel structural equation modeling framework were used to study the covariation of SPAFF codes and CAID ratings. Results showed that a number of SPAFF codes had clear interpersonal signatures, but many did not. Additionally, actor and partner effects for the same codes were strongly consistent with interpersonal theory's principle of complementarity. Thus, findings reveal points of convergence and divergence in the 2 systems and provide support for central tenets of interpersonal theory. Future directions based on these initial findings are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Exploring the Lived Experiences of Participants in Simulation-Based Learning Activities
ERIC Educational Resources Information Center
Beard, Rachael
2013-01-01
There is currently a small body of research on the experiences of participants, both facilitators and learners, during simulated mock codes (cardiac arrest) in the healthcare setting. This study was based on a practitioner's concerns that mock codes are facilitated differently among educators, mock codes are not aligned with andragogy theory of…
Tracking Holland Interest Codes: The Case of South African Field Guides
ERIC Educational Resources Information Center
Watson, Mark B.; Foxcroft, Cheryl D.; Allen, Lynda J.
2007-01-01
Holland believes that specific personality types seek out matching occupational environments and his theory codes personality and environment according to a six letter interest typology. Since 1985 there have been numerous American studies that have queried the validity of Holland's coding system. Research in South Africa is scarcer, despite…
Monte Carlo-based validation of neutronic methodology for EBR-II analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liaw, J.R.; Finck, P.J.
1993-01-01
The continuous-energy Monte Carlo code VIM (Ref. 1) has been validated extensively over the years against fast critical experiments and other neutronic analysis codes. A high degree of confidence in VIM for predicting reactor physics parameters has been firmly established. This paper presents a numerical validation of two conventional multigroup neutronic analysis codes, DIF3D (Ref. 4) and VARIANT (Ref. 5), against VIM for two Experimental Breeder Reactor II (EBR-II) core loadings in detailed three-dimensional hexagonal-z geometry. The DIF3D code is based on nodal diffusion theory, and it is used in calculations for day-to-day reactor operations, whereas the VARIANT code is based on nodal transport theory and is used with increasing frequency for specific applications. Both DIF3D and VARIANT rely on multigroup cross sections generated from ENDF/B-V by the ETOE-2/MC2-II/SDX (Ref. 6) code package. Hence, this study also validates the multigroup cross-section processing methodology against the continuous-energy approach used in VIM.
Numerical Methods for Analysis of Charged Vacancy Diffusion in Dielectric Solids
2006-12-01
A theory for charged vacancy diffusion in elastic dielectric materials is formulated and implemented numerically in a finite difference code, extending earlier work by one of the co-authors on neutral vacancy kinetics (Grinfeld and Hazzledine, 1997). The scheme has accuracy of order (Δx)^2, using a finite difference approximation (Hoffman, 1992) for the second spatial derivative of φ: φ''_i ≈ (φ_{i+1} - 2φ_i + φ_{i-1})/(Δx)^2.
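The discretization named in the report is the standard second-order central difference. A minimal Python sketch, assuming a plain 1D diffusion equation rather than the full coupled electro-elastic theory:

```python
# A minimal sketch: O(dx^2) central difference for the second spatial
# derivative, driving an explicit 1-D diffusion update (illustrative,
# not the report's code).
import numpy as np

def diffusion_step(c, D, dx, dt):
    """Explicit Euler step of c_t = D c_xx with fixed boundary values."""
    lap = (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2   # O(dx^2) accurate
    cnew = c.copy()
    cnew[1:-1] += dt * D * lap
    return cnew

x = np.linspace(0.0, 1.0, 101)
c = np.exp(-((x - 0.5) / 0.05) ** 2)   # initial concentration bump
dx = x[1] - x[0]
dt = 0.4 * dx**2                        # stable: dt <= dx^2 / (2 D)
for _ in range(200):
    c = diffusion_step(c, D=1.0, dx=dx, dt=dt)
print(c.max())                          # the bump spreads and decays
```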
The neuromodulator of exploration: A unifying theory of the role of dopamine in personality
DeYoung, Colin G.
2013-01-01
The neuromodulator dopamine is centrally involved in reward, approach behavior, exploration, and various aspects of cognition. Variations in dopaminergic function appear to be associated with variations in personality, but exactly which traits are influenced by dopamine remains an open question. This paper proposes a theory of the role of dopamine in personality that organizes and explains the diversity of findings, utilizing the division of the dopaminergic system into value coding and salience coding neurons (Bromberg-Martin et al., 2010). The value coding system is proposed to be related primarily to Extraversion and the salience coding system to Openness/Intellect. Global levels of dopamine influence the higher order personality factor, Plasticity, which comprises the shared variance of Extraversion and Openness/Intellect. All other traits related to dopamine are linked to Plasticity or its subtraits. The general function of dopamine is to promote exploration, by facilitating engagement with cues of specific reward (value) and cues of the reward value of information (salience). This theory constitutes an extension of the entropy model of uncertainty (EMU; Hirsh et al., 2012), enabling EMU to account for the fact that uncertainty is an innate incentive reward as well as an innate threat. The theory accounts for the association of dopamine with traits ranging from sensation and novelty seeking, to impulsivity and aggression, to achievement striving, creativity, and cognitive abilities, to the overinclusive thinking characteristic of schizotypy. PMID:24294198
Phonological coding during reading.
Leinenger, Mallorie
2014-11-01
The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
NASA Technical Reports Server (NTRS)
Breedlove, W. J., Jr.
1976-01-01
Major activities included coding and verifying equations of motion for the earth-moon system. Some attention was also given to numerical integration methods and parameter estimation methods. Existing analytical theories such as Brown's lunar theory, Eckhardt's theory for lunar rotation, and Newcomb's theory for the rotation of the earth were coded and verified. These theories serve as checks for the numerical integration. Laser ranging data for the period January 1969 - December 1975 was collected and stored on tape. The main goal of this research is the development of software to enable physical parameters of the earth-moon system to be estimated using data available from the Lunar Laser Ranging Experiment and the Very Long Base Interferometry experiment of project Apollo. A more specific goal is to develop software for the estimation of certain physical parameters of the moon, such as inertia ratios and the third and fourth harmonic gravity coefficients.
Charmaz, Kathy
2015-12-01
This article addresses criticisms of qualitative research for spawning studies that lack analytic development and theoretical import. It focuses on teaching initial grounded theory tools while interviewing, coding, and writing memos for the purpose of scaling up the analytic level of students' research and advancing theory construction. Adopting these tools can improve teaching qualitative methods at all levels although doctoral education is emphasized here. What teachers cover in qualitative methods courses matters. The pedagogy presented here requires a supportive environment and relies on demonstration, collective participation, measured tasks, progressive analytic complexity, and accountability. Lessons learned from using initial grounded theory tools are exemplified in a doctoral student's coding and memo-writing excerpts that demonstrate progressive analytic development. The conclusion calls for increasing the number and depth of qualitative methods courses and for creating a cadre of expert qualitative methodologists. © The Author(s) 2015.
ERIC Educational Resources Information Center
Moses, Annie M.; Golos, Debbie B.; Bennett, Colleen M.
2015-01-01
Early childhood educators need access to research-based practices and materials to help all children learn to read. Some theorists have suggested that individuals learn to read through "dual coding" (i.e., a verbal code and a nonverbal code) and may benefit from more than one route to literacy (e.g., dual coding theory). Although deaf…
Entanglement-assisted quantum convolutional coding
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilde, Mark M.; Brun, Todd A.
2010-04-15
We show how to protect a stream of quantum information from decoherence induced by a noisy quantum communication channel. We exploit preshared entanglement and a convolutional coding structure to develop a theory of entanglement-assisted quantum convolutional coding. Our construction produces a Calderbank-Shor-Steane (CSS) entanglement-assisted quantum convolutional code from two arbitrary classical binary convolutional codes. The rate and error-correcting properties of the classical convolutional codes directly determine the corresponding properties of the resulting entanglement-assisted quantum convolutional code. We explain how to encode our CSS entanglement-assisted quantum convolutional codes starting from a stream of information qubits, ancilla qubits, and shared entangled bits.
Control and System Theory, Optimization, Inverse and Ill-Posed Problems
1988-09-14
AFOSR-87-0350, 1987-1988: Control and System Theory, Optimization, Inverse and Ill-Posed Problems. The report covers a considerable variety of research investigations within the grant areas (control and system theory, optimization, and ill-posed problems).
Theoretical research on color indirect effects
NASA Astrophysics Data System (ADS)
Liu, T. C.; Liao, Changjun; Liu, Songhao
1995-05-01
Color indirect effects (CIE) refers to the physiological and psychological effects of color resulting from color vision. In this paper, we study CIE from the viewpoints of integrated western and Chinese traditional medicine and of the time quantum theory established by C. Y. Liu et al., and put forward the color-autonomic-nervous-subsystem model, in which cold color excites the parasympathetic subsystem and hot color excites the sympathetic subsystem. Our theory is in agreement with modern color vision theory and, moreover, it leads to the resolution of the conflict between the color code theory and the time code theory of color vision. For the latitude phenomena in the number of star athletes and in average lifespan, we also discuss the possibility of UV vision. The applications of our theory lead to our succeeding in explaining a number of physiological and psychological effects of color, in explaining the effects of age on color vision, and in explaining the Chinese chromophototherapy. We also discuss its application to neuroimmunology. This research provides the foundation of the clinical applications of chromophototherapy.
Jarrott, Shannon E; Smith, Cynthia L
2011-02-01
We assessed whether a shared site intergenerational care program informed by contact theory contributed to more desirable social behaviors of elders and children during intergenerational programming than did a center with a more traditional programming approach lacking some or all of the contact theory tenets. We observed 59 elder and child participants from the two sites during intergenerational activities. Using the Intergenerational Observation Scale, we coded participants' predominant behavior in 15-s intervals through each activity's duration. We then calculated for each individual the percentage of time frames in which each behavior code was predominant. Participants at the theory-based program demonstrated higher rates of intergenerational interaction, higher rates of solitary behavior, and lower rates of watching than at the traditional program. Contact theory tenets were optimized when coupled with evidence-based practices. Intergenerational programs with stakeholder support that promotes equal group status, cooperation toward a common goal, and mechanisms of friendship among participants can achieve important objectives for elder and child participants in care settings.
Project : semi-autonomous parking for enhanced safety and efficiency.
DOT National Transportation Integrated Search
2016-04-01
Index coding, a coding formulation traditionally analyzed in the theoretical computer science and : information theory communities, has received considerable attention in recent years due to its value in : wireless communications and networking probl...
High-density digital recording
NASA Technical Reports Server (NTRS)
Kalil, F. (Editor); Buschman, A. (Editor)
1985-01-01
The problems associated with high-density digital recording (HDDR) are discussed. The problems, solutions, and insights of five independent users of HDDR systems are provided as guidance for other users of HDDR systems. Various pulse code modulation coding techniques are reviewed. An introduction to error detection and correction, head optimization theory, and perpendicular recording is provided. Competitive tape recorder manufacturers apply all of the above theories and techniques and present their offerings. The methodology used by the HDDR Users Subcommittee of THIC to evaluate parallel HDDR systems is presented.
1981-01-01
Channel and study permutation codes as a special case. Such a code is generated by an initial vector x, a group G of orthogonal n-by-n matrices, and a ... random-access components, is introduced and studied. Under this scheme, the network stations are divided into groups, each of which is assigned a ... IEEE Information Theory Group, co-sponsored by Union Radio Scientifique Internationale. IEEE Catalog Number 81 CH 1609-7 IT.
Message Into Medium: An Extension of the Dual Coding Hypothesis.
ERIC Educational Resources Information Center
Simpson, Timothy J.
This paper examines the dual coding hypothesis, a model of the coding of visual and textual information, from the perspective of a mass media professional, such as a teacher, interested in accurately presenting both visual and textual material to a mass audience (i.e., students). It offers an extension to the theory, based upon the various skill…
A Need for a Theory of Visual Literacy.
ERIC Educational Resources Information Center
Hortin, John A.
1982-01-01
Examines sources available for developing a theory of visual literacy and attempts to clarify the meaning of the term. Suggests that visual thinking, a concept supported by recent research on mental imagery, visualization, and dual coding, ought to be the emphasis for future theory development. (FL)
NASA Technical Reports Server (NTRS)
Shapiro, Wilbur
1996-01-01
This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), KTK (knife to knife) Labyrinth Seal Code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia, but maintains the narrow groove theory. The KTK labyrinth seal code handles straight or stepped seals, and DYSEAL provides dynamics for the seal geometry.
Using Prospect Theory to Investigate Decision-Making Bias Within an Information Security Context
2005-12-01
... risk was acceptable, 5 when to the CA the risk was so bad ... Population proportion, lower tail: risk averse (A) coded as 0; risk seeking (B) coded as 1. H0 (indifferent in risk behavior): p = .5. Ha (risk averse, thus significantly below .5): p < .5.
Does theory influence the effectiveness of health behavior interventions? Meta-analysis.
Prestwich, Andrew; Sniehotta, Falko F; Whittington, Craig; Dombrowski, Stephan U; Rogers, Lizzie; Michie, Susan
2014-05-01
To systematically investigate the extent and type of theory use in physical activity and dietary interventions, as well as associations between extent and type of theory use with intervention effectiveness. An in-depth analysis of studies included in two systematic reviews of physical activity and healthy eating interventions (k = 190). Extent and type of theory use was assessed using the Theory Coding Scheme (TCS) and intervention effectiveness was calculated using Hedges's g. Metaregressions assessed the relationships between these measures. Fifty-six percent of interventions reported a theory base. Of these, 90% did not report links between all of their behavior change techniques (BCTs) with specific theoretical constructs and 91% did not report links between all the specified constructs with BCTs. The associations between a composite score or specific items on the TCS and intervention effectiveness were inconsistent. Interventions based on Social Cognitive Theory or the Transtheoretical Model were similarly effective and no more effective than interventions not reporting a theory base. The coding of theory in these studies suggested that theory was not often used extensively in the development of interventions. Moreover, the relationships between type of theory used and the extent of theory use with effectiveness were generally weak. The findings suggest that attempts to apply the two theories commonly used in this review more extensively are unlikely to increase intervention effectiveness. PsycINFO Database Record (c) 2014 APA, all rights reserved.
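For readers unfamiliar with the effect size used here, Hedges's g is Cohen's d with the small-sample bias correction J = 1 − 3/(4·df − 1). A minimal sketch with made-up group statistics:

```python
# A minimal sketch of Hedges's g: a bias-corrected standardized mean
# difference. The group statistics below are illustrative, not review data.
from math import sqrt

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Return Hedges's g for two independent groups."""
    df = n1 + n2 - 2
    s_pooled = sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
    d = (m1 - m2) / s_pooled                  # Cohen's d
    return (1.0 - 3.0 / (4.0 * df - 1.0)) * d # small-sample correction J

# Intervention vs. control on, e.g., weekly minutes of physical activity
print(hedges_g(m1=150.0, sd1=40.0, n1=30, m2=120.0, sd2=45.0, n2=32))
```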
Error-correction coding for digital communications
NASA Astrophysics Data System (ADS)
Clark, G. C., Jr.; Cain, J. B.
This book is written for the design engineer who must build the coding and decoding equipment and for the communication system engineer who must incorporate this equipment into a system. It is also suitable as a senior-level or first-year graduate text for an introductory one-semester course in coding theory. Fundamental concepts of coding are discussed along with group codes, taking into account basic principles, practical constraints, performance computations, coding bounds, generalized parity check codes, polynomial codes, and important classes of group codes. Other topics explored are related to simple nonalgebraic decoding techniques for group codes, soft decision decoding of block codes, algebraic techniques for multiple error correction, the convolutional code structure and Viterbi decoding, syndrome decoding techniques, and sequential decoding techniques. System applications are also considered, giving attention to concatenated codes, coding for the white Gaussian noise channel, interleaver structures for coded systems, and coding for burst noise channels.
Allan Urho Paivio (1925-2016).
Katz, Albert
2017-01-01
Presents an obituary for Allan Urho Paivio, who passed away on June 19, 2016. One of his many accomplishments was the development of a dual-coding theory. Dual-coding theory, while very influential, has, unfortunately, been discussed all too often as a theory of mental imagery, and not within its larger context wherein mental imagery and verbal processes are separable but interconnected representational systems, an academic battle in clarification that Paivio waged much of his life. He published prodigiously, even in the decades after retirement. Among his many honors, he served as president of the Canadian Psychological Association (1975) and was appointed a fellow of the Royal Society of Canada (1978). (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Verification of a magnetic island in gyro-kinetics by comparison with analytic theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zarzoso, D., E-mail: david.zarzoso-fernandez@polytechnique.org; Casson, F. J.; Poli, E.
A rotating magnetic island is imposed in the gyrokinetic code GKW, when finite differences are used for the radial direction, in order to develop the predictions of analytic tearing mode theory and understand its limitations. The implementation is verified against analytics in sheared slab geometry with three numerical tests that are suggested as benchmark cases for every code that imposes a magnetic island. The convergence requirements to properly resolve physics around the island separatrix are investigated. In the slab geometry, at low magnetic shear, binormal flows inside the island can drive Kelvin-Helmholtz instabilities which prevent the formation of the steady state for which the analytic theory is formulated.
Mosheiff, Noga; Agmon, Haggai; Moriel, Avraham; Burak, Yoram
2017-06-01
Grid cells in the entorhinal cortex encode the position of an animal in its environment with spatially periodic tuning curves with different periodicities. Recent experiments established that these cells are functionally organized in discrete modules with uniform grid spacing. Here we develop a theory for efficient coding of position, which takes into account the temporal statistics of the animal's motion. The theory predicts a sharp decrease of module population sizes with grid spacing, in agreement with the trend seen in the experimental data. We identify a simple scheme for readout of the grid cell code by neural circuitry, that can match in accuracy the optimal Bayesian decoder. This readout scheme requires persistence over different timescales, depending on the grid cell module. Thus, we propose that the brain may employ an efficient representation of position which takes advantage of the spatiotemporal statistics of the encoded variable, in similarity to the principles that govern early sensory processing.
ERIC Educational Resources Information Center
Presno, Caroline
1997-01-01
Discusses computer instruction in light of Bruner's theory of three forms of representation (action, icons, and symbols). Examines how studies regarding Paivio's dual-coding theory and studies focusing on procedural knowledge support Bruner's theory. Provides specific examples for instruction in three categories: demonstrations, pictures and…
Effects of Visual/Verbal Associations.
ERIC Educational Resources Information Center
Martin, Anna C.
Different effects of instructional strategies on recall and comprehension of terms frequently used in formal analysis of art were examined. The study looked at a synthesis of three theoretical positions: dual-coding theory, schema theory, and elaboration theory. Two-hundred and fifty sixth-grade students were randomly assigned to three groups:…
A Value-Based Hierarchy of Objectives for Military Decision-Making
1991-09-01
r"EHICS, ZECISION MAKING, DECISION THEORY ,ý LEADERSHIP , 101 LEAERSHIP TRAINIING) ` ~16. PRICI CODE 17. SfCURITY CLA SIPIATIOW 18. SECURITY...29, Just- War Theory ................ I..............30 Jus ad bellum............................ .32 Jus in bello.................33...professional competence, and elements of just war theory such as proportionality and discrimination. A review of the relevant literature on just war theory
Adaptive neural coding: from biological to behavioral decision-making
Louie, Kenway; Glimcher, Paul W.; Webb, Ryan
2015-01-01
Empirical decision-making in diverse species deviates from the predictions of normative choice theory, but why such suboptimal behavior occurs is unknown. Here, we propose that deviations from optimality arise from biological decision mechanisms that have evolved to maximize choice performance within intrinsic biophysical constraints. Sensory processing utilizes specific computations such as divisive normalization to maximize information coding in constrained neural circuits, and recent evidence suggests that analogous computations operate in decision-related brain areas. These adaptive computations implement a relative value code that may explain the characteristic context-dependent nature of behavioral violations of classical normative theory. Examining decision-making at the computational level thus provides a crucial link between the architecture of biological decision circuits and the form of empirical choice behavior. PMID:26722666
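The divisive normalization computation invoked above can be written down directly: each option's coded value is its raw value divided by a sum over the whole choice set, yielding the context-dependent relative value code. A minimal sketch (the parameter names R_max and sigma are conventional in the normalization literature, not taken from this paper):

```python
# A minimal sketch of divisive normalization as a relative value code.
import numpy as np

def normalized_value(values, R_max=100.0, sigma=1.0):
    """Code each option's value relative to the summed value of all options."""
    values = np.asarray(values, dtype=float)
    return R_max * values / (sigma + values.sum())

# The same option (value 10) is coded differently in different contexts:
print(normalized_value([10.0, 2.0]))         # lean context -> higher rate
print(normalized_value([10.0, 8.0, 6.0]))    # rich context -> lower rate
```

The context dependence visible in the output is exactly the property the authors use to explain behavioral violations of classical normative theory.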
Applications of Derandomization Theory in Coding
NASA Astrophysics Data System (ADS)
Cheraghchi, Mahdi
2011-07-01
Randomized techniques play a fundamental role in theoretical computer science and discrete mathematics, in particular for the design of efficient algorithms and construction of combinatorial objects. The basic goal in derandomization theory is to eliminate or reduce the need for randomness in such randomized constructions. In this thesis, we explore some applications of the fundamental notions in derandomization theory to problems outside the core of theoretical computer science, and in particular, certain problems related to coding theory. First, we consider the wiretap channel problem which involves a communication system in which an intruder can eavesdrop a limited portion of the transmissions, and construct efficient and information-theoretically optimal communication protocols for this model. Then we consider the combinatorial group testing problem. In this classical problem, one aims to determine a set of defective items within a large population by asking a number of queries, where each query reveals whether a defective item is present within a specified group of items. We use randomness condensers to explicitly construct optimal, or nearly optimal, group testing schemes for a setting where the query outcomes can be highly unreliable, as well as the threshold model where a query returns positive if the number of defectives passes a certain threshold. Finally, we design ensembles of error-correcting codes that achieve the information-theoretic capacity of a large class of communication channels, and then use the obtained ensembles for construction of explicit capacity achieving codes. [This is a shortened version of the actual abstract in the thesis.]
NASA Astrophysics Data System (ADS)
Yang, Qianli; Pitkow, Xaq
2015-03-01
Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.
Simulations of linear and Hamming codes using SageMath
NASA Astrophysics Data System (ADS)
Timur, Tahta D.; Adzkiya, Dieky; Soleha
2018-03-01
Digital data transmission over a noisy channel can distort the message being transmitted. The goal of coding theory is to ensure data integrity, that is, to find out if and where noise has distorted the message and what the original message was. Data transmission consists of three stages: encoding, transmission, and decoding. Linear and Hamming codes are the codes discussed in this work, where the encoding algorithms are parity check and generator matrix, and the decoding algorithms are nearest neighbor and syndrome. We aim to show that we can simulate these processes using the SageMath software, which has built-in classes for coding theory in general and linear codes in particular. First we consider the message as a binary vector of size k. This message is then encoded to a vector of size n using the given algorithms. A noisy channel with a particular error probability is then created, over which the transmission takes place. The last task is decoding, which corrects and reverts the received message back to the original message whenever possible, that is, whenever the number of errors that occurred is smaller than or equal to the correcting radius of the code.
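The pipeline described here is easy to reproduce outside Sage. A minimal sketch in plain Python/NumPy rather than SageMath's built-in coding classes, assuming the systematic Hamming (7,4) code (the G and H matrices below are one standard systematic choice), a binary symmetric channel, and syndrome decoding:

```python
# A minimal sketch of encode -> noisy channel -> syndrome decode for
# Hamming (7,4). Illustrative only; not the paper's SageMath session.
import numpy as np

G = np.array([[1,0,0,0,0,1,1],   # generator matrix (systematic form)
              [0,1,0,0,1,0,1],
              [0,0,1,0,1,1,0],
              [0,0,0,1,1,1,1]])
H = np.array([[0,1,1,1,1,0,0],   # parity-check matrix: H @ G.T = 0 (mod 2)
              [1,0,1,1,0,1,0],
              [1,1,0,1,0,0,1]])

rng = np.random.default_rng(7)

def encode(msg):
    return msg @ G % 2

def channel(word, p=0.05):
    """Binary symmetric channel: flip each bit with probability p."""
    return (word + (rng.random(word.size) < p)) % 2

def decode(received):
    """Syndrome decoding: flip the single position whose H-column matches."""
    syndrome = H @ received % 2
    if syndrome.any():
        for j in range(7):
            if np.array_equal(H[:, j], syndrome):
                received = received.copy()
                received[j] ^= 1
                break
    return received[:4]   # systematic code: first 4 bits are the message

msg = np.array([1, 0, 1, 1])
sent = encode(msg)
recv = channel(sent)
print(msg, sent, recv, decode(recv))  # any single bit flip is corrected
```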
UFO: A THREE-DIMENSIONAL NEUTRON DIFFUSION CODE FOR THE IBM 704
DOE Office of Scientific and Technical Information (OSTI.GOV)
Auerbach, E.H.; Jewett, J.P.; Ketchum, M.A.
A description of UFO, a code for the solution of the few-group neutron diffusion equation in three-dimensional Cartesian coordinates on the IBM 704, is given. An accelerated Liebmann flux iteration scheme is used, and optimum parameters can be calculated by the code whenever they are required. The theory and operation of the program are discussed. (auth)
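An accelerated Liebmann iteration is what is now usually called successive over-relaxation (SOR). A minimal Python sketch on a 2D Poisson stand-in problem (illustrative only; UFO solved the few-group diffusion equation in three dimensions, and the relaxation factor here is a guess, not the code's computed optimum):

```python
# A minimal sketch of an accelerated Liebmann (SOR) iteration on a toy
# 2-D Poisson problem with fixed (zero) boundary values.
import numpy as np

def sor_poisson(source, omega=1.7, tol=1e-6, max_iter=10_000):
    """Solve -laplacian(phi) = source on the unit square, phi = 0 on edges."""
    n = source.shape[0]
    h2 = (1.0 / (n - 1)) ** 2
    phi = np.zeros_like(source)
    for it in range(max_iter):
        max_delta = 0.0
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                gs = 0.25 * (phi[i+1, j] + phi[i-1, j] + phi[i, j+1]
                             + phi[i, j-1] + h2 * source[i, j])
                delta = omega * (gs - phi[i, j])   # over-relaxed update
                phi[i, j] += delta
                max_delta = max(max_delta, abs(delta))
        if max_delta < tol:
            return phi, it
    return phi, max_iter

src = np.zeros((33, 33))
src[16, 16] = 1.0                    # point source in the center
phi, iters = sor_poisson(src)
print(iters, phi[16, 16])
```

With omega = 1 this reduces to the plain Liebmann (Gauss-Seidel) sweep; the over-relaxation factor is what provides the acceleration the abstract refers to.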
Drugs, Guns, and Disadvantaged Youths: Co-Occurring Behavior and the Code of the Street
ERIC Educational Resources Information Center
Allen, Andrea N.; Lo, Celia C.
2012-01-01
Guided by Anderson's theory of the code of the street, this study explored social mechanisms linking individual-level disadvantage factors with the adoption of beliefs grounded in the code of the street and with drug trafficking and gun carrying--the co-occurring behavior shaping violence among young men in urban areas. Secondary data were…
A combinatorial model for dentate gyrus sparse coding
Severa, William; Parekh, Ojas; James, Conrad D.; ...
2016-12-29
The dentate gyrus forms a critical link between the entorhinal cortex and CA3 by providing a sparse version of the signal. Concurrent with this increase in sparsity, a widely accepted theory suggests the dentate gyrus performs pattern separation—similar inputs yield decorrelated outputs. Although an active region of study and theory, few logically rigorous arguments detail the dentate gyrus's (DG) coding. We suggest a theoretically tractable, combinatorial model for this action. The model provides formal methods for a highly redundant, arbitrarily sparse, and decorrelated output signal. To explore the value of this model framework, we assess how suitable it is for two notable aspects of DG coding: how it can handle the highly structured grid cell representation in the input entorhinal cortex region and the presence of adult neurogenesis, which has been proposed to produce a heterogeneous code in the DG. We find tailoring the model to grid cell input yields expansion parameters consistent with the literature. In addition, the heterogeneous coding reflects activity gradation observed experimentally. Lastly, we connect this approach with more conventional binary threshold neural circuit models via a formal embedding.
The dependence of frequency distributions on multiple meanings of words, codes and signs
NASA Astrophysics Data System (ADS)
Yan, Xiaoyong; Minnhagen, Petter
2018-01-01
The dependence of the frequency distributions due to multiple meanings of words in a text is investigated by deleting letters. By coding the words with fewer letters, the number of meanings per coded word increases. This increase is measured and used as an input in a predictive theory. For a text written in English, the word-frequency distribution is broad and fat-tailed, whereas if the words are represented only by their first letter the distribution becomes exponential. Both distributions are well predicted by the theory, as is the whole sequence obtained by consecutively representing the words by their first L = 6, 5, 4, 3, 2, 1 letters. Comparisons of texts written in Chinese characters and the same texts written in letter codes are made, and the similarity of the corresponding frequency distributions is interpreted as a consequence of the multiple meanings of Chinese characters. This further implies that the difference in shape between the word-frequency distributions of an English text written in letters and a Chinese text written in Chinese characters is due to the coding and not to the language per se.
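The letter-deletion coding itself is easy to reproduce. A small Python sketch (with 'book.txt' as a placeholder corpus, not the paper's texts) counts the frequencies of words represented by their first L letters:

```python
from collections import Counter

def coded_frequencies(words, L):
    """Represent each word by its first L letters and count frequencies."""
    return Counter(w[:L] for w in words)

words = open('book.txt').read().lower().split()   # placeholder corpus
for L in (6, 5, 4, 3, 2, 1):
    freqs = coded_frequencies(words, L)
    print(L, len(freqs), freqs.most_common(3))
```

As L shrinks, distinct words collapse onto shared codes (more meanings per coded word), and the tail of the distribution shortens toward the exponential shape reported for L = 1.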
Predicting Properties of Unidirectional-Nanofiber Composites
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Handler, Louis M.; Manderscheid, Jane
2008-01-01
A theory for predicting mechanical, thermal, electrical, and other properties of unidirectional-nanofiber/matrix composite materials is based on the prior theory of micromechanics of composite materials. In the development of the present theory, the prior theory of micromechanics was extended, through progressive substructuring, to the level of detail of a nanoscale slice of a nanofiber. All the governing equations were then formulated at this level. The substructuring and the equations have been programmed in the ICAN/JAVA computer code, which was reported in "ICAN/JAVA: Integrated Composite Analyzer Recoded in Java" (LEW-17247), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 36. In a demonstration, the theory as embodied in the computer code was applied to a graphite-nanofiber/epoxy laminate and used to predict 25 properties. Most of the properties were found to be distributed along the through-the-thickness direction. Matrix-dependent properties were found to have bimodal through-the-thickness distributions with discontinuous changes from mode to mode.
A Formal Theory of Perception. Technical Report No. 161.
ERIC Educational Resources Information Center
Rottmayer, William Arthur
An attempt to build a mathematical model of a device that could learn geometry is discussed. The report discusses the background and motivation of the study, the coding problem, the derivation of Suppes' "Stimulus-Response Theory of Finite Automata" used in the work on learning theory, and a summary of the technical work. (DB)
NASA Technical Reports Server (NTRS)
Baker, A. J.; Iannelli, G. S.; Manhardt, Paul D.; Orzechowski, J. A.
1993-01-01
This report documents the user input and output data requirements for the FEMNAS finite element Navier-Stokes code for real-gas simulations of external aerodynamics flowfields. This code was developed for the configuration aerodynamics branch of NASA ARC, under SBIR Phase 2 contract NAS2-124568 by Computational Mechanics Corporation (COMCO). This report is in two volumes. Volume 1 contains the theory for the derived finite element algorithm and describes the test cases used to validate the computer program described in the Volume 2 user guide.
NASA Technical Reports Server (NTRS)
Clement, J. D.; Kirby, K. D.
1973-01-01
Exploratory calculations were performed for several gas core breeder reactor configurations. The computational method involved the use of the MACH-1 one dimensional diffusion theory code and the THERMOS integral transport theory code for thermal cross sections. Computations were performed to analyze thermal breeder concepts and nonbreeder concepts. Analysis of breeders was restricted to the (U-233)-Th breeding cycle, and computations were performed to examine a range of parameters. These parameters include U-233 to hydrogen atom ratio in the gaseous cavity, carbon to thorium atom ratio in the breeding blanket, cavity size, and blanket size.
Hoare, Karen J; Decker, Eve
2016-01-01
Following the summer holidays of 2011, twelve girls returned to school pregnant in one high school in Auckland, New Zealand (NZ). A health promotion leaflet that folded into a small square, contained a condom, and was dubbed the 'teabag' was distributed to 15-18 year olds prior to the summer holiday of 2012, in order to increase their sexual health knowledge. This paper reports on the evaluation of the teabag from the students' perspective. During the first term of 2013, seventeen students from two high schools who had received the teabag were interviewed. Five were male and twelve female. Most (16) were of Pacific Island or Maori (indigenous New Zealanders) descent. Interviews were digitally recorded, transcribed, coded and categorised concurrently, in accordance with grounded theory methods. Theoretical sampling was employed, and students whose perceptions of the teabag were consistent with evolving constructions from the data were invited by school nurses to be interviewed by the researchers. Interviews were coded line by line by two researchers and these codes were collapsed into seven focussed codes. Further analysis resulted in the codes being subsumed into three main categories. These categories revealed that the teabag was helpful, was appropriate, and became a talking point. The grounded theory and basic social process the researchers constructed from the data was that the teabag catalysed conversations about sexual health. The teabag was an acceptable and appropriate sexual health promotion tool for disseminating information about sexual health.
Joint Machine Learning and Game Theory for Rate Control in High Efficiency Video Coding.
Gao, Wei; Kwong, Sam; Jia, Yuheng
2017-08-25
In this paper, a joint machine learning and game theory modeling (MLGT) framework is proposed for inter-frame coding tree unit (CTU) level bit allocation and rate control (RC) optimization in High Efficiency Video Coding (HEVC). First, a support vector machine (SVM) based multi-classification scheme is proposed to improve the prediction accuracy of the CTU-level rate-distortion (R-D) model; this learning-based R-D model is proposed as a way around the legacy "chicken-and-egg" dilemma in video coding. Second, a mixed R-D model based cooperative bargaining game theory is proposed for bit allocation optimization, where the convexity of the mixed R-D model based utility function is proved, and the Nash bargaining solution (NBS) is achieved by the proposed iterative solution search method. The minimum utility is adjusted by the reference coding distortion and the frame-level quantization parameter (QP) change. Lastly, the intra-frame QP and the inter-frame adaptive bit ratios are adjusted to give inter frames more bit resources, maintaining smooth quality and bit consumption in the bargaining game optimization. Experimental results demonstrate that the proposed MLGT based RC method achieves much better R-D performance, quality smoothness, bit rate accuracy, buffer control results and subjective visual quality than other state-of-the-art one-pass RC methods, and the achieved R-D performance is very close to the performance limit of the FixedQP method.
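The bargaining step can be illustrated independently of the SVM-based model. The toy sketch below assumes a simple distortion model D_i(b) = a_i/b and utility u_i = D0 − D_i (not the paper's mixed R-D model); it finds a Nash bargaining bit allocation by maximizing the log of the utility product under a frame bit budget:

```python
import numpy as np
from scipy.optimize import minimize

a = np.array([4.0, 1.0, 2.0])    # toy per-CTU distortion weights: D_i(b) = a_i / b
B = 30.0                         # frame-level bit budget
D0 = 1.0                         # utility u_i = D0 - D_i(b_i); zero plays the
                                 # role of the disagreement point in this toy

def neg_log_nash_product(b):
    return -np.sum(np.log(D0 - a / b))

res = minimize(neg_log_nash_product,
               x0=np.full(len(a), B / len(a)),
               bounds=[(ai / D0 + 1e-6, None) for ai in a],
               constraints=[{'type': 'eq', 'fun': lambda b: b.sum() - B}],
               method='SLSQP')
print(res.x)                     # NBS allocation: harder CTUs receive more bits
```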
Trace-shortened Reed-Solomon codes
NASA Technical Reports Server (NTRS)
Mceliece, R. J.; Solomon, G.
1994-01-01
Reed-Solomon (RS) codes have been part of standard NASA telecommunications systems for many years. RS codes are character-oriented error-correcting codes, and their principal use in space applications has been as outer codes in concatenated coding systems. However, for a given character size, say m bits, RS codes are limited to a length of, at most, 2^m. It is known in theory that longer character-oriented codes would be superior to RS codes in concatenation applications, but until recently no practical class of 'long' character-oriented codes had been discovered. In 1992, however, Solomon discovered an extensive class of such codes, which are now called trace-shortened Reed-Solomon (TSRS) codes. In this article, we will continue the study of TSRS codes. Our main result is a formula for the dimension of any TSRS code, as a function of its error-correcting power. Using this formula, we will give several examples of TSRS codes, some of which look very promising as candidate outer codes in high-performance coded telecommunications systems.
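The length limit that motivates TSRS codes is easy to see in Sage: an MDS evaluation code over GF(2^m) cannot use more evaluation points than the field has elements. A minimal sketch using the generalized Reed-Solomon constructor (recent Sage versions; the field size is an illustrative choice):

```python
# run inside SageMath
F = GF(2^4)                       # 4-bit characters, so at most 2^4 = 16 points
pts = F.list()                    # all 16 field elements as evaluation points
C = codes.GeneralizedReedSolomonCode(pts, 8)
print(C.length(), C.dimension(), C.minimum_distance())   # 16 8 9 (MDS)
```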
Trinker, Horst
2011-10-28
We study the distribution of triples of codewords of codes and ordered codes. Schrijver [A. Schrijver, New code upper bounds from the Terwilliger algebra and semidefinite programming, IEEE Trans. Inform. Theory 51 (8) (2005) 2859-2866] used the triple distribution of a code to establish a bound on the number of codewords based on semidefinite programming. In the first part of this work, we generalize this approach for ordered codes. In the second part, we consider linear codes and linear ordered codes and present a MacWilliams-type identity for the triple distribution of their dual code. Based on the non-negativity of this linear transform, we establish a linear programming bound and conclude with a table of parameters for which this bound yields better results than the standard linear programming bound.
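For a concrete sense of the object being bounded, the triple distribution of a small binary code can be tabulated directly: count ordered codeword triples by their three pairwise Hamming distances. A plain-Python sketch (the even-weight code is an illustrative choice):

```python
from collections import Counter
from itertools import product

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def triple_distribution(code):
    """Count ordered codeword triples by their pairwise Hamming distances."""
    return Counter((hamming(x, y), hamming(x, z), hamming(y, z))
                   for x, y, z in product(code, repeat=3))

code = ['000', '011', '101', '110']     # even-weight code of length 3
for dists, count in sorted(triple_distribution(code).items()):
    print(dists, count)
```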
Williams, David M
2010-09-01
Comments on the original article 'Are interventions theory-based? Development of a theory coding scheme' by Susan Michie and Andrew Prestwich (see record 2010-00152-001). In their admirable effort to develop a coding scheme for the theoretical contribution of intervention research, Michie and Prestwich rightly point out the importance of the presence of a comparison condition when examining the effect of an intervention on targeted theoretical variables and behavioral outcomes (Table 2, item 15). However, they fail to discuss the critical importance of the nature of the comparison condition. Weaker comparison conditions will yield stronger intervention effects; stronger comparison conditions will yield a stronger science of behavior change. (c) 2010 APA, all rights reserved.
Representational geometry: integrating cognition, computation, and the brain
Kriegeskorte, Nikolaus; Kievit, Rogier A.
2013-01-01
The cognitive concept of representation plays a key role in theories of brain information processing. However, linking neuronal activity to representational content and cognitive theory remains challenging. Recent studies have characterized the representational geometry of neural population codes by means of representational distance matrices, enabling researchers to compare representations across stages of processing and to test cognitive and computational theories. Representational geometry provides a useful intermediate level of description, capturing both the information represented in a neuronal population code and the format in which it is represented. We review recent insights gained with this approach in perception, memory, cognition, and action. Analyses of representational geometry can compare representations between models and the brain, and promise to explain brain computation as transformation of representational similarity structure. PMID:23876494
NASA Technical Reports Server (NTRS)
Payne, David G.; Gunther, Virginia A. L.
1988-01-01
Subjects performed short-term memory tasks, involving both spatial and verbal components, and a visual monitoring task involving either analog or digital display formats. These two tasks (memory vs. monitoring) were performed both singly and in conjunction. Contrary to expectations derived from multiple resource theories of attentional processes, there was no evidence that when the two tasks involved the same cognitive codes (i.e., either both spatial or both verbal/linguistic) there was more of a dual-task performance decrement than when the two tasks employed different cognitive codes/processes. These results are discussed in terms of their implications for theories of attentional processes and also for research in mental state estimation.
Development of a Coding Instrument to Assess the Quality and Content of Anti-Tobacco Video Games.
Alber, Julia M; Watson, Anna M; Barnett, Tracey E; Mercado, Rebeccah; Bernhardt, Jay M
2015-07-01
Previous research has shown the use of electronic video games as an effective method for increasing content knowledge about the risks of drugs and alcohol use for adolescents. Although best practice suggests that theory, health communication strategies, and game appeal are important characteristics for developing games, no instruments are currently available to examine the quality and content of tobacco prevention and cessation electronic games. This study presents the systematic development of a coding instrument to measure the quality, use of theory, and health communication strategies of tobacco cessation and prevention electronic games. Using previous research and expert review, a content analysis coding instrument measuring 67 characteristics was developed with three overarching categories: type and quality of games, theory and approach, and type and format of messages. Two trained coders applied the instrument to 88 games on four platforms (personal computer, Nintendo DS, iPhone, and Android phone) to field test the instrument. Cohen's kappa for each item ranged from 0.66 to 1.00, with an average kappa value of 0.97. Future research can adapt this coding instrument to games addressing other health issues. In addition, the instrument questions can serve as a useful guide for evidence-based game development.
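The reliability statistic used in the field test is standard; a minimal Python implementation of Cohen's kappa for two raters (toy labels, not the study's data) looks like this:

```python
def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    labels = set(rater1) | set(rater2)
    expected = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                   for c in labels)
    return (observed - expected) / (1 - expected)

print(cohens_kappa([1, 1, 0, 1, 0, 1], [1, 1, 0, 0, 0, 1]))  # 0.666...
```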
Laminar fMRI and computational theories of brain function.
Stephan, K E; Petzschner, F H; Kasper, L; Bayer, J; Wellstein, K V; Stefanics, G; Pruessmann, K P; Heinzle, J
2017-11-02
Recently developed methods for functional MRI at the resolution of cortical layers (laminar fMRI) offer a novel window into neurophysiological mechanisms of cortical activity. Beyond physiology, laminar fMRI also offers an unprecedented opportunity to test influential theories of brain function. Specifically, hierarchical Bayesian theories of brain function, such as predictive coding, assign specific computational roles to different cortical layers. Combined with computational models, laminar fMRI offers a unique opportunity to test these proposals noninvasively in humans. This review provides a brief overview of predictive coding and related hierarchical Bayesian theories, summarises their predictions with regard to layered cortical computations, examines how these predictions could be tested by laminar fMRI, and considers methodological challenges. We conclude by discussing the potential of laminar fMRI for clinically useful computational assays of layer-specific information processing. Copyright © 2017 Elsevier Inc. All rights reserved.
Smith, Daniel G A; Burns, Lori A; Sirianni, Dominic A; Nascimento, Daniel R; Kumar, Ashutosh; James, Andrew M; Schriber, Jeffrey B; Zhang, Tianyuan; Zhang, Boyi; Abbott, Adam S; Berquist, Eric J; Lechner, Marvin H; Cunha, Leonardo A; Heide, Alexander G; Waldrop, Jonathan M; Takeshita, Tyler Y; Alenaizan, Asem; Neuhauser, Daniel; King, Rollin A; Simmonett, Andrew C; Turney, Justin M; Schaefer, Henry F; Evangelista, Francesco A; DePrince, A Eugene; Crawford, T Daniel; Patkowski, Konrad; Sherrill, C David
2018-06-11
Psi4NumPy demonstrates the use of efficient computational kernels from the open-source Psi4 program through the popular NumPy library for linear algebra in Python to facilitate the rapid development of clear, understandable Python computer code for new quantum chemical methods, while maintaining a relatively low execution time. Using these tools, reference implementations have been created for a number of methods, including self-consistent field (SCF), SCF response, many-body perturbation theory, coupled-cluster theory, configuration interaction, and symmetry-adapted perturbation theory. Furthermore, several reference codes have been integrated into Jupyter notebooks, allowing background, underlying theory, and formula information to be associated with the implementation. Psi4NumPy tools and associated reference implementations can lower the barrier for future development of quantum chemistry methods. These implementations also demonstrate the power of the hybrid C++/Python programming approach employed by the Psi4 program.
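The Psi4NumPy pattern of driving Psi4 kernels from NumPy can be sketched in a few lines; water in an STO-3G basis is an illustrative choice, not an example from the paper:

```python
import numpy as np
import psi4

mol = psi4.geometry("""
O
H 1 0.96
H 1 0.96 2 104.5
""")
psi4.set_options({'basis': 'sto-3g'})

scf_energy, wfn = psi4.energy('scf', return_wfn=True)

# pull AO-basis integrals into NumPy arrays for custom method development
mints = psi4.core.MintsHelper(wfn.basisset())
S = np.asarray(mints.ao_overlap())     # overlap matrix
T = np.asarray(mints.ao_kinetic())     # kinetic-energy integrals
print(scf_energy, S.shape, T.shape)
```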
NASA Technical Reports Server (NTRS)
Cartier, D. E.
1973-01-01
A convolutional coding theory is given for the IME and the Heliocentric spacecraft. The amount of coding gain needed by the mission is determined. Recommendations are given for an encoder/decoder system to provide the gain along with an evaluation of the impact of the system on the space network in terms of costs and complexity.
Cognitive Architectures for Multimedia Learning
ERIC Educational Resources Information Center
Reed, Stephen K.
2006-01-01
This article provides a tutorial overview of cognitive architectures that can form a theoretical foundation for designing multimedia instruction. Cognitive architectures include a description of memory stores, memory codes, and cognitive operations. Architectures that are relevant to multimedia learning include Paivio's dual coding theory,…
New quantum codes constructed from quaternary BCH codes
NASA Astrophysics Data System (ADS)
Xu, Gen; Li, Ruihu; Guo, Luobin; Ma, Yuena
2016-10-01
In this paper, we first study the construction of new quantum error-correcting codes (QECCs) from three classes of quaternary imprimitive BCH codes. As a result, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing quaternary BCH codes is determined to be much larger than the result given by Aly et al. (IEEE Trans Inf Theory 53:1183-1188, 2007) for each different code length. Thus, families of new QECCs are obtained, and the constructed QECCs have larger distance than those in the previous literature. Secondly, we apply a combinatorial construction to the imprimitive BCH codes with their corresponding primitive counterparts and construct many new linear quantum codes with good parameters, some of which have parameters exceeding the finite Gilbert-Varshamov bound for linear quantum codes.
New quantum codes derived from a family of antiprimitive BCH codes
NASA Astrophysics Data System (ADS)
Liu, Yang; Li, Ruihu; Lü, Liangdong; Guo, Luobin
The Bose-Chaudhuri-Hocquenghem (BCH) codes have been studied for more than 57 years and have found wide application in classical communication systems and quantum information theory. In this paper, we study the construction of quantum codes from a family of q^2-ary BCH codes with length n = q^(2m) + 1 (also called antiprimitive BCH codes in the literature), where q >= 4 is a power of 2 and m >= 2. By a detailed analysis of some useful properties of q^2-ary cyclotomic cosets modulo n, Hermitian dual-containing conditions for a family of non-narrow-sense antiprimitive BCH codes are presented, which are similar to those of q^2-ary primitive BCH codes. Consequently, via the Hermitian construction, a family of new quantum codes can be derived from these dual-containing BCH codes. Some of these new antiprimitive quantum BCH codes are comparable with those derived from primitive BCH codes.
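The cyclotomic-coset analysis mentioned above is mechanical to reproduce: the q^2-ary cosets modulo n are the orbits of multiplication by q^2 on the integers mod n. A short Python sketch (q = 4 and m = 2, so q^2 = 16 and n = q^(2m) + 1 = 257, chosen to match the smallest case in the stated parameter range):

```python
def cyclotomic_cosets(q2, n):
    """q2-ary cyclotomic cosets modulo n: orbits of s -> q2*s (mod n)."""
    seen, cosets = set(), []
    for s in range(n):
        if s in seen:
            continue
        coset, x = [], s
        while x not in coset:
            coset.append(x)
            x = (x * q2) % n
        seen.update(coset)
        cosets.append(sorted(coset))
    return cosets

for coset in cyclotomic_cosets(16, 257)[:4]:
    print(coset)   # since 16^2 = -1 (mod 257), cosets have size at most 4
```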
TRIQS: A toolbox for research on interacting quantum systems
NASA Astrophysics Data System (ADS)
Parcollet, Olivier; Ferrero, Michel; Ayral, Thomas; Hafermann, Hartmut; Krivenko, Igor; Messio, Laura; Seth, Priyanka
2015-11-01
We present the TRIQS library, a Toolbox for Research on Interacting Quantum Systems. It is an open-source, computational physics library providing a framework for the quick development of applications in the field of many-body quantum physics, and in particular, strongly-correlated electronic systems. It supplies components to develop codes in a modern, concise and efficient way: e.g. Green's function containers, a generic Monte Carlo class, and simple interfaces to HDF5. TRIQS is a C++/Python library that can be used from either language. It is distributed under the GNU General Public License (GPLv3). State-of-the-art applications based on the library, such as modern quantum many-body solvers and interfaces between density-functional-theory codes and dynamical mean-field theory (DMFT) codes are distributed along with it.
Partially Key Distribution with Public Key Cryptosystem Based on Error Control Codes
NASA Astrophysics Data System (ADS)
Tavallaei, Saeed Ebadi; Falahati, Abolfazl
Due to the low level of security in public key cryptosystems based on number theory, and fundamental difficulties such as "key escrow" in Public Key Infrastructure (PKI) and the need for a secure channel in ID-based cryptography, a new key distribution cryptosystem based on Error Control Codes (ECC) is proposed. This is achieved by modifying the McEliece cryptosystem. The security of the ECC cryptosystem derives from the NP-completeness of decoding general block codes. Using ECC also provides the capability of generating public keys with variable lengths, which suits different applications. Given the decreasing security of cryptosystems based on number theory and the increasing lengths of their keys, the use of such code-based cryptosystems seems unavoidable in the future.
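The McEliece construction being modified can be sketched at toy scale: hide a structured code's generator matrix behind a scrambler and a permutation, and let the ciphertext carry a deliberate error. The sketch below uses a [7,4,3] Hamming code as a stand-in for the Goppa codes of the real system; it is illustrative only and far too small to be secure:

```python
import numpy as np

rng = np.random.default_rng(0)

# generator matrix of the [7, 4, 3] Hamming code (toy stand-in for a Goppa code)
G = np.array([[1, 0, 0, 0, 0, 1, 1],
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 1, 1, 0],
              [0, 0, 0, 1, 1, 1, 1]])

# secret scrambler S (invertible over GF(2)) and permutation P
while True:
    S = rng.integers(0, 2, (4, 4))
    if round(np.linalg.det(S)) % 2 == 1:   # odd determinant <=> invertible mod 2
        break
P = np.eye(7, dtype=int)[rng.permutation(7)]

G_pub = S @ G @ P % 2                      # public key

m = np.array([1, 0, 1, 1])                 # plaintext block
e = np.zeros(7, dtype=int)
e[rng.integers(7)] = 1                     # random weight-1 error (t = 1)
c = (m @ G_pub + e) % 2                    # ciphertext
# decryption: apply P.T, correct the single error by Hamming decoding,
# then undo S with its inverse mod 2
print(c)
```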
KAVEH, MOHAMMAD HOSSEIN; MORADI, LEILA; HESAMPOUR, MARYAM; HASAN ZADEH, JAFAR
2015-01-01
Introduction: Recognizing the determinants of behavior plays a major role in identification and application of effective strategies for encouraging individuals to follow the intended pattern of behavior. The present study aimed to analyze the university students' behaviors regarding the amenability to dress code, using the theory of reasoned action (TRA). Methods: In this cross sectional study, 472 students were selected through multi-stage random sampling. The data were collected using a researcher-made questionnaire whose validity was confirmed by specialists. Besides, its reliability was confirmed by conducting a pilot study revealing Cronbach's alpha coefficients of 0.93 for attitude, 0.83 for subjective norms, 0.94 for behavioral intention and 0.77 for behavior. The data were entered into the SPSS statistical software and analyzed using descriptive and inferential statistics (Mann-Whitney, correlation and regression analysis). Results: Based on the students' self-reports, conformity of clothes to the university's dress code was below the expected level in 28.87% of the female students and 28.55% of the male ones. The mean scores of attitude, subjective norms, and behavioral intention to comply with dress code policy were 28.78±10.08, 28.51±8.25 and 11.12±3.84, respectively. The students of different colleges were different from each other concerning TRA constructs. Yet, subjective norms played a more critical role in explaining the variance of dress code behavior among the students. Conclusion: Theory of reasoned action explained the students' dress code behaviors relatively well. The study results suggest paying attention to appropriate approaches in educational, cultural activities, including promotion of student-teacher communication. PMID:26269790
Neural correlates of concreteness in semantic categorization.
Pexman, Penny M; Hargreaves, Ian S; Edwards, Jodi D; Henry, Luke C; Goodyear, Bradley G
2007-08-01
In some contexts, concrete words (CARROT) are recognized and remembered more readily than abstract words (TRUTH). This concreteness effect has historically been explained by two theories of semantic representation: dual-coding [Paivio, A. Dual coding theory: Retrospect and current status. Canadian Journal of Psychology, 45, 255-287, 1991] and context-availability [Schwanenflugel, P. J. Why are abstract concepts hard to understand? In P. J. Schwanenflugel (Ed.), The psychology of word meanings (pp. 223-250). Hillsdale, NJ: Erlbaum, 1991]. Past efforts to adjudicate between these theories using functional magnetic resonance imaging have produced mixed results. Using event-related functional magnetic resonance imaging, we reexamined this issue with a semantic categorization task that allowed for uniform semantic judgments of concrete and abstract words. The participants were 20 healthy adults. Functional analyses contrasted activation associated with concrete and abstract meanings of ambiguous and unambiguous words. Results showed that for both ambiguous and unambiguous words, abstract meanings were associated with more widespread cortical activation than concrete meanings in numerous regions associated with semantic processing, including temporal, parietal, and frontal cortices. These results are inconsistent with both dual-coding and context-availability theories, as these theories propose that the representations of abstract concepts are relatively impoverished. Our results suggest, instead, that semantic retrieval of abstract concepts involves a network of association areas. We argue that this finding is compatible with a theory of semantic representation such as Barsalou's [Barsalou, L. W. Perceptual symbol systems. Behavioral & Brain Sciences, 22, 577-660, 1999] perceptual symbol systems, whereby concrete and abstract concepts are represented by similar mechanisms but with differences in focal content.
NASA Astrophysics Data System (ADS)
Harman, C. J.
2015-12-01
Even amongst the academic community, new theoretical tools can remain underutilized due to the investment of time and resources required to understand and implement them. This surely limits the frequency that new theory is rigorously tested against data by scientists outside the group that developed it, and limits the impact that new tools could have on the advancement of science. Reducing the barriers to adoption through online education and open-source code can bridge the gap between theory and data, forging new collaborations, and advancing science. A pilot venture aimed at increasing the adoption of a new theory of time-variable transit time distributions was begun in July 2015 as a collaboration between Johns Hopkins University and The Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI). There were four main components to the venture: a public online seminar covering the theory, an open source code repository, a virtual short course designed to help participants apply the theory to their data, and an online forum to maintain discussion and build a community of users. 18 participants were selected for the non-public components based on their responses in an application, and were asked to fill out a course evaluation at the end of the short course, and again several months later. These evaluations, along with participation in the forum and on-going contact with the organizer suggest strengths and weaknesses in this combination of components to assist participants in adopting new tools.
NASA Technical Reports Server (NTRS)
Pindera, Marek-Jerzy; Aboudi, Jacob
2000-01-01
The objective of this two-year project was to develop and deliver to the NASA-Glenn Research Center a two-dimensional higher-order theory, and related computer codes, for the analysis and design of cylindrical functionally graded materials/structural components for use in advanced aircraft engines (e.g., combustor linings, rotor disks, heat shields, blisk blades). To satisfy this objective, a two-dimensional version of the higher-order theory, HOTCFGM-2D, and four computer codes based on this theory, for the analysis and design of structural components functionally graded in the radial and circumferential directions, were developed in the cylindrical coordinate system r-Theta-z. This version of the higher-order theory is a significant generalization of the one-dimensional theory, HOTCFGM-1D, developed during FY97 for the analysis and design of cylindrical structural components with radially graded microstructures. The generalized theory is applicable to thin multi-phased composite shells/cylinders subjected to steady-state thermomechanical, transient thermal and inertial loading applied uniformly along the axial direction such that the overall deformation is characterized by a constant average axial strain. The reinforcement phases are uniformly distributed in the axial direction, and arbitrarily distributed in the radial and circumferential direction, thereby allowing functional grading of the internal reinforcement in the r-Theta plane. The four computer codes fgmc3dq.cylindrical.f, fgmp3dq.cylindrical.f, fgmgvips3dq.cylindrical.f, and fgmc3dq.cylindrical.transient.f are research-oriented codes for investigating the effect of functionally graded architectures, as well as the properties of the multi-phase reinforcement, in thin shells subjected to thermomechanical and inertial loading, on the internal temperature, stress and (inelastic) strain fields. The reinforcement distribution in the radial and circumferential directions is specified by the user. The thermal and inelastic properties of the individual phases can vary with temperature. The inelastic phases are presently modeled by the power-law creep model generalized to multi-directional loading (within fgmc3dq.cylindrical.f and fgmc3dq.cylindrical.transient.f for steady-state and transient thermal loading, respectively), and incremental plasticity and GVIPS unified viscoplasticity theories (within the steady-state loading versions fgmp3dq.cylindrical.f and fgmgvips3dq.cylindrical.f).
Bilinguals' Creativity and Syntactic Theory: Evidence for Emerging Grammar.
ERIC Educational Resources Information Center
Bhatia, Tej K.
1989-01-01
Examines a code-mixed variety of English and Hindi called Filmi English, which reflects the linguistic influence of the Indian film industry. A corpus of more than 2,000 intrasentential code-mixed sentences drawn from a film magazine, "Stardust," is analyzed. (Author/OD)
ERIC Educational Resources Information Center
Simpson, Timothy J.
Paivio's Dual Coding Theory has received widespread recognition for its connection between visual and aural channels of internal information processing. The use of only two channels, however, cannot satisfactorily explain the effects witnessed every day. This paper presents a study suggesting the presence of a third, kinesthetic channel, currently…
Leaky gate model: intensity-dependent coding of pain and itch in the spinal cord
Sun, Shuohao; Xu, Qian; Guo, Changxiong; Guan, Yun; Liu, Qin; Dong, Xinzhong
2017-01-01
SUMMARY Coding of itch versus pain has been heatedly debated for decades. However, the current coding theories (labeled line, intensity and selectivity theory) cannot accommodate all experimental observations. Here we identified a subset of spinal interneurons, labeled by gastrin releasing peptide (Grp), that receive direct synaptic input from both pain and itch primary sensory neurons. When activated, these Grp+ neurons generated rarely-seen simultaneous robust pain and itch responses that were intensity-dependent. Accordingly, we propose a “leaky gate” model, in which Grp+ neurons transmit both itch and weak pain signals, however upon strong painful stimuli the recruitment of endogenous opioids works to close this gate, reducing overwhelming pain generated by parallel pathways. Consistent with our model, loss of these Grp+ neurons increased pain responses while itch was decreased. Our new model serves as an example of non-monotonic coding in the spinal cord and better explains observations in human psychophysical studies. PMID:28231466
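The model's signature behavior, relay of itch and weak pain but suppression under strong pain, can be caricatured in a few lines. The parameters below are assumed for illustration, not fitted quantities, and the sigmoid gate is a toy stand-in for the proposed opioid mechanism:

```python
import numpy as np

def grp_output(stimulus, theta=1.0, k=5.0):
    """Toy leaky gate: Grp+ relay scaled by an opioid 'gate' that closes
    as stimulus intensity exceeds theta (theta, k are assumed parameters)."""
    gate = 1.0 / (1.0 + np.exp(k * (stimulus - theta)))
    return stimulus * gate

for s in (0.2, 0.8, 1.5, 3.0):
    print(s, round(grp_output(s), 3))   # rises for weak input, collapses for strong
```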
The COREL and W12SC3 computer programs for supersonic wing design and analysis
NASA Technical Reports Server (NTRS)
Mason, W. H.; Rosen, B. S.
1983-01-01
Two computer codes useful in the supersonic aerodynamic design of wings, including the supersonic maneuver case, are described. The nonlinear full potential equation COREL code performs an analysis of a spanwise section of the wing in the crossflow plane by assuming conical flow over the section. A subsequent approximate correction to the solution can be made in order to account for nonconical effects. In COREL, the flowfield is assumed to be irrotational (Mach numbers normal to shock waves less than about 1.3) and the full potential equation is solved to obtain detailed results for the leading edge expansion, supercritical crossflow, and any crossflow shockwaves. W12SC3 is a linear theory panel method which combines and extends elements of several of Woodward's codes, with emphasis on fighter applications. After a brief review of the aerodynamic theory used by each method, the use of the codes is illustrated with several examples, detailed input instructions and a sample case.
Coupled-cluster based R-matrix codes (CCRM): Recent developments
NASA Astrophysics Data System (ADS)
Sur, Chiranjib; Pradhan, Anil K.
2008-05-01
We report the ongoing development of the new coupled-cluster R-matrix codes (CCRM) for treating electron-ion scattering and radiative processes within the framework of the relativistic coupled-cluster method (RCC), interfaced with the standard R-matrix methodology. The RCC method is size consistent and in principle equivalent to an all-order many-body perturbation theory. The RCC method is one of the most accurate many-body theories, and has been applied for several systems. This project should enable the study of electron-interactions with heavy atoms/ions, utilizing not only high speed computing platforms but also improved theoretical description of the relativistic and correlation effects for the target atoms/ions as treated extensively within the RCC method. Here we present a comprehensive outline of the newly developed theoretical method and a schematic representation of the new suite of CCRM codes. We begin with the flowchart and description of various stages involved in this development. We retain the notations and nomenclature of different stages as analogous to the standard R-matrix codes.
Studies of Planet Formation Using a Hybrid N-Body + Planetesimal Code
NASA Technical Reports Server (NTRS)
Kenyon, Scott J.
2004-01-01
The goal of our proposal was to use a hybrid multi-annulus planetesimal/n-body code to examine the planetesimal theory, one of the two main theories of planet formation. We developed this code to follow the evolution of numerous 1 m to 1 km planetesimals as they collide, merge, and grow into full-fledged planets. Our goal was to apply the code to several well-posed, topical problems in planet formation and to derive observational consequences of the models. We planned to construct detailed models to address two fundamental issues: (1) icy planets: models for icy planet formation will demonstrate how the physical properties of debris disks - including the Kuiper Belt in our solar system - depend on initial conditions and input physics; and (2) terrestrial planets: calculations following the evolution of 1-10 km planetesimals into Earth-mass planets and rings of dust will provide a better understanding of how terrestrial planets form and interact with their environment.
Towards Holography via Quantum Source-Channel Codes.
Pastawski, Fernando; Eisert, Jens; Wilming, Henrik
2017-07-14
While originally motivated by quantum computation, quantum error correction (QEC) is currently providing valuable insights into many-body quantum physics, such as topological phases of matter. Furthermore, mounting evidence originating from holography research (AdS/CFT) indicates that QEC should also be pertinent for conformal field theories. With this motivation in mind, we introduce quantum source-channel codes, which combine features of lossy compression and approximate quantum error correction, both of which are predicted in holography. Through a recent construction for approximate recovery maps, we derive guarantees on its erasure decoding performance from calculations of an entropic quantity called conditional mutual information. As an example, we consider Gibbs states of the transverse field Ising model at criticality and provide evidence that they exhibit nontrivial protection from local erasure. This gives rise to the first concrete interpretation of a bona fide conformal field theory as a quantum error correcting code. We argue that quantum source-channel codes are of independent interest beyond holography.
Towards Holography via Quantum Source-Channel Codes
NASA Astrophysics Data System (ADS)
Pastawski, Fernando; Eisert, Jens; Wilming, Henrik
2017-07-01
While originally motivated by quantum computation, quantum error correction (QEC) is currently providing valuable insights into many-body quantum physics, such as topological phases of matter. Furthermore, mounting evidence originating from holography research (AdS/CFT) indicates that QEC should also be pertinent for conformal field theories. With this motivation in mind, we introduce quantum source-channel codes, which combine features of lossy compression and approximate quantum error correction, both of which are predicted in holography. Through a recent construction for approximate recovery maps, we derive guarantees on its erasure decoding performance from calculations of an entropic quantity called conditional mutual information. As an example, we consider Gibbs states of the transverse field Ising model at criticality and provide evidence that they exhibit nontrivial protection from local erasure. This gives rise to the first concrete interpretation of a bona fide conformal field theory as a quantum error correcting code. We argue that quantum source-channel codes are of independent interest beyond holography.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avramova, Maria N.; Salko, Robert K.
Coolant-Boiling in Rod Arrays|Two Fluids (COBRA-TF) is a thermal/hydraulic (T/H) simulation code designed for light water reactor (LWR) vessel analysis. It uses a two-fluid, three-field (i.e., fluid film, fluid drops, and vapor) modeling approach. Both sub-channel and 3D Cartesian forms of 9 conservation equations are available for LWR modeling. The code was originally developed by Pacific Northwest Laboratory in 1980 and has been used and modified by several institutions over the last few decades. COBRA-TF also found use at the Pennsylvania State University (PSU) by the Reactor Dynamics and Fuel Management Group (RDFMG) and has been improved, updated, and subsequently re-branded as CTF. As part of the improvement process, it was necessary to generate sufficient documentation for the open-source code, which had lacked such material upon being adopted by RDFMG. This document serves mainly as a theory manual for CTF, detailing the many two-phase heat transfer, drag, and important accident scenario models contained in the code as well as the numerical solution process utilized. Coding of the models is also discussed, all with consideration for updates that have been made when transitioning from COBRA-TF to CTF. Further documentation, focused on code input deck generation and on source code global variable and module listings, is also available at RDFMG.
Pedagogical Affordances of Multiple External Representations in Scientific Processes
ERIC Educational Resources Information Center
Wu, Hsin-Kai; Puntambekar, Sadhana
2012-01-01
Multiple external representations (MERs) have been widely used in science teaching and learning. Theories such as dual coding theory and cognitive flexibility theory have been developed to explain why the use of MERs is beneficial to learning, but they do not provide much information on pedagogical issues such as how and in what conditions MERs…
NASA Technical Reports Server (NTRS)
Silva, Walter A.
1993-01-01
A methodology for modeling nonlinear unsteady aerodynamic responses, for subsequent use in aeroservoelastic analysis and design, using the Volterra-Wiener theory of nonlinear systems is presented. The methodology is extended to predict nonlinear unsteady aerodynamic responses of arbitrary frequency. The Volterra-Wiener theory uses multidimensional convolution integrals to predict the response of nonlinear systems to arbitrary inputs. The CAP-TSD (Computational Aeroelasticity Program - Transonic Small Disturbance) code is used to generate linear and nonlinear unit impulse responses that correspond to each of the integrals for a rectangular wing with a NACA 0012 section with pitch and plunge degrees of freedom. The computed kernels then are used to predict linear and nonlinear unsteady aerodynamic responses via convolution and compared to responses obtained using the CAP-TSD code directly. The results indicate that the approach can be used to predict linear unsteady aerodynamic responses exactly for any input amplitude or frequency at a significant cost savings. Convolution of the nonlinear terms results in nonlinear unsteady aerodynamic responses that compare reasonably well with those computed using the CAP-TSD code directly but at significant computational cost savings.
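In discrete time the first two terms of the Volterra expansion reduce to finite sums, y[n] = Σ_i h1[i] x[n−i] + Σ_{i,j} h2[i,j] x[n−i] x[n−j]. A small NumPy sketch with toy kernels (not the CAP-TSD-derived impulse responses) illustrates the convolution step:

```python
import numpy as np

def volterra_response(x, h1, h2):
    """Second-order discrete Volterra model:
    y[n] = sum_i h1[i] x[n-i] + sum_{i,j} h2[i,j] x[n-i] x[n-j]."""
    M = len(h1)
    xp = np.concatenate([np.zeros(M - 1), x])   # zero-pad the past
    y = np.zeros(len(x))
    for n in range(len(x)):
        w = xp[n:n + M][::-1]                   # x[n], x[n-1], ..., x[n-M+1]
        y[n] = h1 @ w + w @ h2 @ w
    return y

h1 = np.exp(-np.arange(8) / 2.0)                # toy linear (first-order) kernel
h2 = 0.05 * np.outer(h1, h1)                    # weak quadratic kernel
y = volterra_response(np.sin(np.linspace(0, 6 * np.pi, 200)), h1, h2)
```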
molgw 1: Many-body perturbation theory software for atoms, molecules, and clusters
Bruneval, Fabien; Rangel, Tonatiuh; Hamed, Samia M.; ...
2016-07-12
Here, we summarize the MOLGW code that implements density-functional theory and many-body perturbation theory in a Gaussian basis set. The code is dedicated to the calculation of the many-body self-energy within the GW approximation and the solution of the Bethe–Salpeter equation. These two types of calculations allow the user to evaluate physical quantities that can be compared to spectroscopic experiments. Quasiparticle energies, obtained through the calculation of the GW self-energy, can be compared to photoemission or transport experiments, and neutral excitation energies and oscillator strengths, obtained via solution of the Bethe–Salpeter equation, are measurable by optical absorption. The implementation choices outlined here have aimed at the accuracy and robustness of calculated quantities with respect to measurements. Furthermore, the algorithms implemented in MOLGW allow users to consider molecules or clusters containing up to 100 atoms with rather accurate basis sets, and to choose whether or not to apply the resolution-of-the-identity approximation. Finally, we demonstrate the parallelization efficacy of the MOLGW code over several hundreds of processors.
Debugging Techniques Used by Experienced Programmers to Debug Their Own Code.
1990-09-01
Keywords: code debugging; computer programmers; debug programming. …Davis, and Schultz (1987) also compared experts and novices, but focused on the way a computer program is represented cognitively and how that… of theories in the emerging computer programming domain (Fisher, 1987). In protocol analysis, subjects are asked to talk/think aloud as they solve…
Fast ITTBC using pattern code on subband segmentation
NASA Astrophysics Data System (ADS)
Koh, Sung S.; Kim, Hanchil; Lee, Kooyoung; Kim, Hongbin; Jeong, Hun; Cho, Gangseok; Kim, Chunghwa
2000-06-01
Iterated Transformation Theory-Based Coding suffers from very high computational complexity in the encoding phase, due to its exhaustive search. In this paper, our proposed image coding algorithm preprocesses the original image into a subband segmentation image by wavelet transform before image coding, in order to reduce encoding complexity. A similar block is searched for by using the 24 block pattern codes, which encode the edge information of the image block, on the domain pool of the subband segmentation. As a result, numerical data show that the encoding time of the proposed coding method can be reduced to 98.82% of that of Jacquin's method, while the loss in quality relative to Jacquin's is about 0.28 dB in PSNR, which is visually negligible.
Concreteness Effects in Text Recall: Dual Coding or Context Availability?
ERIC Educational Resources Information Center
Sadoski, Mark; And Others
1995-01-01
Extends an earlier study by using different materials, ratings for familiarity, and more stringent experimental controls. Finds concreteness effects in two experiments using undergraduate students. Suggests that familiarity and concreteness contribute separately to recall. Supports a dual coding theory. Discusses implications for text design. (RS)
ERIC Educational Resources Information Center
Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien
2013-01-01
This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…
Performance Theories for Sentence Coding: Some Quantitative Models
ERIC Educational Resources Information Center
Aaronson, Doris; And Others
1977-01-01
This study deals with the patterns of word-by-word reading times over a sentence when the subject must code the linguistic information sufficiently for immediate verbatim recall. A class of quantitative models is considered that would account for reading times at phrase breaks. (Author/RM)
Task representation in individual and joint settings
Prinz, Wolfgang
2015-01-01
This paper outlines a framework for task representation and discusses applications to interference tasks in individual and joint settings. The framework is derived from the Theory of Event Coding (TEC). This theory regards task sets as transient assemblies of event codes in which stimulus and response codes interact and shape each other in particular ways. On the one hand, stimulus and response codes compete with each other within their respective subsets (horizontal interactions). On the other hand, stimulus and response codes cooperate with each other (vertical interactions). Code interactions instantiating competition and cooperation apply to two time scales: on-line performance (i.e., doing the task) and off-line implementation (i.e., setting the task). Interference arises when stimulus and response codes overlap in features that are irrelevant for stimulus identification, but relevant for response selection. To resolve this dilemma, the feature profiles of event codes may become restructured in various ways. The framework is applied to three kinds of interference paradigms. Special emphasis is given to joint settings where tasks are shared between two participants. Major conclusions derived from these applications include: (1) Response competition is the chief driver of interference. Likewise, different modes of response competition give rise to different patterns of interference; (2) The type of features in which stimulus and response codes overlap is also a crucial factor. Different types of such features likewise give rise to different patterns of interference; and (3) Task sets for joint settings conflate intraindividual conflicts between responses (what) with interindividual conflicts between responding agents (whom). Features of response codes may, therefore, not only address responses, but also responding agents (both physically and socially). PMID:26029085
Advanced capabilities for materials modelling with Quantum ESPRESSO
NASA Astrophysics Data System (ADS)
Giannozzi, P.; Andreussi, O.; Brumme, T.; Bunau, O.; Buongiorno Nardelli, M.; Calandra, M.; Car, R.; Cavazzoni, C.; Ceresoli, D.; Cococcioni, M.; Colonna, N.; Carnimeo, I.; Dal Corso, A.; de Gironcoli, S.; Delugas, P.; DiStasio, R. A., Jr.; Ferretti, A.; Floris, A.; Fratesi, G.; Fugallo, G.; Gebauer, R.; Gerstmann, U.; Giustino, F.; Gorni, T.; Jia, J.; Kawamura, M.; Ko, H.-Y.; Kokalj, A.; Küçükbenli, E.; Lazzeri, M.; Marsili, M.; Marzari, N.; Mauri, F.; Nguyen, N. L.; Nguyen, H.-V.; Otero-de-la-Roza, A.; Paulatto, L.; Poncé, S.; Rocca, D.; Sabatini, R.; Santra, B.; Schlipf, M.; Seitsonen, A. P.; Smogunov, A.; Timrov, I.; Thonhauser, T.; Umari, P.; Vast, N.; Wu, X.; Baroni, S.
2017-11-01
Quantum ESPRESSO is an integrated suite of open-source computer codes for quantum simulations of materials using state-of-the-art electronic-structure techniques, based on density-functional theory, density-functional perturbation theory, and many-body perturbation theory, within the plane-wave pseudopotential and projector-augmented-wave approaches. Quantum ESPRESSO owes its popularity to the wide variety of properties and processes it allows one to simulate, to its performance on an increasingly broad array of hardware architectures, and to a community of researchers that rely on its capabilities as a core open-source development platform to implement their ideas. In this paper we describe recent extensions and improvements, covering new methodologies and property calculators, improved parallelization, code modularization, and extended interoperability both within the distribution and with external software.
Hartland, William; Biddle, Chuck; Fallacaro, Michael
2008-06-01
This article explores the application of Paivio's Dual Coding Theory (DCT) as a scientifically sound rationale for the effects of multimedia learning in programs of nurse anesthesia. We explore and highlight this theory as a practical infrastructure for programs that work with dispersed students (i.e., distance education models). Building on the work of Paivio and others, we are engaged in an ongoing outcome study using audiovisual teaching interventions (SBVTIs) applied to a range of healthcare providers in a quasi-experimental model. The early results of that study are reported in this article. In addition, we have observed powerful and sustained learning in a wide range of healthcare providers with our SBVTIs and suggest that this is likely explained by DCT.
The random energy model in a magnetic field and joint source channel coding
NASA Astrophysics Data System (ADS)
Merhav, Neri
2008-09-01
We demonstrate that there is an intimate relationship between the magnetic properties of Derrida’s random energy model (REM) of spin glasses and the problem of joint source-channel coding in Information Theory. In particular, typical patterns of erroneously decoded messages in the coding problem have “magnetization” properties that are analogous to those of the REM in certain phases, where the non-uniformity of the distribution of the source in the coding problem plays the role of an external magnetic field applied to the REM. We also relate the ensemble performance (random coding exponents) of joint source-channel codes to the free energy of the REM in its different phases.
Advanced technology development for image gathering, coding, and processing
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.
1990-01-01
Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.
Treating electron transport in MCNP(TM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, H.G.
1996-12-31
The transport of electrons and other charged particles is fundamentally different from that of neutrons and photons. A neutron slowing down in aluminum from 0.5 MeV to 0.0625 MeV will have about 30 collisions; a photon will have fewer than ten. An electron with the same energy loss will undergo 10^5 individual interactions. This great increase in computational complexity makes a single-collision Monte Carlo approach to electron transport infeasible for many situations of practical interest. Considerable theoretical work has been done to develop a variety of analytic and semi-analytic multiple-scattering theories for the transport of charged particles. The theories used in the algorithms in MCNP are the Goudsmit-Saunderson theory for angular deflections, the Landau theory of energy-loss fluctuations, and the Blunck-Leisegang enhancements of the Landau theory. In order to follow an electron through a significant energy loss, it is necessary to break the electron's path into many steps. These steps are chosen to be long enough to encompass many collisions (so that multiple-scattering theories are valid) but short enough that the mean energy loss in any one step is small (for the approximations in the multiple-scattering theories). The energy loss and angular deflection of the electron during each step can then be sampled from probability distributions based on the appropriate multiple-scattering theories. This subsumption of the effects of many individual collisions into single steps that are sampled probabilistically constitutes the "condensed history" Monte Carlo method. This method is exemplified in the ETRAN series of electron/photon transport codes. The ETRAN codes are also the basis for the Integrated TIGER Series, a system of general-purpose, application-oriented electron/photon transport codes. The electron physics in MCNP is similar to that of the Integrated TIGER Series.
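To make the condensed-history idea concrete, here is a minimal Python sketch in which each simulated step aggregates many collisions. The Gaussian straggling and deflection distributions and all numerical parameters are illustrative placeholders, not the Landau/Blunck-Leisegang or Goudsmit-Saunderson distributions MCNP actually samples:

```python
import numpy as np

rng = np.random.default_rng(0)

def condensed_history(E0_MeV=0.5, E_cut_MeV=0.0625, frac_loss=0.05):
    """Follow one electron by steps, not individual collisions.

    Each step subsumes many collisions: the mean energy loss per step is a
    small fraction of the current energy, and the energy-loss straggling and
    angular deflection are drawn from placeholder Gaussians (stand-ins for
    the proper multiple-scattering theories).
    """
    E, theta, steps = E0_MeV, 0.0, 0
    while E > E_cut_MeV:
        mean_loss = frac_loss * E
        loss = max(rng.normal(mean_loss, 0.2 * mean_loss), 0.0)  # straggling
        theta += rng.normal(0.0, 0.05)  # accumulated small-angle deflection
        E -= loss
        steps += 1
    return E, theta, steps

E, theta, steps = condensed_history()
print(f"stopped at {E:.4f} MeV after {steps} steps, net deflection {theta:.3f} rad")
```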
Atmospheric Dispersion Capability for T2VOC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, Curtis M.
2005-09-19
Atmospheric transport by variable-K theory dispersion has been added to T2VOC. The new code, T2VOCA, models flow and transport in the subsurface identically to T2VOC, but also includes the capability for modeling passive multicomponent variable-K theory dispersion in an atmospheric region assumed to be flat and horizontal, with a logarithmic wind profile. The specification of the logarithmic wind profile in the T2VOC input file is automated through the use of a build code called ATMDISPV. The new capability is demonstrated on 2-D and 3-D example problems described in this report.
Composite Nanomechanics: A Mechanistic Properties Prediction
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Handler, Louis M.; Manderscheid, Jane M.
2007-01-01
A unique mechanistic theory is described to predict the properties of nanocomposites. The theory is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to predict 25 properties of a mononanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. Most of the results show smooth distributions. Results for matrix-dependent properties show bimodal through-the-thickness distribution with discontinuous changes from mode to mode.
Construction of self-dual codes in the Rosenbloom-Tsfasman metric
NASA Astrophysics Data System (ADS)
Krisnawati, Vira Hari; Nisa, Anzi Lina Ukhtin
2017-12-01
Linear codes are among the most basic and useful objects in coding theory; classically, a linear code is a code over a finite field equipped with the Hamming metric. Among the most interesting families of codes is the family of self-dual codes, which includes some of the best-known error-correcting codes. The Hamming metric has been generalized to the Rosenbloom-Tsfasman metric (RT-metric), whose inner product differs from the Euclidean inner product used to define duality in the Hamming metric; most codes that are self-dual in the Hamming metric are not self-dual in the RT-metric. Moreover, the generator matrix is central to constructing a code, because it contains a basis of the code. In this paper, therefore, we give theorems and methods for constructing self-dual codes in the RT-metric by considering properties of the inner product and of the generator matrix, and we illustrate each kind of construction with examples.
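For contrast with the RT-metric constructions described above, here is a short Python check of classical Hamming-metric self-duality, using the extended Hamming [8,4,4] code over GF(2) as a standard textbook example. The RT-metric uses a different pairing, so this test would not certify RT self-duality:

```python
import numpy as np

# Generator matrix of the extended Hamming [8,4,4] code over GF(2),
# a classical self-dual code in the Hamming metric.
G = np.array([[1, 0, 0, 0, 0, 1, 1, 1],
              [0, 1, 0, 0, 1, 0, 1, 1],
              [0, 0, 1, 0, 1, 1, 0, 1],
              [0, 0, 0, 1, 1, 1, 1, 0]])

# Self-duality under the Euclidean inner product: C = C^perp iff
# G @ G^T == 0 (mod 2) and k = n/2.
gram = (G @ G.T) % 2
k, n = G.shape
print("self-orthogonal:", not gram.any(),
      "| self-dual:", not gram.any() and 2 * k == n)
```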
Interactive boundary-layer calculations of a transonic wing flow
NASA Technical Reports Server (NTRS)
Kaups, Kalle; Cebeci, Tuncer; Mehta, Unmeel
1989-01-01
Results obtained from iterative solutions of inviscid and boundary-layer equations are presented and compared with experimental values. The inviscid-flow solutions were obtained with an Euler code and a transonic full-potential code; these were interacted with solutions of the two-dimensional boundary-layer equations under a strip-theory approximation. The Euler code results are in better agreement with the experimental data than those of the full-potential code, especially in the presence of shock waves, with the sole exception of the near-tip region.
From theory to practice: integrating instructional technology into veterinary medical education.
Wang, Hong; Rush, Bonnie R; Wilkerson, Melinda; Herman, Cheryl; Miesner, Matt; Renter, David; Gehring, Ronette
2013-01-01
Technology has changed the landscape of teaching and learning. The integration of instructional technology into teaching for meaningful learning is an issue for all educators to consider. In this article, we introduce educational theories including constructivism, information-processing theory, and dual-coding theory, along with the seven principles of good practice in undergraduate education. We also discuss five practical instructional strategies and the relationship of these strategies to the educational theories. From theory to practice, the purpose of the article is to share our application of educational theory and practice to work toward more innovative teaching in veterinary medical education.
Andrade, Xavier; Aspuru-Guzik, Alán
2013-10-08
We discuss the application of graphical processing units (GPUs) to accelerate real-space density functional theory (DFT) calculations. To make our implementation efficient, we have developed a scheme to expose the data parallelism available in the DFT approach; this is applied to the different procedures required for a real-space DFT calculation. We present results for current-generation GPUs from AMD and Nvidia, which show that our scheme, implemented in the free code Octopus, can reach a sustained performance of up to 90 GFlops for a single GPU, representing a significant speed-up when compared to the CPU version of the code. Moreover, for some systems, our implementation can outperform a GPU Gaussian basis set code, showing that the real-space approach is a competitive alternative for DFT simulations on GPUs.
Criticality Calculations with MCNP6 - Practical Lectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise
2016-11-29
These slides are used to teach MCNP (Monte Carlo N-Particle) usage to nuclear criticality safety analysts. The lecture topics are: course information, introduction, MCNP basics, criticality calculations, advanced geometry, tallies, adjoint-weighted tallies and sensitivities, physics and nuclear data, parameter studies, NCS validation I, NCS validation II, NCS validation III, case study 1 - solution tanks, case study 2 - fuel vault, case study 3 - B&W core, case study 4 - simple TRIGA, case study 5 - fissile mat. vault, criticality accident alarm systems. After completion of this course, you should be able to: develop an input model for MCNP; describe how cross section data impact Monte Carlo and deterministic codes; describe the importance of validation of computer codes and how it is accomplished; describe the methodology supporting Monte Carlo codes and deterministic codes; describe pitfalls of Monte Carlo calculations; and discuss the strengths and weaknesses of Monte Carlo and Discrete Ordinates codes. The diffusion theory model is not strictly valid for treating fissile systems in which neutron absorption, voids, and/or material boundaries are present; in the context of these limitations, you should be able to identify a fissile system for which a diffusion theory solution would be adequate.
Mangling Expertise Using Post-Coding Analysis to Complexify Teacher Learning
ERIC Educational Resources Information Center
Mills, Tammy
2017-01-01
A recent movement in teacher education research encompasses working with and through theory. In response to the call from Jackson and Mazzei (2013) to use theory to think with data and use data to think with theory, the author hopes to portray the complexities of teacher learning by avoiding models of teacher learning and development that tend to…
Foundations for Ethical Standards and Codes: The Role of Moral Philosophy and Theory in Ethics
ERIC Educational Resources Information Center
Freeman, Stephen J.; Engels, Dennis W.; Altekruse, Michael K.
2004-01-01
Ethical practice is a concern for all who practice in the psychological, social, and behavioral sciences. A central problem is discerning what action is ethically correct in a particular situation. It has been said that there is nothing so practical as good theory, because theory can help counselors organize and integrate knowledge. It seems,…
ERIC Educational Resources Information Center
Smith, Adrina O.
2017-01-01
This study examined the latent factors of dialect variation as they relate to reading achievement of second grade students. Sociocultural theory, identity theories, and critical theory used against a metaphorical backdrop of a bundle of locks were used to illustrate the complexity of language variation and its effect on reading achievement within…
Imagery and Verbal Coding Approaches in Chinese Vocabulary Instruction
ERIC Educational Resources Information Center
Shen, Helen H.
2010-01-01
This study consists of two instructional experiments. Within the framework of dual coding theory, the study compares the learning effects of two instructional encoding methods used in Chinese vocabulary instruction among students learning beginning Chinese as a foreign language. One method uses verbal encoding only, and the other method uses…
Differential Cross Section Kinematics for 3-dimensional Transport Codes
NASA Technical Reports Server (NTRS)
Norbury, John W.; Dick, Frank
2008-01-01
In support of the development of 3-dimensional transport codes, this paper derives the relevant relativistic particle kinematic theory. Formulas are given for invariant, spectral, and angular distributions in both the lab (spacecraft) and center-of-momentum frames, for collisions involving 2-, 3-, and n-body final states.
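As a reminder of the frame relations such kinematics rests on, here are the standard invariant-mass identities (general textbook relations, not equations quoted from the paper):

```latex
% Lorentz-invariant Mandelstam variable for a 2-body collision (c = 1):
s = (E_1 + E_2)^2 - \lvert \vec{p}_1 + \vec{p}_2 \rvert^{2}
% With particle 2 at rest in the lab (spacecraft) frame:
s = m_1^2 + m_2^2 + 2\, m_2 E_1^{\mathrm{lab}},
\qquad \sqrt{s} = E_{\mathrm{cm}} .
```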
An Application of Discrete Mathematics to Coding Theory.
ERIC Educational Resources Information Center
Donohoe, L. Joyce
1992-01-01
Presents a public-key cryptosystem application to introduce students to several topics in discrete mathematics. A computer algorithm using recursive methods is presented to solve a problem in which one person wants to send a coded message to a second person while keeping the message secret from a third person. (MDH)
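A toy sketch of the kind of public-key scheme built from remainder arithmetic and recursion that such a classroom activity might use: RSA with deliberately tiny primes. The specific numbers are illustrative, not taken from the article:

```python
def power_mod(base, exp, m):
    """Recursive square-and-multiply: base**exp mod m."""
    if exp == 0:
        return 1 % m
    half = power_mod(base, exp // 2, m)
    out = (half * half) % m
    return (out * base) % m if exp % 2 else out

# Toy RSA with tiny primes (illustration only -- real keys are enormous).
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17               # public exponent, coprime to phi
d = pow(e, -1, phi)  # private exponent (modular inverse, Python 3.8+)

msg = 42
cipher = power_mod(msg, e, n)   # anyone with (n, e) can encrypt
plain = power_mod(cipher, d, n)  # only the holder of d can decrypt
print(cipher, plain)             # plain == 42
```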
A Code of Ethics for Democratic Leadership
ERIC Educational Resources Information Center
Molina, Ricardo; Klinker, JoAnn Franklin
2012-01-01
Democratic leadership rests on sacred values, awareness, judgement, motivation and courage. Four turning points in a 38-year school administrator's career revealed decision-making in problematic moments stemmed from values in a personal and professional code of ethics. Reflection on practice and theory added vocabulary and understanding to make…
A Coding Scheme to Analyse the Online Asynchronous Discussion Forums of University Students
ERIC Educational Resources Information Center
Biasutti, Michele
2017-01-01
The current study describes the development of a content analysis coding scheme to examine transcripts of online asynchronous discussion groups in higher education. The theoretical framework comprises the theories regarding knowledge construction in computer-supported collaborative learning (CSCL) based on a sociocultural perspective. The coding…
NASA Technical Reports Server (NTRS)
Pindera, Marek-Jerzy; Aboudi, Jacob
1998-01-01
The objective of this three-year project was to develop and deliver to NASA Lewis one-dimensional and two-dimensional higher-order theories, and related computer codes, for the analysis, optimization and design of cylindrical functionally graded materials/structural components for use in advanced aircraft engines (e.g., combustor linings, rotor disks, heat shields, blisk blades). To satisfy this objective, a quasi one-dimensional version of the higher-order theory, HOTCFGM-1D, and four computer codes based on this theory, for the analysis, design and optimization of cylindrical structural components functionally graded in the radial direction were developed. The theory is applicable to thin multi-phased composite shell/cylinders subjected to macroscopically axisymmetric thermomechanical and inertial loading applied uniformly along the axial direction such that the overall deformation is characterized by a constant average axial strain. The reinforcement phases are uniformly distributed in the axial and circumferential directions, and arbitrarily distributed in the radial direction, thereby allowing functional grading of the internal reinforcement in this direction.
Statistical mechanics of broadcast channels using low-density parity-check codes.
Nakamura, Kazutaka; Kabashima, Yoshiyuki; Morelos-Zaragoza, Robert; Saad, David
2003-03-01
We investigate the use of Gallager's low-density parity-check (LDPC) codes in a degraded broadcast channel, one of the fundamental models in network information theory. Combining linear codes is a standard technique in practical network communication schemes and is known to provide better performance than simple time sharing methods when algebraic codes are used. The statistical physics based analysis shows that the practical performance of the suggested method, achieved by employing the belief propagation algorithm, is superior to that of LDPC based time sharing codes while the best performance, when received transmissions are optimally decoded, is bounded by the time sharing limit.
Error-Rate Bounds for Coded PPM on a Poisson Channel
NASA Technical Reports Server (NTRS)
Moision, Bruce; Hamkins, Jon
2009-01-01
Equations for computing tight bounds on error rates for coded pulse-position modulation (PPM) on a Poisson channel at high signal-to-noise ratio have been derived. These equations and elements of the underlying theory are expected to be especially useful in designing codes for PPM optical communication systems. The equations and the underlying theory apply, more specifically, to a case in which a) At the transmitter, a linear outer code is concatenated with an inner code that includes an accumulator and a bit-to-PPM-symbol mapping (see figure) [this concatenation is known in the art as "accumulate-PPM" (abbreviated "APPM")]; b) The transmitted signal propagates on a memoryless binary-input Poisson channel; and c) At the receiver, near-maximum-likelihood (ML) decoding is effected through an iterative process. Such a coding/modulation/decoding scheme is a variation on the concept of turbo codes, which have complex structures, such that an exact analytical expression for the performance of a particular code is intractable. However, techniques for accurately estimating the performances of turbo codes have been developed. The performance of a typical turbo code includes (1) a "waterfall" region consisting of a steep decrease of error rate with increasing signal-to-noise ratio (SNR) at low to moderate SNR, and (2) an "error floor" region with a less steep decrease of error rate with increasing SNR at moderate to high SNR. The techniques used heretofore for estimating performance in the waterfall region have differed from those used for estimating performance in the error-floor region. For coded PPM, prior to the present derivations, equations for accurate prediction of the performance of coded PPM at high SNR did not exist, so that it was necessary to resort to time-consuming simulations in order to make such predictions. The present derivation makes it unnecessary to perform such time-consuming simulations.
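The channel model underlying these bounds can be illustrated with a short Monte Carlo of uncoded M-ary PPM on a Poisson channel. The values of M, ns, and nb below are arbitrary illustrative choices; this estimates the raw symbol error rate, not the APPM bounds themselves:

```python
import numpy as np

rng = np.random.default_rng(1)

def ppm_symbol_error_rate(M=16, ns=5.0, nb=0.2, trials=200_000):
    """Monte Carlo symbol-error rate for uncoded M-ary PPM on a Poisson
    channel: the signal slot has mean ns+nb photon counts, the other M-1
    slots mean nb; ML detection picks the slot with the most counts,
    breaking ties uniformly at random."""
    counts = rng.poisson(nb, size=(trials, M))
    counts[:, 0] = rng.poisson(ns + nb, size=trials)  # slot 0 carries the pulse
    maxima = counts.max(axis=1)
    winners = (counts == maxima[:, None])
    # probability that slot 0 wins under random tie-breaking
    p_correct = np.mean(winners[:, 0] / winners.sum(axis=1))
    return 1.0 - p_correct

print(f"SER ~ {ppm_symbol_error_rate():.4f}")
```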
Evaluation of the efficiency and fault density of software generated by code generators
NASA Technical Reports Server (NTRS)
Schreur, Barbara
1993-01-01
Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. The software development requires the generation of a considerable amount of code. The engineers who generate the code make mistakes, and the generation of a large body of code with high reliability requires considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs supplied through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking: some check only the finished product, while others allow checking of individual modules and combined sets of modules as well. Considering NASA's requirement for reliability, comparison against in-house manually generated code is needed. Furthermore, automatically generated code is reputed to be as efficient as the best manually generated code when executed, so in-house verification is warranted.
Rotor Performance at High Advance Ratio: Theory versus Test
NASA Technical Reports Server (NTRS)
Harris, Franklin D.
2008-01-01
Five analytical tools have been used to study rotor performance at high advance ratio. One is representative of autogyro rotor theory in 1934 and four are representative of helicopter rotor theory in 2008. The five theories are measured against three sets of well-documented, full-scale, isolated-rotor performance experiments. The major finding of this study is that the decades spent by many rotorcraft theoreticians improving prediction of basic rotor aerodynamic performance have paid off. This payoff, illustrated by comparing the CAMRAD II comprehensive code and the Wheatley & Bailey theory to H-34 test data, shows that rational rotor lift-to-drag ratios are now predictable. The 1934 theory predicted L/D ratios as high as 15; CAMRAD II predictions compared well with H-34 test data having L/D ratios more on the order of 7 to 9. However, detailed examination of the selected codes against the H-34 test data indicates that not one of the codes can predict, to engineering accuracy above an advance ratio of 0.62, the control positions and shaft angle of attack required for a given lift. No full-scale rotor performance data are available for advance ratios above 1.0, and extrapolation of currently available data to advance ratios on the order of 2.0 is unreasonable despite the needs of future rotorcraft. It is therefore recommended that an overly strong full-scale rotor blade set be obtained and tested in a suitable wind tunnel to at least an advance ratio of 2.5. A tail rotor from a Sikorsky CH-53 or other large single-rotor helicopter should be adequate for this exploratory experiment.
Generalized Bezout's Theorem and its applications in coding theory
NASA Technical Reports Server (NTRS)
Berg, Gene A.; Feng, Gui-Liang; Rao, T. R. N.
1996-01-01
This paper presents a generalized Bezout theorem which can be used to determine a tighter lower bound of the number of distinct points of intersection of two or more curves for a large class of plane curves. A new approach to determine a lower bound on the minimum distance (and also the generalized Hamming weights) for algebraic-geometric codes defined from a class of plane curves is introduced, based on the generalized Bezout theorem. Examples of more efficient linear codes are constructed using the generalized Bezout theorem and the new approach. For d = 4, the linear codes constructed by the new construction are better than or equal to the known linear codes. For d greater than 5, these new codes are better than the known codes. The Klein code over GF(2(sup 3)) is also constructed.
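For reference, the classical statement that the generalized version above tightens (standard algebraic geometry, not quoted from the paper):

```latex
% Bezout's theorem: plane projective curves C_m, C_n of degrees m and n
% with no common component satisfy
\#\,\bigl( C_m \cap C_n \bigr) \;\le\; m\,n ,
% with equality when intersection points are counted with multiplicity
% over an algebraically closed field.
```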
The random coding bound is tight for the average code.
NASA Technical Reports Server (NTRS)
Gallager, R. G.
1973-01-01
The random coding bound of information theory provides a well-known upper bound to the probability of decoding error for the best code of a given rate and block length. The bound is constructed by upperbounding the average error probability over an ensemble of codes. The bound is known to give the correct exponential dependence of error probability on block length for transmission rates above the critical rate, but it gives an incorrect exponential dependence at rates below a second lower critical rate. Here we derive an asymptotic expression for the average error probability over the ensemble of codes used in the random coding bound. The result shows that the weakness of the random coding bound at rates below the second critical rate is due not to upperbounding the ensemble average, but rather to the fact that the best codes are much better than the average at low rates.
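The bound in question, in its standard Gallager form (textbook notation; the paper's asymptotic refinement of the ensemble average is not reproduced here):

```latex
% Random coding bound for block length N and rate R:
\bar{P}_e \;\le\; e^{-N E_r(R)},
\qquad
E_r(R) \;=\; \max_{0 \le \rho \le 1}\; \max_{Q}\;
\bigl[\, E_0(\rho, Q) - \rho R \,\bigr],
% where, for a discrete memoryless channel P(y|x) and input distribution Q,
E_0(\rho, Q) \;=\; -\ln \sum_{y} \Bigl[ \sum_{x} Q(x)\, P(y \mid x)^{1/(1+\rho)} \Bigr]^{1+\rho} .
```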
New nonbinary quantum codes with larger distance constructed from BCH codes over 𝔽q2
NASA Astrophysics Data System (ADS)
Xu, Gen; Li, Ruihu; Fu, Qiang; Ma, Yuena; Guo, Luobin
2017-03-01
This paper concentrates on the construction of new nonbinary quantum error-correcting codes (QECCs) from three classes of narrow-sense imprimitive BCH codes over the finite field 𝔽q2 (q ≥ 3 an odd prime power). By a careful analysis of the properties of cyclotomic cosets in the defining set T of these BCH codes, the improved maximal designed distance of these narrow-sense imprimitive Hermitian dual-containing BCH codes is determined to be much larger than the result given by Aly et al. [S. A. Aly, A. Klappenecker and P. K. Sarvepalli, IEEE Trans. Inf. Theory 53, 1183 (2007)] for each code length. Thus families of new nonbinary QECCs are constructed, and the newly obtained QECCs have larger distance than those in the previous literature.
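A small Python helper for the cyclotomic-coset bookkeeping behind such defining-set analyses. The modulus n = 40 and q = 3 below are arbitrary illustrative choices, not lengths from the paper:

```python
def cyclotomic_coset(i, q, n):
    """q-cyclotomic coset of i modulo n: {i * q**j mod n}."""
    coset, x = set(), i % n
    while x not in coset:
        coset.add(x)
        x = (x * q) % n
    return sorted(coset)

# q^2-cyclotomic cosets (q = 3) modulo n = 40 -- the kind of bookkeeping
# behind the defining set T of a BCH code over F_{q^2}.
q, n = 3, 40
seen, cosets = set(), []
for i in range(n):
    if i not in seen:
        c = cyclotomic_coset(i, q * q, n)
        cosets.append(c)
        seen.update(c)
print(cosets)
```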
Density matrix perturbation theory for magneto-optical response of periodic insulators
NASA Astrophysics Data System (ADS)
Lebedeva, Irina; Tokatly, Ilya; Rubio, Angel
2015-03-01
Density matrix perturbation theory offers an ideal theoretical framework for describing the response of solids to arbitrary electromagnetic fields. In particular, it allows one to consider perturbations introduced by uniform electric and magnetic fields under periodic boundary conditions, even though the corresponding potentials break the translational invariance of the Hamiltonian. We have implemented density matrix perturbation theory in the open-source Octopus code on the basis of the efficient Sternheimer approach. The procedures for responses of different orders to electromagnetic fields, including electric polarizability, orbital magnetic susceptibility, and magneto-optical response, have been developed and tested by comparison with the results for finite systems and for wavefunction-based perturbation theory, which is already available in the code. Additional analysis of the orbital magneto-optical response is performed on the basis of analytical models. Symmetry limitations to observation of the magneto-optical response are discussed. The financial support from the Marie Curie Fellowship PIIF-GA-2012-326435 (RespSpatDisp) is gratefully acknowledged.
SIMINOFF, LAURA A.; STEP, MARY M.
2011-01-01
Many observational coding schemes have been offered to measure communication in health care settings. These schemes fall short of capturing multiple functions of communication among providers, patients, and other participants. After a brief review of observational communication coding, the authors present a comprehensive scheme for coding communication that is (a) grounded in communication theory, (b) accounts for instrumental and relational communication, and (c) captures important contextual features with tailored coding templates: the Siminoff Communication Content & Affect Program (SCCAP). To test SCCAP reliability and validity, the authors coded data from two communication studies. The SCCAP provided reliable measurement of communication variables including tailored content areas and observer ratings of speaker immediacy, affiliation, confirmation, and disconfirmation behaviors. PMID:21213170
NASA Technical Reports Server (NTRS)
Bittker, D. A.; Scullin, V. J.
1984-01-01
A general chemical kinetics code is described for complex, homogeneous ideal-gas reactions in any chemical system. The main features of the GCKP84 code are flexibility, convenience, and speed of computation for many different reaction conditions. The code, which replaces the previously published GCKP code, numerically solves the differential equations for complex reactions in a batch system or one-dimensional inviscid flow. It also numerically solves the nonlinear algebraic equations describing the well-stirred reactor. A new state-of-the-art numerical integration method is used for greatly increased speed in handling systems of stiff differential equations. The theory and the computer program, including details of input preparation and a guide to using the code, are given.
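The stiff-ODE machinery such kinetics codes depend on can be illustrated with SciPy's BDF integrator on Robertson's classic stiff kinetics problem. This is a standard test case standing in for GCKP84's own integrator, which is not reproduced here:

```python
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    """Robertson's classic stiff chemical-kinetics test problem:
    three species with rate constants spanning nine orders of magnitude."""
    y1, y2, y3 = y
    return [-0.04 * y1 + 1.0e4 * y2 * y3,
             0.04 * y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2**2,
             3.0e7 * y2**2]

# An implicit (BDF) method handles the stiffness that would cripple an
# explicit integrator at this time span.
sol = solve_ivp(robertson, (0.0, 1.0e5), [1.0, 0.0, 0.0],
                method="BDF", rtol=1e-6, atol=1e-10)
print(sol.y[:, -1])  # composition near the end of the run
```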
CBT Specific Process in Exposure-Based Treatments: Initial Examination in a Pediatric OCD Sample
Benito, Kristen Grabill; Conelea, Christine; Garcia, Abbe M.; Freeman, Jennifer B.
2012-01-01
Cognitive-Behavioral theory and empirical support suggest that optimal activation of fear is a critical component for successful exposure treatment. Using this theory, we developed coding methodology for measuring CBT-specific process during exposure. We piloted this methodology in a sample of young children (N = 18) who previously received CBT as part of a randomized controlled trial. Results supported the preliminary reliability and predictive validity of coding variables with 12 week and 3 month treatment outcome data, generally showing results consistent with CBT theory. However, given our limited and restricted sample, additional testing is warranted. Measurement of CBT-specific process using this methodology may have implications for understanding mechanism of change in exposure-based treatments and for improving dissemination efforts through identification of therapist behaviors associated with improved outcome. PMID:22523609
SurfKin: an ab initio kinetic code for modeling surface reactions.
Le, Thong Nguyen-Minh; Liu, Bin; Huynh, Lam K
2014-10-05
In this article, we describe a C/C++ program called SurfKin (Surface Kinetics) to construct microkinetic mechanisms for modeling gas-surface reactions. Thermodynamic properties of reaction species are estimated based on density functional theory calculations and statistical mechanics. Rate constants for elementary steps (including adsorption, desorption, and chemical reactions on surfaces) are calculated using the classical collision theory and transition state theory. Methane decomposition and water-gas shift reaction on Ni(111) surface were chosen as test cases to validate the code implementations. The good agreement with literature data suggests this is a powerful tool to facilitate the analysis of complex reactions on surfaces, and thus it helps to effectively construct detailed microkinetic mechanisms for such surface reactions. SurfKin also opens a possibility for designing nanoscale model catalysts. Copyright © 2014 Wiley Periodicals, Inc.
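A minimal sketch of the transition-state-theory rate evaluation mentioned above, in the Eyring form; the 90 kJ/mol barrier and 500 K temperature are hypothetical, and SurfKin's actual interface is not shown here:

```python
import math

KB = 1.380649e-23    # Boltzmann constant, J/K
H  = 6.62607015e-34  # Planck constant, J*s
R  = 8.314462618     # gas constant, J/(mol*K)

def eyring_rate(delta_g_act_kj_mol, T=500.0, kappa=1.0):
    """Transition-state-theory rate constant (Eyring form):
    k = kappa * (kB*T/h) * exp(-DeltaG_act / (R*T)),
    with DeltaG_act the activation free energy per mole."""
    return kappa * (KB * T / H) * math.exp(-delta_g_act_kj_mol * 1e3 / (R * T))

# e.g. a surface elementary step with a hypothetical 90 kJ/mol barrier at 500 K
print(f"k = {eyring_rate(90.0):.3e} s^-1")
```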
Educating CPE supervisors: a grounded theory study.
Ragsdale, Judith R; Holloway, Elizabeth L; Ivy, Steven S
2009-01-01
This qualitative study was designed to cull the wisdom of CPE supervisors doing especially competent supervisory education and to develop a theory of CPE supervisory education. Grounded theory methodology included interviewing 11 supervisors and coding the data to identify themes. Four primary dimensions emerged along with a reciprocal core dimension, Supervisory Wisdom, which refers to work the supervisors do in terms of their continuing growth and development.
Coding Theory Information Theory and Radar
2005-01-01
the design and synthesis of artificial multiagent systems and for the understanding of human decision-making processes. This... altruism that may exist in a complex society. SGT derives its ability to account simultaneously for both group and individual interests from the structure of... satisficing decision theory as a model of human decision making. Multi-Attribute Decision Making: Many decision problems involve the consideration of
A Transversely Isotropic Thermoelastic Theory
NASA Technical Reports Server (NTRS)
Arnold, S. M.
1989-01-01
A continuum theory is presented for representing the thermoelastic behavior of composites that can be idealized as transversely isotropic. This theory is consistent with anisotropic viscoplastic theories being developed presently at NASA Lewis Research Center. A multiaxial statement of the theory is presented, as well as plane stress and plane strain reductions. Experimental determination of the required material parameters and their theoretical constraints are discussed. Simple homogeneously stressed elements are examined to illustrate the effect of fiber orientation on the resulting strain distribution. Finally, the multiaxial stress-strain relations are expressed in matrix form to simplify and accelerate implementation of the theory into structural analysis codes.
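For concreteness, here is the standard Voigt-notation stress-strain form for a transversely isotropic thermoelastic solid (x3 the fiber/symmetry axis, five independent elastic constants plus transverse/axial thermal-stress coefficients). This is a generic textbook form, not necessarily the exact parameterization of the theory above:

```latex
\begin{pmatrix}\sigma_{11}\\ \sigma_{22}\\ \sigma_{33}\\ \sigma_{23}\\ \sigma_{13}\\ \sigma_{12}\end{pmatrix}
=
\begin{pmatrix}
C_{11} & C_{12} & C_{13} & 0 & 0 & 0\\
C_{12} & C_{11} & C_{13} & 0 & 0 & 0\\
C_{13} & C_{13} & C_{33} & 0 & 0 & 0\\
0 & 0 & 0 & C_{44} & 0 & 0\\
0 & 0 & 0 & 0 & C_{44} & 0\\
0 & 0 & 0 & 0 & 0 & \tfrac{1}{2}(C_{11}-C_{12})
\end{pmatrix}
\begin{pmatrix}\varepsilon_{11}\\ \varepsilon_{22}\\ \varepsilon_{33}\\ 2\varepsilon_{23}\\ 2\varepsilon_{13}\\ 2\varepsilon_{12}\end{pmatrix}
-
\Delta T
\begin{pmatrix}\beta_T\\ \beta_T\\ \beta_L\\ 0\\ 0\\ 0\end{pmatrix}
```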
Optimum Vessel Performance in Evolving Nonlinear Wave Fields
2012-11-01
TEMPEST, the new nonlinear, time-domain ship motion code being developed by the Navy. The radiation and diffraction forces in the level 3.0 version of TEMPEST will be computed by the body-exact strip theory... Nonlinear responses of a ship to a seaway are being incorporated into version 3 of TEMPEST.
New q-ary quantum MDS codes with distances bigger than q/2
NASA Astrophysics Data System (ADS)
He, Xianmang; Xu, Liqing; Chen, Hao
2016-07-01
The construction of quantum MDS codes has been studied by many authors. We refer to the table on page 1482 of (IEEE Trans Inf Theory 61(3):1474-1484, 2015) for known constructions. However, only a few q-ary quantum MDS [[n, n-2d+2, d]] codes have been constructed…
Learning by Doing: Teaching Decision Making through Building a Code of Ethics.
ERIC Educational Resources Information Center
Hawthorne, Mark D.
2001-01-01
Notes that applying abstract ethical principles to the practical business of building a code of applied ethics for a technical communication department teaches students that they share certain unarticulated or unconscious values that they can translate into ethical principles. Suggests that combining abstract theory with practical policy writing…
Secret Codes, Remainder Arithmetic, and Matrices.
ERIC Educational Resources Information Center
Peck, Lyman C.
This pamphlet is designed for use as enrichment material for able junior and senior high school students who are interested in mathematics. No more than a clear understanding of basic arithmetic is expected. Students are introduced to ideas from number theory and modern algebra by learning mathematical ways of coding and decoding secret messages.…
NASA Technical Reports Server (NTRS)
Morren, Sybil Huang
1991-01-01
Transonic flow of dense gases for two-dimensional, steady-state flow over a NACA 0012 airfoil was predicted analytically. The computer code used to model the dense gas behavior was a modified version of Jameson's FLO52 airfoil code. The modifications to the code enabled modeling the dense gas behavior near the saturated vapor curve and critical pressure region where the fundamental derivative, Gamma, is negative. This negative-Gamma region is of interest because nonclassical gas behavior, such as the formation and propagation of expansion shocks and the disintegration of inadmissible compression shocks, may exist there. The results indicated that dense gases with undisturbed thermodynamic states in the negative-Gamma region show a significant reduction in the extent of the transonic regime as compared to that predicted by perfect gas theory. The results support existing theories and predictions of nonclassical, dense gas behavior from previous investigations.
Levels of Syntactic Realization in Oral Reading.
ERIC Educational Resources Information Center
Brown, Eric
Two contrasting theories of reading are reviewed in light of recent research in psycholinguistics. A strictly "visual" model of fluent reading is contrasted with several mediational theories where auditory or articulatory coding is deemed necessary for comprehension. Surveying the research in visual information processing, oral reading,…
McLelland, Douglas; VanRullen, Rufin
2016-10-01
Several theories have been advanced to explain how cross-frequency coupling, the interaction of neuronal oscillations at different frequencies, could enable item multiplexing in neural systems. The communication-through-coherence theory proposes that phase-matching of gamma oscillations between areas enables selective processing of a single item at a time, and a later refinement of the theory includes a theta-frequency oscillation that provides a periodic reset of the system. Alternatively, the theta-gamma neural code theory proposes that a sequence of items is processed, one per gamma cycle, and that this sequence is repeated or updated across theta cycles. In short, both theories serve to segregate representations via the temporal domain, but differ on the number of objects concurrently represented. In this study, we set out to test whether each of these theories is actually physiologically plausible, by implementing them within a single model inspired by physiological data. Using a spiking network model of visual processing, we show that each of these theories is physiologically plausible and computationally useful. Both theories were implemented within a single network architecture, with two areas connected in a feedforward manner, and gamma oscillations generated by feedback inhibition within areas. Simply increasing the amplitude of global inhibition in the lower area, equivalent to an increase in the spatial scope of the gamma oscillation, yielded a switch from one mode to the other. Thus, these different processing modes may co-exist in the brain, enabling dynamic switching between exploratory and selective modes of attention.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.
1975-10-01
The computer code block VENTURE, designed to solve multigroup neutronics problems by applying the finite-difference diffusion-theory approximation to neutron transport (or, alternatively, simple P_1) in up to three-dimensional geometry, is described. A variety of problem types may be solved: the usual eigenvalue problem; a direct criticality search on the buckling, on a reciprocal-velocity absorber (prompt mode), or on nuclide concentrations; or an indirect criticality search on nuclide concentrations or on dimensions. First-order perturbation analysis capability is available at the macroscopic cross-section level.
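The equation such a code discretizes, in its standard multigroup k-eigenvalue form (textbook notation, not quoted from the report):

```latex
% Multigroup neutron diffusion eigenvalue problem, groups g = 1, ..., G:
-\nabla \cdot \bigl( D_g \nabla \phi_g \bigr) + \Sigma_{r,g}\, \phi_g
= \sum_{g' \ne g} \Sigma_{s,\, g' \to g}\, \phi_{g'}
+ \frac{\chi_g}{k} \sum_{g'} \nu \Sigma_{f,g'}\, \phi_{g'} .
```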
NASA Astrophysics Data System (ADS)
Habibi, Ali
1993-01-01
The objective of this article is to present a discussion on the future of image data compression in the next two decades. It is virtually impossible to predict with any degree of certainty the breakthroughs in theory and development, the milestones in the advancement of technology, and the success of upcoming commercial products in the marketplace, which will be the main factors in setting the future stage for image coding. What we propose to do, instead, is look back at the progress in image coding during the last two decades and assess the state of the art in image coding today. Then, by observing the trends in the development of theory, software, and hardware, coupled with the future needs for the use and dissemination of imagery data and the constraints on the bandwidth and capacity of various networks, we predict the future state of image coding. What seems certain today is the growing need for bandwidth compression. Television is using a technology which is half a century old and is ready to be replaced by high-definition television with an extremely high digital bandwidth. Smart telephones coupled with personal computers and TV monitors accommodating both printed and video data will be common in homes and businesses within the next decade. Efficient and compact digital processing modules using developing technologies will make bandwidth-compressed imagery the cheap and preferred alternative in satellite and on-board applications. In view of the above needs, we expect increased activity in the development of theory, software, special-purpose chips, and hardware for image bandwidth compression in the next two decades. The following sections summarize the future trends in these areas.
Streamlined Genome Sequence Compression using Distributed Source Coding
Wang, Shuang; Jiang, Xiaoqian; Chen, Feng; Cui, Lijuan; Cheng, Samuel
2014-01-01
We aim at developing a streamlined genome sequence compression algorithm to support alternative miniaturized sequencing devices, which have limited communication, storage, and computation power. Existing techniques that require heavy client (encoder side) cannot be applied. To tackle this challenge, we carefully examined distributed source coding theory and developed a customized reference-based genome compression protocol to meet the low-complexity need at the client side. Based on the variation between source and reference, our protocol will pick adaptively either syndrome coding or hash coding to compress subsequences of changing code length. Our experimental results showed promising performance of the proposed method when compared with the state-of-the-art algorithm (GRS). PMID:25520552
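A toy illustration of the syndrome-coding branch of such a protocol, using the [7,4] Hamming parity-check matrix over GF(2): only the syndrome is "transmitted," and a decoder holding a nearby reference recovers the source block. The vectors below are made up for illustration:

```python
import numpy as np

# [7,4] Hamming parity-check matrix over GF(2): the encoder sends only
# the 3-bit syndrome s = H x (mod 2); a decoder holding a reference
# within Hamming distance 1 of x can recover x -- the essence of
# reference-based distributed source coding.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])

def syndrome(x):
    return (H @ x) % 2

x   = np.array([1, 0, 1, 1, 0, 0, 1])  # source block
ref = np.array([1, 0, 1, 0, 0, 0, 1])  # reference, differs in one position
s   = syndrome(x)                       # 3 bits transmitted instead of 7

# Decoder: the syndrome difference equals the column of H at the single
# position where ref differs from x (standard Hamming error location).
diff = (s - syndrome(ref)) % 2
if diff.any():
    err_pos = next(j for j in range(7) if np.array_equal(H[:, j], diff))
    ref = ref.copy()
    ref[err_pos] ^= 1
print("recovered:", ref, "| matches source:", np.array_equal(ref, x))
```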
How Student Teachers (Don't) Talk about Race: An Intersectional Analysis
ERIC Educational Resources Information Center
Young, Kathryn S.
2016-01-01
This study explores how student teacher talk about their students illuminates the identities ascribed to these same students. It uses a hybrid intersectional framework based on Disability Studies, Critical Race Theory, and Latino Critical Theory and methodologies (like examining majoritarian stories, counter-storytelling, coded talk, and…
Wireless communication and their mathematics
NASA Astrophysics Data System (ADS)
Komaki, Shozo
2015-05-01
Mobile phones and smart phones are penetrating social use. To develop these systems, various types of theoretical work based on mathematics have been done, in areas such as radio propagation theory, traffic theory, security coding, and wireless devices. In this talk, I will discuss the related mathematics and the problems that arise within it.
Word Identification in Reading and the Promise of Subsymbolic Psycholinguistics.
ERIC Educational Resources Information Center
Van Orden, Guy C.; And Others
1990-01-01
It is argued that dual-process theory has misconstrued the correspondence between words' spelling and their phonology. A subsymbolic alternative to dual-processing theory is presented that includes a clear role for the process of phonologic coding. The subsymbolic approach is developed around a covariant learning hypothesis. (SLD)
Theory-Based Assessment in Environmental Education: A Tool for Formative Evaluation
ERIC Educational Resources Information Center
Granit-Dgani, Dafna; Kaplan, Avi; Flum, Hanoch
2017-01-01
This article reports on the development of a theory-informed assessment instrument for use in evaluating environmental education programs. The instrument involves coding learners' brief reflective writing on five established educational and social psychological constructs that correspond to five important goals of environmental education:…
ASCA Ethical Standards and the Relevance of Eastern Ethical Theories
ERIC Educational Resources Information Center
Cook, Amy L.; Houser, Rick A.
2009-01-01
As schools become increasingly diverse through immigration and growth of minority groups, it is important that school counselors incorporate culturally sensitive ethical decision-making in their practice. The use of Western ethical theories in the application of professional codes of ethics provides a specific perspective in ethical…
Baucom, Brian R W; Leo, Karena; Adamo, Colin; Georgiou, Panayiotis; Baucom, Katherine J W
2017-12-01
Observational behavioral coding methods are widely used for the study of relational phenomena. There are numerous guidelines for the development and implementation of these methods that include principles for creating new and adapting existing coding systems as well as principles for creating coding teams. While these principles have been successfully implemented in research on relational phenomena, the ever expanding array of phenomena being investigated with observational methods calls for a similar expansion of these principles. Specifically, guidelines are needed for decisions that arise in current areas of emphasis in couple research including observational investigation of related outcomes (e.g., relationship distress and psychological symptoms), the study of change in behavior over time, and the study of group similarities and differences in the enactment and perception of behavior. This article describes conceptual and statistical considerations involved in these 3 areas of research and presents principle- and empirically based rationale for design decisions related to these issues. A unifying principle underlying these guidelines is the need for careful consideration of fit between theory, research questions, selection of coding systems, and creation of coding teams. Implications of (mis)fit for the advancement of theory are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
ICAN Computer Code Adapted for Building Materials
NASA Technical Reports Server (NTRS)
Murthy, Pappu L. N.
1997-01-01
The NASA Lewis Research Center has been involved in developing composite micromechanics and macromechanics theories over the last three decades. These activities have resulted in several composite mechanics theories and structural analysis codes whose applications range from material behavior design and analysis to structural component response. One of these computer codes, the Integrated Composite Analyzer (ICAN), is designed primarily to address issues related to designing polymer matrix composites and predicting their properties - including hygral, thermal, and mechanical load effects. Recently, under a cost-sharing cooperative agreement with a Fortune 500 corporation, Master Builders Inc., ICAN was adapted to analyze building materials. The high costs and technical difficulties involved with the fabrication of continuous-fiber-reinforced composites sometimes limit their use. Particulate-reinforced composites can be thought of as a viable alternative. They are as easily processed to near-net shape as monolithic materials, yet have the improved stiffness, strength, and fracture toughness that is characteristic of continuous-fiber-reinforced composites. For example, particle-reinforced metal-matrix composites show great potential for a variety of automotive applications, such as disk brake rotors, connecting rods, cylinder liners, and other high-temperature applications. Building materials, such as concrete, can be thought of as one of the oldest materials in this category of multiphase, particle-reinforced materials. The adaptation of ICAN to analyze particle-reinforced composite materials involved the development of new micromechanics-based theories. A derivative of the ICAN code, ICAN/PART, was developed and delivered to Master Builders Inc. as a part of the cooperative activity.
Subspace Arrangement Codes and Cryptosystems
2011-05-09
A goal of coding theory is finding codes that have a small number of digits (length) together with a high number of codewords (dimension), as well as good error-correction properties.
Experimental program for real gas flow code validation at NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Deiwert, George S.; Strawa, Anthony W.; Sharma, Surendra P.; Park, Chul
1989-01-01
The experimental program for validating real-gas hypersonic flow codes at NASA Ames Research Center is described. Ground-based test facilities used include ballistic ranges, shock tubes and shock tunnels, arc jet facilities, and heated-air hypersonic wind tunnels. Also included are large-scale computer systems for kinetic-theory simulations and benchmark code solutions. Flight tests consist of the Aeroassist Flight Experiment, the Space Shuttle, Project Fire 2, and planetary probes such as Galileo, Pioneer Venus, and PAET.
Interleaved concatenated codes: new perspectives on approaching the Shannon limit.
Viterbi, A J; Viterbi, A M; Sindhushayana, N T
1997-09-02
The last few years have witnessed a significant decrease in the gap between the Shannon channel capacity limit and what is practically achievable. Progress has resulted from novel extensions of previously known coding techniques involving interleaved concatenated codes. A considerable body of simulation results is now available, supported by an important but limited theoretical basis. This paper presents a computational technique which further ties simulation results to the known theory and reveals a considerable reduction in the complexity required to approach the Shannon limit.
1982-11-01
Service code exceeded operational code in the ratio of 10:1. No redundant information was required. It was modular. Internal parts of the program...to NASA's analyses. We were to try to find an existing finite element program of a quality that would be worth recommending to all NASA Centers. We...Distinct manuals were published for users, programmers, theory, and demonstration problems. It abounded with service code to provide user conveniences
The application of coded excitation technology in medical ultrasonic Doppler imaging
NASA Astrophysics Data System (ADS)
Li, Weifeng; Chen, Xiaodong; Bao, Jing; Yu, Daoyin
2008-03-01
Medical ultrasonic Doppler imaging is one of the most important domains of modern medical imaging technology. The application of coded excitation technology in a medical ultrasonic Doppler imaging system offers higher SNR and deeper penetration depth than a conventional pulse-echo imaging system; it also improves image quality and enhances sensitivity to weak signals. Furthermore, a properly chosen coded excitation benefits the received spectrum of the Doppler signal. This paper first analyzes the application of coded excitation technology in medical ultrasonic Doppler imaging systems, showing the advantages and promise of coded excitation, and then introduces its principle and theory. Next, we compare several coded sequences (including chirp and pseudo-chirp signals, Barker codes, Golay complementary sequences, M-sequences, etc.). Considering mainlobe width, range sidelobe level, signal-to-noise ratio, and the sensitivity of the Doppler signal, we choose Barker codes as the coded sequence. Finally, we design the coded excitation circuit. The results in B-mode imaging and Doppler flow measurement matched our expectations, demonstrating the advantages of applying coded excitation technology in the Digital Medical Ultrasonic Doppler Endoscope Imaging System.
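The sidelobe argument for choosing Barker codes can be checked in a few lines of Python: the length-13 Barker sequence has an aperiodic autocorrelation peak of 13 with sidelobe magnitudes of at most 1, about 22 dB peak-to-sidelobe:

```python
import numpy as np

# Length-13 Barker code: its aperiodic autocorrelation has peak 13 and
# sidelobe magnitudes <= 1, which is why Barker sequences suit pulse
# compression in coded-excitation ultrasound.
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1])

acf = np.correlate(barker13, barker13, mode="full")
peak = acf.max()
sidelobe = np.abs(np.delete(acf, len(acf) // 2)).max()  # drop the main peak
print(f"peak {peak}, max sidelobe {sidelobe}, "
      f"PSL {20 * np.log10(peak / sidelobe):.1f} dB")
```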
A Clustering-Based Approach to Enriching Code Foraging Environment.
Niu, Nan; Jin, Xiaoyu; Niu, Zhendong; Cheng, Jing-Ru C; Li, Ling; Kataev, Mikhail Yu
2016-09-01
Developers often spend valuable time navigating and seeking relevant code in software maintenance. Currently, there is a lack of theoretical foundations to guide tool design and evaluation so as to best shape the code base for developers. This paper contributes a unified code navigation theory in light of optimal food-foraging principles. We further develop a novel framework for automatically assessing the foraging mechanisms in the context of program investigation. We use the framework to examine to what extent the clustering of software entities affects code foraging. Our quantitative analysis of long-lived open-source projects suggests that clustering enriches the software environment and improves foraging efficiency. Our qualitative inquiry reveals concrete insights into real developers' behavior. Our research opens the avenue toward building a new set of ecologically valid code navigation tools.
Rangachari, Pavani
2008-01-01
CONTEXT/PURPOSE: With the growing momentum toward hospital quality measurement and reporting by public and private health care payers, hospitals face increasing pressures to improve their medical record documentation and administrative data coding accuracy. This study explores the relationship between the organizational knowledge-sharing structure related to quality and hospital coding accuracy for quality measurement. Simultaneously, this study seeks to identify other leadership/management characteristics associated with coding for quality measurement. Drawing upon complexity theory, the literature on "professional complex systems" has put forth various strategies for managing change and turnaround in professional organizations. In so doing, it has emphasized the importance of knowledge creation and organizational learning through interdisciplinary networks. This study integrates complexity, network structure, and "subgoals" theories to develop a framework for knowledge-sharing network effectiveness in professional complex systems. This framework is used to design an exploratory and comparative research study. The sample consists of 4 hospitals, 2 showing "good coding" accuracy for quality measurement and 2 showing "poor coding" accuracy. Interviews and surveys are conducted with administrators and staff in the quality, medical staff, and coding subgroups in each facility. Findings of this study indicate that good coding performance is systematically associated with a knowledge-sharing network structure rich in brokerage and hierarchy (with leaders connecting different professional subgroups to each other and to the external environment), rather than in density (where everyone is directly connected to everyone else). It also implies that for the hospital organization to adapt to the changing environment of quality transparency, senior leaders must undertake proactive and unceasing efforts to coordinate knowledge exchange across physician and coding subgroups and connect these subgroups with the changing external environment.
NASA Astrophysics Data System (ADS)
Gao, Shanghua; Fu, Guangyu; Liu, Tai; Zhang, Guoqing
2017-03-01
Tanaka et al. (Geophys J Int 164:273-289, 2006; Geophys J Int 170:1031-1052, 2007) proposed the spherical dislocation theory (SDT) for a spherically symmetric, self-gravitating visco-elastic earth model. However, to date there have been no reports of easily adopted, widely used software that utilizes Tanaka's theory. In this study we introduce a new code to compute post-seismic deformations (PSD), including displacements as well as geoid and gravity changes, caused by a seismic source at any position. This new code is based on the above-mentioned SDT and consists of two parts. The first part is the numerical frame of the dislocation Green function (DGF), which contains a set of two-dimensional discrete numerical frames of DGFs on a symmetric earth model. The second part is an integration function, which performs bi-quadratic spline interpolation operations on the frame of DGFs. The inputs are the information on the seismic fault models and on the observation points. After the user prepares the inputs in a file with the given format, the code automatically computes the PSD. As an example, we use the new code to calculate the co-seismic displacements caused by the Tohoku-Oki Mw 9.0 earthquake. We compare the result with observations and with the result from a fully elastic SDT, and find that the root-mean-square error between the calculated and observed results is 7.4 cm. This verifies the suitability of our new code. Finally, we discuss several issues that require attention when using the code, which should be helpful for users.
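A sketch of the second part of such a code, bi-quadratic spline interpolation over a precomputed two-dimensional grid of dislocation Green functions, using SciPy with kx = ky = 2. The grid axes and values below are synthetic placeholders, not actual SDT output:

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

# Hypothetical grid axes for a precomputed DGF table.
depth = np.linspace(0.0, 50.0, 26)   # source depth, km (placeholder axis)
dist  = np.linspace(0.1, 20.0, 41)   # epicentral distance, deg (placeholder axis)

# Fake Green-function values on the (depth, distance) grid, standing in
# for a numerically computed DGF frame.
G = np.exp(-dist[None, :] / 5.0) / (1.0 + depth[:, None])

# Bi-quadratic spline (degree 2 in both directions), then evaluate at an
# off-grid (depth, distance) point, as the integration step would.
spline = RectBivariateSpline(depth, dist, G, kx=2, ky=2)
print(spline.ev(12.5, 3.3))
```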
An active inference theory of allostasis and interoception in depression
Quigley, Karen S.; Hamilton, Paul
2016-01-01
In this paper, we integrate recent theoretical and empirical developments in predictive coding and active inference accounts of interoception (including the Embodied Predictive Interoception Coding model) with working hypotheses from the theory of constructed emotion to propose a biologically plausible unified theory of the mind that places metabolism and energy regulation (i.e. allostasis), as well as the sensory consequences of that regulation (i.e. interoception), at its core. We then consider the implications of this approach for understanding depression. We speculate that depression is a disorder of allostasis, whose myriad symptoms result from a ‘locked in’ brain that is relatively insensitive to its sensory context. We conclude with a brief discussion of the ways our approach might reveal new insights for the treatment of depression. This article is part of the themed issue ‘Interoception beyond homeostasis: affect, cognition and mental health’. PMID:28080969
The Effects of Single and Dual Coded Multimedia Instructional Methods on Chinese Character Learning
ERIC Educational Resources Information Center
Wang, Ling
2013-01-01
Learning Chinese characters is a difficult task for adult English native speakers due to the significant differences between the Chinese and English writing system. The visuospatial properties of Chinese characters have inspired the development of instructional methods using both verbal and visual information based on the Dual Coding Theory. This…
ERIC Educational Resources Information Center
Ferran, C.; Bosch, S.; Carnicer, A.
2012-01-01
A practical activity designed to introduce wavefront coding techniques as a method to extend the depth of field in optical systems is presented. The activity is suitable for advanced undergraduate students since it combines different topics in optical engineering such as optical system design, aberration theory, Fourier optics, and digital image…
ERIC Educational Resources Information Center
Owusu-Agyeman, Yaw; Larbi-Siaw, Otu
2017-01-01
This study argues that in developing a robust framework for students in a blended learning environment, Structural Alignment (SA) becomes the third principle of specialisation in addition to Epistemic Relation (ER) and Social Relation (SR). We provide an extended code: (ER+/-, SR+/-, SA+/-) that present strong classification and framing to the…
EAC: A program for the error analysis of STAGS results for plates
NASA Technical Reports Server (NTRS)
Sistla, Rajaram; Thurston, Gaylen A.; Bains, Nancy Jane C.
1989-01-01
A computer code is now available for estimating the error in results from the STAGS finite element code for a shell unit consisting of a rectangular orthotropic plate. This memorandum contains basic information about the computer code EAC (Error Analysis and Correction) and describes the connection between the input data for the STAGS shell units and the input data necessary to run the error analysis code. The STAGS code returns a set of nodal displacements and a discrete set of stress resultants; the EAC code returns a continuous solution for displacements and stress resultants. The continuous solution is defined by a set of generalized coordinates computed in EAC. The theory and the assumptions that determine the continuous solution are also outlined in this memorandum. An example of application of the code is presented, and instructions for its usage on the Cyber and VAX machines are provided.
A review of predictive coding algorithms.
Spratling, M W
2017-03-01
Predictive coding is a leading theory of how the brain performs probabilistic inference. However, there are a number of distinct algorithms which are described by the term "predictive coding". This article provides a concise review of these different predictive coding algorithms, highlighting their similarities and differences. Five algorithms are covered: linear predictive coding which has a long and influential history in the signal processing literature; the first neuroscience-related application of predictive coding to explaining the function of the retina; and three versions of predictive coding that have been proposed to model cortical function. While all these algorithms aim to fit a generative model to sensory data, they differ in the type of generative model they employ, in the process used to optimise the fit between the model and sensory data, and in the way that they are related to neurobiology. Copyright © 2016 Elsevier Inc. All rights reserved.
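To make the shared core of these algorithms concrete, here is a deliberately generic Python sketch of predictive coding as gradient-descent inference (in the spirit of the cortical models reviewed); the weights and input are random placeholders, and no single reviewed algorithm is reproduced exactly.

    import numpy as np

    rng = np.random.default_rng(0)
    W = rng.normal(size=(16, 4))       # generative weights: latent causes -> sensory data
    x = rng.normal(size=16)            # sensory input

    r = np.zeros(4)                    # latent representation (prediction units)
    eta = 0.02                         # inference step size
    for _ in range(300):
        e = x - W @ r                  # prediction error: input minus top-down prediction
        r += eta * (W.T @ e)           # error-driven update of the representation
    print(np.linalg.norm(x - W @ r))   # residual error remaining after inference

The algorithms covered in the review differ precisely in how this error/update loop is parameterised: the form of the generative model, the optimisation scheme, and the mapping onto neural circuitry.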
Improvement of Mishchenko's T-matrix code for absorbing particles.
Moroz, Alexander
2005-06-10
The use of Gaussian elimination with backsubstitution for matrix inversion in scattering theories is discussed. Within the framework of the T-matrix method (the state-of-the-art code by Mishchenko is freely available at http://www.giss.nasa.gov/~crmim), it is shown that the domain of applicability of Mishchenko's FORTRAN 77 (F77) code can be substantially expanded in the direction of strongly absorbing particles, where the current code fails to converge. Such an extension is especially important if the code is to be used in nanoplasmonic or nanophotonic applications involving metallic particles. At the same time, convergence can also be achieved for large nonabsorbing particles, in which case the non-Numerical Algorithms Group option of Mishchenko's code diverges. A computer F77 implementation of Mishchenko's code supplemented with Gaussian elimination with backsubstitution is freely available at http://www.wave-scattering.com.
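The linear-algebra kernel at issue is ordinary Gaussian elimination with partial pivoting followed by backsubstitution. The Python sketch below illustrates it on a small complex-valued system; it is not the F77 implementation, and production T-matrix work would call a tuned library routine instead.

    import numpy as np

    def solve_gauss(A, b):
        """Gaussian elimination with partial pivoting and backsubstitution."""
        A = A.astype(complex).copy()   # T-matrix work is complex-valued
        b = b.astype(complex).copy()
        n = len(b)
        for k in range(n - 1):
            p = k + np.argmax(np.abs(A[k:, k]))   # partial pivot for stability
            A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
            for i in range(k + 1, n):
                m = A[i, k] / A[k, k]
                A[i, k:] -= m * A[k, k:]
                b[i] -= m * b[k]
        x = np.zeros(n, dtype=complex)
        for i in range(n - 1, -1, -1):            # backsubstitution
            x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
        return x

    A = np.array([[3.0, 2.0], [1.0, 4.0 + 2.0j]])
    print(solve_gauss(A, np.array([1.0, 2.0])))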
Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W.; Imel, Zac E.; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C.
2014-01-01
The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. PMID:25242192
Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C
2015-02-01
The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.
Neural underpinnings of music: the polyrhythmic brain.
Vuust, Peter; Gebauer, Line K; Witek, Maria A G
2014-01-01
Musical rhythm, consisting of apparently abstract intervals of accented temporal events, has the remarkable ability to move our minds and bodies. Why do certain rhythms make us want to tap our feet, bop our heads or even get up and dance? And how does the brain process complex rhythms during our experiences of music? In this chapter, we describe some common forms of rhythmic complexity in music and propose that the theory of predictive coding can explain how rhythm and rhythmic complexity are processed in the brain. We also consider how this theory may reveal why we feel so compelled by rhythmic tension in music. First, musical-theoretical and neuroscientific frameworks of rhythm are presented, in which rhythm perception is conceptualized as an interaction between what is heard ('rhythm') and the brain's anticipatory structuring of music ('the meter'). Second, three different examples of tension between rhythm and meter in music are described: syncopation, polyrhythm and groove. Third, we present the theory of predictive coding of music, which posits a hierarchical organization of brain responses reflecting fundamental, survival-related mechanisms associated with predicting future events. According to this theory, perception and learning are manifested through the brain's Bayesian minimization of the error between the input to the brain and the brain's prior expectations. Fourth, empirical studies of neural and behavioral effects of syncopation, polyrhythm and groove are reported, and we propose how these studies can be seen as special cases of the predictive coding theory. Finally, we argue that musical rhythm exploits the brain's general principles of anticipation and propose that pleasure from musical rhythm may be a result of such anticipatory mechanisms.
Ross, Jaclyn M.; Girard, Jeffrey M.; Wright, Aidan G.C.; Beeney, Joseph E.; Scott, Lori N.; Hallquist, Michael N.; Lazarus, Sophie A.; Stepp, Stephanie D.; Pilkonis, Paul A.
2016-01-01
Relationships are among the most salient factors affecting happiness and wellbeing for individuals and families. Relationship science has identified the study of dyadic behavioral patterns between couple members during conflict as an important window in to relational functioning with both short-term and long-term consequences. Several methods have been developed for the momentary assessment of behavior during interpersonal transactions. Among these, the most popular is the Specific Affect Coding System (SPAFF), which organizes social behavior into a set of discrete behavioral constructs. This study examines the interpersonal meaning of the SPAFF codes through the lens of interpersonal theory, which uses the fundamental dimensions of Dominance and Affiliation to organize interpersonal behavior. A sample of 67 couples completed a conflict task, which was video recorded and coded using SPAFF and a method for rating momentary interpersonal behavior, the Continuous Assessment of Interpersonal Dynamics (CAID). Actor partner interdependence models in a multilevel structural equation modeling framework were used to study the covariation of SPAFF codes and CAID ratings. Results showed that a number of SPAFF codes had clear interpersonal signatures, but many did not. Additionally, actor and partner effects for the same codes were strongly consistent with interpersonal theory’s principle of complementarity. Thus, findings reveal points of convergence and divergence in the two systems and provide support for central tenets of interpersonal theory. Future directions based on these initial findings are discussed. PMID:27148786
NASA Astrophysics Data System (ADS)
Ratnam, Challa; Lakshmana Rao, Vadlamudi; Lachaa Goud, Sivagouni
2006-10-01
In the present paper, and a series of papers to follow, the Fourier analytical properties of multiple annuli coded aperture (MACA) and complementary multiple annuli coded aperture (CMACA) systems are investigated. First, the transmission function for MACA and CMACA is derived using Fourier methods and, based on the Fresnel-Kirchhoff diffraction theory, formulae for the point spread function (PSF) are derived. The PSF maxima and minima are calculated for both the MACA and CMACA systems. The dependence of these properties on the number of zones is studied and reported in this paper.
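In the Fraunhofer limit the PSF is simply the squared modulus of the Fourier transform of the aperture's transmission function, so the qualitative behaviour of MACA/CMACA systems can be previewed numerically; the numpy sketch below uses arbitrary annulus radii and ignores the Fresnel-Kirchhoff near-field terms treated in the paper.

    import numpy as np

    n = 512
    y, x = np.mgrid[-n//2:n//2, -n//2:n//2]
    r = np.hypot(x, y)

    # Multiple-annuli aperture: transmission 1 inside each annulus, 0 elsewhere.
    annuli = [(40, 60), (90, 105), (130, 140)]     # (inner, outer) radii in pixels
    A = np.zeros((n, n))
    for r_in, r_out in annuli:
        A[(r >= r_in) & (r <= r_out)] = 1.0
    A_c = 1.0 - A                                  # complementary (CMACA) aperture

    # Far-field PSFs as squared moduli of the apertures' Fourier transforms.
    psf = np.abs(np.fft.fftshift(np.fft.fft2(A)))**2
    psf_c = np.abs(np.fft.fftshift(np.fft.fft2(A_c)))**2
    print(psf.max() / psf.mean(), psf_c.max() / psf_c.mean())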
Distributed Joint Source-Channel Coding in Wireless Sensor Networks
Zhu, Xuqi; Liu, Yu; Zhang, Lin
2009-01-01
Given that sensors are energy-limited and that wireless channel conditions in wireless sensor networks are harsh, there is an urgent need for a low-complexity coding method that combines a high compression ratio with noise resistance. This paper reviews the progress made in distributed joint source-channel coding, which can address this issue. The main existing schemes, from theory to practice, for distributed joint source-channel coding over independent channels, multiple-access channels, and broadcast channels are introduced. To this end, we also present a practical scheme for compressing multiple correlated sources over independent channels. The simulation results demonstrate the desired efficiency. PMID:22408560
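One classical building block behind such schemes is syndrome-based distributed source coding (the DISCUS idea): a sensor compresses its reading by transmitting only the syndrome of a linear code, and the decoder recovers the reading from the syndrome plus correlated side information. Below is a minimal Python sketch with a Hamming(7,4) parity-check matrix, assuming the two sources differ in at most one bit; it illustrates the principle only and is not the scheme proposed in the paper.

    import numpy as np

    # Parity-check matrix of the Hamming(7,4) code (columns are 1..7 in binary).
    H = np.array([[1, 0, 1, 0, 1, 0, 1],
                  [0, 1, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])

    def syndrome(v):
        return H @ v % 2

    x = np.array([1, 0, 1, 1, 0, 0, 1])   # source word at the sensor
    y = x.copy(); y[4] ^= 1               # side information: differs in <= 1 bit

    s_x = syndrome(x)                     # encoder transmits 3 bits instead of 7
    diff = (syndrome(y) + s_x) % 2        # syndrome of the unknown pattern x XOR y
    if diff.any():
        # Columns of H enumerate the single-bit error positions.
        pos = next(j for j in range(7) if (H[:, j] == diff).all())
        y[pos] ^= 1
    print((y == x).all())                 # True: x recovered from 3 bits plus y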
Definite Integrals, Some Involving Residue Theory Evaluated by Maple Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, Kimiko o
2010-01-01
The calculus of residues is applied to evaluate certain integrals over the range (-∞, ∞) using the Maple symbolic code. These integrals are of the form $\int_{-\infty}^{\infty} \cos(x)/[(x^2 + a^2)(x^2 + b^2)(x^2 + c^2)]\,dx$ and similar extensions. The Maple code is also applied to expressions in maximum likelihood estimator moments when sampling from the negative binomial distribution. In general the Maple code approach to the integrals gives correct answers to specified decimal places, but the symbolic result may be extremely long and complex.
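The same residue-theorem evaluation can be reproduced in Python with SymPy (standing in here for the Maple code): close the contour in the upper half-plane and sum the residues of e^{iz} f(z) at z = ia, ib, ic.

    import sympy as sp

    z = sp.symbols('z')
    a, b, c = 1, 2, 3
    f = sp.exp(sp.I * z) / ((z**2 + a**2) * (z**2 + b**2) * (z**2 + c**2))

    # Poles enclosed by the upper half-plane contour: ia, ib, ic.
    val = 2 * sp.pi * sp.I * sum(sp.residue(f, z, sp.I * p) for p in (a, b, c))
    exact = sp.simplify(sp.re(val))   # the cosine integral is the real part
    print(exact, sp.N(exact))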
Why hard-nosed executives should care about management theory.
Christensen, Clayton M; Raynor, Michael E
2003-09-01
Theory often gets a bum rap among managers because it's associated with the word "theoretical," which connotes "impractical." But it shouldn't. Because experience is solely about the past, solid theories are the only way managers can plan future actions with any degree of confidence. The key word here is "solid." Gravity is a solid theory. As such, it lets us predict that if we step off a cliff we will fall, without actually having to do so. But business literature is replete with theories that don't seem to work in practice or actually contradict each other. How can a manager tell a good business theory from a bad one? The first step is understanding how good theories are built. They develop in three stages: gathering data, organizing it into categories highlighting significant differences, then making generalizations explaining what causes what, under which circumstances. For instance, professor Ananth Raman and his colleagues collected data showing that bar code-scanning systems generated notoriously inaccurate inventory records. These observations led them to classify the types of errors the scanning systems produced and the types of shops in which those errors most often occurred. Recently, some of Raman's doctoral students have worked as clerks to see exactly what kinds of behavior cause the errors. From this foundation, a solid theory predicting under which circumstances bar code systems work, and don't work, is beginning to emerge. Once we forgo one-size-fits-all explanations and insist that a theory describes the circumstances under which it does and doesn't work, we can bring predictable success to the world of management.
Airborne antenna radiation pattern code user's manual
NASA Technical Reports Server (NTRS)
Burnside, Walter D.; Kim, Jacob J.; Grandchamp, Brett; Rojas, Roberto G.; Law, Philip
1985-01-01
The use of a newly developed computer code to analyze the radiation patterns of antennas mounted on an ellipsoid and in the presence of a set of finite flat plates is described. It is shown how the code allows the user to simulate a wide variety of complex electromagnetic radiation problems using the ellipsoid/plates model. The code can calculate radiation patterns around an arbitrary conical cut specified by the user. The organization of the code, definitions of input and output data, and numerous practical examples are also presented. The analysis is based on the Uniform Geometrical Theory of Diffraction (UTD), and most of the computed patterns are compared with experimental results to show the accuracy of this solution.
Flexible digital modulation and coding synthesis for satellite communications
NASA Technical Reports Server (NTRS)
Vanderaar, Mark; Budinger, James; Hoerig, Craig; Tague, John
1991-01-01
An architecture and a hardware prototype of a flexible trellis modem/codec (FTMC) transmitter are presented. The theory of operation is built upon a pragmatic approach to trellis-coded modulation that emphasizes power and spectral efficiency. The system incorporates programmable modulation formats, variations of trellis-coding, digital baseband pulse-shaping, and digital channel precompensation. The modulation formats examined include (uncoded and coded) binary phase shift keying (BPSK), quaternary phase shift keying (QPSK), octal phase shift keying (8PSK), 16-ary quadrature amplitude modulation (16-QAM), and quadrature-quadrature phase shift keying (Q²PSK) at programmable rates up to 20 megabits per second (Mbps). The FTMC is part of a developing test bed to quantify modulation and coding concepts.
Statistical computation of tolerance limits
NASA Technical Reports Server (NTRS)
Wheeler, J. T.
1993-01-01
Based on a new theory, two computer codes were developed to calculate the exact statistical tolerance limits for normal distributions with unknown means and variances, for the one-sided and two-sided cases of the tolerance factor k. The quantity k is defined equivalently in terms of the noncentral t-distribution by the probability equation. Two of the four mathematical methods employ the theory developed for the numerical simulation. Several algorithms for numerically integrating and iteratively root-solving the working equations are included to augment the program simulation. The codes generate tables of k associated with varying values of the proportion and sample size for each given probability, showing the accuracy obtained for small sample sizes.
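For the one-sided case the tolerance factor has a standard exact expression through the noncentral t-distribution, k = t'_{conf}(n - 1, z_p * sqrt(n)) / sqrt(n), where z_p is the standard normal quantile of the covered proportion. The SciPy snippet below checks that relation; it is a sketch of the same computation, not the original codes.

    import numpy as np
    from scipy.stats import norm, nct

    def k_one_sided(n, p=0.90, conf=0.95):
        """One-sided normal tolerance factor via the noncentral t-distribution."""
        delta = norm.ppf(p) * np.sqrt(n)               # noncentrality parameter
        return nct.ppf(conf, df=n - 1, nc=delta) / np.sqrt(n)

    for n in (5, 10, 30):
        print(n, round(k_one_sided(n), 4))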
Track-structure simulations for charged particles.
Dingfelder, Michael
2012-11-01
Monte Carlo track-structure simulations provide a detailed and accurate picture of radiation transport of charged particles through condensed matter of biological interest. Liquid water serves as a surrogate for soft tissue and is used in most Monte Carlo track-structure codes. Basic theories of radiation transport and track-structure simulations are discussed and differences compared to condensed history codes highlighted. Interaction cross sections for electrons, protons, alpha particles, and light and heavy ions are required input data for track-structure simulations. Different calculation methods, including the plane-wave Born approximation, the dielectric theory, and semi-empirical approaches are presented using liquid water as a target. Low-energy electron transport and light ion transport are discussed as areas of special interest.
Horizontal axis wind turbine post stall airfoil characteristics synthesization
NASA Technical Reports Server (NTRS)
Tangler, James L.; Ostowari, Cyrus
1995-01-01
Blade-element/momentum performance prediction codes are routinely used for wind turbine design and analysis. A weakness of these codes is their inability to consistently predict peak power upon which the machine structural design and cost are strongly dependent. The purpose of this study was to compare post-stall airfoil characteristics synthesization theory to a systematically acquired wind tunnel data set in which the effects of aspect ratio, airfoil thickness, and Reynolds number were investigated. The results of this comparison identified discrepancies between current theory and the wind tunnel data which could not be resolved. Other factors not previously investigated may account for these discrepancies and have a significant effect on peak power prediction.
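A widely used synthesization approach of this kind is the Viterna flat-plate extrapolation, which carries measured data past stall using an aspect-ratio-dependent maximum drag coefficient; whether this exact variant is the theory examined in the study is an assumption, and the stall inputs below are illustrative only.

    import numpy as np

    def viterna_cl(alpha_deg, AR=10.0, cl_s=1.1, alpha_s_deg=15.0):
        """Flat-plate-based post-stall lift extrapolation (Viterna-style)."""
        a = np.radians(alpha_deg)
        a_s = np.radians(alpha_s_deg)
        cd_max = 1.11 + 0.018 * AR        # aspect-ratio-dependent drag plateau
        A1 = cd_max / 2.0
        A2 = (cl_s - cd_max * np.sin(a_s) * np.cos(a_s)) * np.sin(a_s) / np.cos(a_s)**2
        return A1 * np.sin(2 * a) + A2 * np.cos(a)**2 / np.sin(a)

    for alpha in (20.0, 45.0, 90.0):      # angles of attack past stall [deg]
        print(alpha, round(float(viterna_cl(alpha)), 3))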
COLA with scale-dependent growth: applications to screened modified gravity models
NASA Astrophysics Data System (ADS)
Winther, Hans A.; Koyama, Kazuya; Manera, Marc; Wright, Bill S.; Zhao, Gong-Bo
2017-08-01
We present a general parallelized and easy-to-use code to perform numerical simulations of structure formation using the COLA (COmoving Lagrangian Acceleration) method for cosmological models that exhibit scale-dependent growth at the level of first and second order Lagrangian perturbation theory. For modified gravity theories we also include screening using a fast approximate method that covers all the main examples of screening mechanisms in the literature. We test the code by comparing it to full simulations of two popular modified gravity models, namely f(R) gravity and nDGP, and find good agreement in the modified gravity boost-factors relative to ΛCDM even when using a fairly small number of COLA time steps.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Winther, Hans A.; Koyama, Kazuya; Wright, Bill S.
We present a general parallelized and easy-to-use code to perform numerical simulations of structure formation using the COLA (COmoving Lagrangian Acceleration) method for cosmological models that exhibit scale-dependent growth at the level of first and second order Lagrangian perturbation theory. For modified gravity theories we also include screening using a fast approximate method that covers all the main examples of screening mechanisms in the literature. We test the code by comparing it to full simulations of two popular modified gravity models, namely f(R) gravity and nDGP, and find good agreement in the modified gravity boost-factors relative to ΛCDM even when using a fairly small number of COLA time steps.
INHYD: Computer code for intraply hybrid composite design. A users manual
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Sinclair, J. H.
1983-01-01
A computer program (INHYD) was developed for intraply hybrid composite design, and a user's manual for INHYD is presented. INHYD embodies several composite micromechanics theories, intraply hybrid composite theories, and an integrated hygrothermomechanical theory. INHYD can be run in both interactive and batch modes. It has considerable flexibility and capability, which the user can exercise through several options. These options are demonstrated through appropriate INHYD runs in the manual.
Binary Sequences for Spread-Spectrum Multiple-Access Communication
1977-08-01
Massey, J. L., and Uhran, J. J., Jr., "Sub-baud coding," Proceedings of the Thirteenth Annual Allerton Conference on Circuit and System Theory, pp. 539... "...sequences in a multiple access environment," Proceedings of the Thirteenth Annual Allerton Conference on Circuit and System Theory, pp. 21-27, October 1975. ... Proceedings of the Thirteenth Annual Allerton Conference on Circuit and System Theory, pp. 548-559, October 1975. Yao, K., "Performance bounds on...
Cracking Silent Codes: Critical Race Theory and Education Organizing
ERIC Educational Resources Information Center
Su, Celina
2007-01-01
Critical race theory (CRT) has moved beyond legal scholarship to critique the ways in which "colorblind" laws and policies perpetuate existing racial inequalities in education policy. While criticisms of CRT have focused on the pessimism and lack of remedies presented, CRT scholars have begun to address issues of praxis. Specifically,…
A Note on Powers in Finite Fields
ERIC Educational Resources Information Center
Aabrandt, Andreas; Hansen, Vagn Lundsgaard
2016-01-01
The study of solutions to polynomial equations over finite fields has a long history in mathematics and is an interesting area of contemporary research. In recent years, the subject has found important applications in the modelling of problems from applied mathematical fields such as signal analysis, system theory, coding theory and cryptology. In…
ERIC Educational Resources Information Center
Emanouilidis, Emanuel
2005-01-01
Latin squares have existed for hundreds of years but it wasn't until rather recently that Latin squares were used in other areas such as statistics, graph theory, coding theory and the generation of random numbers as well as in the design and analysis of experiments. This note describes Latin and diagonal Latin squares, a method of constructing…
ERIC Educational Resources Information Center
Emanouilidis, Emanuel
2008-01-01
Latin squares were first introduced and studied by the famous mathematician Leonhard Euler in the 1700s. Through the years, Latin squares have been used in areas such as statistics, graph theory, coding theory, the generation of random numbers as well as in the design and analysis of experiments. Recently, with the international popularity of…
NASA Technical Reports Server (NTRS)
Goldman, L. J.; Seasholtz, R. G.
1982-01-01
Experimental measurements of the velocity components in the blade-to-blade (axial-tangential) plane were obtained within an axial flow turbine stator passage and were compared with calculations from three turbomachinery computer programs. The theoretical results were calculated with a quasi-three-dimensional inviscid code, a three-dimensional inviscid code, and a three-dimensional viscous code. Parameter estimation techniques and a particle dynamics calculation were used to assess the accuracy of the laser measurements, providing a rational basis for comparison of the experimental and theoretical results. The general agreement of the experimental data with the results from the two inviscid computer codes indicates the usefulness of these calculation procedures for turbomachinery blading. The comparison with the viscous code, while generally reasonable, was not as good as for the inviscid codes.
Comparison of Einstein-Boltzmann solvers for testing general relativity
NASA Astrophysics Data System (ADS)
Bellini, E.; Barreira, A.; Frusciante, N.; Hu, B.; Peirone, S.; Raveri, M.; Zumalacárregui, M.; Avilez-Lopez, A.; Ballardini, M.; Battye, R. A.; Bolliet, B.; Calabrese, E.; Dirian, Y.; Ferreira, P. G.; Finelli, F.; Huang, Z.; Ivanov, M. M.; Lesgourgues, J.; Li, B.; Lima, N. A.; Pace, F.; Paoletti, D.; Sawicki, I.; Silvestri, A.; Skordis, C.; Umiltà, C.; Vernizzi, F.
2018-01-01
We compare Einstein-Boltzmann solvers that include modifications to general relativity and find that, for a wide range of models and parameters, they agree to a high level of precision. We look at three general purpose codes that primarily model general scalar-tensor theories, three codes that model Jordan-Brans-Dicke (JBD) gravity, a code that models f (R ) gravity, a code that models covariant Galileons, a code that models Hořava-Lifshitz gravity, and two codes that model nonlocal models of gravity. Comparing predictions of the angular power spectrum of the cosmic microwave background and the power spectrum of dark matter for a suite of different models, we find agreement at the subpercent level. This means that this suite of Einstein-Boltzmann solvers is now sufficiently accurate for precision constraints on cosmological and gravitational parameters.
The application of CFD for military aircraft design at transonic speeds
NASA Technical Reports Server (NTRS)
Smith, C. W.; Braymen, W. W.; Bhateley, I. C.; Londenberg, W. K.
1989-01-01
Numerous computational fluid dynamics (CFD) codes are available that solve any of several variations of the transonic flow equations from small disturbance to full Navier-Stokes. The design philosophy at General Dynamics Fort Worth Division involves use of all these levels of codes, depending on the stage of configuration development. Throughout this process, drag calculation is a central issue. An overview is provided for several transonic codes and representative test-to-theory comparisons for fighter-type configurations are presented. Correlations are shown for lift, drag, pitching moment, and pressure distributions. The future of applied CFD is also discussed, including the important task of code validation. With the progress being made in code development and the continued evolution in computer hardware, the routine application of these codes for increasingly more complex geometries and flow conditions seems apparent.
ERIC Educational Resources Information Center
Henning, Elizabeth
2012-01-01
From the field of developmental psycholinguistics and from conceptual development theory there is evidence that excessive linguistic "code-switching" in early school education may pose some hazards for the learning of young multilingual children. In this article the author addresses the issue, invoking post-Piagetian and neo-Vygotskian…
1978-07-01
(1) A paper titled "Particle-Fluid Hybrid Codes Applied to Beam-Plasma, Ring-Plasma Instabilities" was presented at Monterey (see Section V, "Particle-Fluid Hybrid Codes Applied to Beam-Plasma, Ring-Plasma Instabilities"). (2) A. Peiravi and C. K. Birdsall, "Self-Heating of ... Thermal ..."
ERIC Educational Resources Information Center
Castro, Paloma; Sercu, Lies; Mendez Garcia, Maria del Carmen
2004-01-01
A recent shift has been noticeable in foreign language education theory. Previously, foreign languages were taught as a linguistic code. This then shifted to teaching that code against the sociocultural background of, primarily, one country in which the foreign language is spoken as a national language. More recently, teaching has reflected on…
VizieR Online Data Catalog: Energy levels & transition rates for F-like ions (Si+, 2016)
NASA Astrophysics Data System (ADS)
Si, R.; Li, S.; Guo, X. L.; Chen, Z. B.; Brage, T.; Jonsson, P.; Wang, K.; Yan, J.; Chen, C. Y.; Zou, Y. M.
2017-01-01
For the multiconfiguration Dirac-Hartree-Fock (MCDHF) calculation we use the latest version of the GRASP2K code (Jonsson+ 2013CoPhC.184.2197J), while the many-body perturbation theory (MBPT) calculation is performed using the Flexible Atomic Code (FAC; Gu 2008CaJPh..86..675G). (2 data files).
Error-correcting codes in computer arithmetic.
NASA Technical Reports Server (NTRS)
Massey, J. L.; Garcia, O. N.
1972-01-01
Summary of the most important results so far obtained in the theory of coding for the correction and detection of errors in computer arithmetic. Attempts to satisfy the stringent reliability demands upon the arithmetic unit are considered, and special attention is given to attempts to incorporate redundancy into the numbers themselves which are being processed so that erroneous results can be detected and corrected.
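The canonical example of such arithmetic redundancy is the AN code: operands are multiplied by a check constant A before processing, and because no power of two is divisible by A = 3, any single-bit arithmetic error destroys divisibility by A and is detected. A minimal sketch:

    A = 3  # check modulus of the AN code

    def encode(n):
        return A * n

    def check(v):
        return v % A == 0                # valid results are exact multiples of A

    x, y = encode(25), encode(17)
    s = x + y                            # arithmetic is performed on coded operands
    s_faulty = s + (1 << 4)              # a single-bit arithmetic error: +/- 2**i
    print(check(s), check(s_faulty))     # True False -> the error is detected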
Studies of Planet Formation using a Hybrid N-body + Planetesimal Code
NASA Technical Reports Server (NTRS)
Kenyon, Scott J.; Bromley, Benjamin C.; Salamon, Michael (Technical Monitor)
2005-01-01
The goal of our proposal was to use a hybrid multi-annulus planetesimal/n-body code to examine the planetesimal theory, one of the two main theories of planet formation. We developed this code to follow the evolution of numerous 1 m to 1 km planetesimals as they collide, merge, and grow into full-fledged planets. Our goal was to apply the code to several well-posed, topical problems in planet formation and to derive observational consequences of the models. We planned to construct detailed models to address two fundamental issues: 1) icy planets - models for icy planet formation will demonstrate how the physical properties of debris disks, including the Kuiper Belt in our solar system, depend on initial conditions and input physics; and 2) terrestrial planets - calculations following the evolution of 1-10 km planetesimals into Earth-mass planets and rings of dust will provide a better understanding of how terrestrial planets form and interact with their environment. During the past year, we made progress on each issue. Papers published in 2004 are summarized. Summaries of work to be completed during the first half of 2005 and work planned for the second half of 2005 are included.
Validation of Heat Transfer and Film Cooling Capabilities of the 3-D RANS Code TURBO
NASA Technical Reports Server (NTRS)
Shyam, Vikram; Ameri, Ali; Chen, Jen-Ping
2010-01-01
The capabilities of the 3-D unsteady RANS code TURBO have been extended to include heat transfer and film cooling applications. The results of simulations performed with the modified code are compared to experiment and to theory, where applicable. Wilcox's k-ω turbulence model has been implemented to close the RANS equations. Two simulations are conducted: (1) flow over a flat plate and (2) flow over an adiabatic flat plate cooled by one hole inclined at 35° to the free stream. For (1), agreement with theory is found to be excellent for heat transfer, represented by the local Nusselt number, and quite good for momentum, as represented by the local skin friction coefficient. This report compares the local skin friction coefficients and Nusselt numbers on a flat plate obtained using the k-ω model with the theory of Blasius. The study looks at laminar and turbulent flows over an adiabatic flat plate and over an isothermal flat plate for two different wall temperatures. It is shown that TURBO is able to accurately predict heat transfer on a flat plate. For (2), TURBO shows good qualitative agreement with film cooling experiments performed on a flat plate with one cooling hole. Quantitatively, film effectiveness is underpredicted downstream of the hole.
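The laminar references used in such flat-plate comparisons follow from Blasius theory: c_f = 0.664 / sqrt(Re_x) for skin friction and Nu_x = 0.332 Re_x^(1/2) Pr^(1/3) for heat transfer. A small Python check of these textbook correlations, with illustrative free-stream values rather than the TURBO setup:

    import numpy as np

    def flat_plate_laminar(x, U=10.0, nu=1.5e-5, Pr=0.71):
        """Blasius flat-plate references: local skin friction and Nusselt number."""
        Re_x = U * x / nu
        cf = 0.664 / np.sqrt(Re_x)                  # local skin friction coefficient
        Nu = 0.332 * np.sqrt(Re_x) * Pr ** (1 / 3)  # local Nusselt number
        return cf, Nu

    for x in (0.1, 0.5, 1.0):                       # streamwise stations [m]
        cf, Nu = flat_plate_laminar(x)
        print(f"x={x:4.1f} m  cf={cf:.5f}  Nu={Nu:8.1f}")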
NIFTY - Numerical Information Field Theory. A versatile PYTHON library for signal inference
NASA Astrophysics Data System (ADS)
Selig, M.; Bell, M. R.; Junklewitz, H.; Oppermann, N.; Reinecke, M.; Greiner, M.; Pachajoa, C.; Enßlin, T. A.
2013-06-01
NIFTy (Numerical Information Field Theory) is a software package designed to enable the development of signal inference algorithms that operate regardless of the underlying spatial grid and its resolution. Its object-oriented framework is written in Python, although it accesses libraries written in Cython, C++, and C for efficiency. NIFTy offers a toolkit that abstracts discretized representations of continuous spaces, fields in these spaces, and operators acting on fields into classes. Thereby, the correct normalization of operations on fields is taken care of automatically without concerning the user. This allows for an abstract formulation and programming of inference algorithms, including those derived within information field theory. Thus, NIFTy permits its user to rapidly prototype algorithms in 1D, and then apply the developed code in higher-dimensional settings of real world problems. The set of spaces on which NIFTy operates comprises point sets, n-dimensional regular grids, spherical spaces, their harmonic counterparts, and product spaces constructed as combinations of those. The functionality and diversity of the package is demonstrated by a Wiener filter code example that successfully runs without modification regardless of the space on which the inference problem is defined. NIFTy homepage http://www.mpa-garching.mpg.de/ift/nifty/; Excerpts of this paper are part of the NIFTy source code and documentation.
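The Wiener filter in the demonstration applies m = S (S + N)^{-1} d, which reduces to a mode-by-mode multiplication when both covariances are diagonal in harmonic space. The plain-numpy toy below (not the NIFTy API) shows that operation on a 1D periodic grid with invented covariances.

    import numpy as np

    rng = np.random.default_rng(1)
    npix = 256
    k = np.fft.fftfreq(npix) * npix
    S = 1.0 / (1.0 + k**2)**2       # toy diagonal signal covariance in Fourier space
    N = 0.1                         # white-noise variance

    # Toy signal drawn with roughly the assumed spectrum, plus white noise.
    signal = np.fft.ifft(np.sqrt(S) * np.fft.fft(rng.normal(size=npix))).real
    data = signal + np.sqrt(N) * rng.normal(size=npix)

    # Wiener filter m = S (S + N)^{-1} d, applied mode by mode in harmonic space.
    m = np.fft.ifft(S / (S + N) * np.fft.fft(data)).real
    print(np.std(data - signal), np.std(m - signal))   # reconstruction error shrinks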
Transcultural nursing care values, beliefs, and practices of American (USA) Gypsies.
Bodner, A; Leininger, M
1992-01-01
This ethnonursing qualitative investigation was focused on the domain of culture care values, expression and meanings of selected American Gypsies. The purpose of the study was to explicate culture care American Gypsy lifeways in order to help nurses understand this largely unknown culture, and to offer guidelines for providing culturally congruent nursing care. Leininger's theory of Culture Care Diversity and Universality was the appropriate theory to use for this study, along with the ethnonursing research method to generate emic and etic grounded data. Findings substantiated that the world view, ethnohistory, religion (moral code), kinship and cultural values, and generic folk practices were powerful influences of Gypsy lifeways and supported culture congruent nursing care. Ethnohistorical facts strongly buttressed the cultural values, norms, and moral codes for culture specific care practices. Several Gypsy culture specific and dominant care meanings, expressions, and actions were confirmed and made credible from raw data and thematic analysis. They were: 1) protective in-group caring; 2) watching over and guarding against Gadje; 3) facilitating care rituals; 4) respecting Gypsy values; 5) alleviating Gadje harassment; 6) remaining suspicious of outsiders; and 7) dealing with purity and impurity moral codes and rules. Culture specific and congruent care generated from Leininger's theory with the three predicted modes were identified to guide nursing decisions and actions.
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Fischer, Bernd
2009-01-01
Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA s Project Constellation.
Towards self-correcting quantum memories
NASA Astrophysics Data System (ADS)
Michnicki, Kamil
This thesis presents a model of self-correcting quantum memories where quantum states are encoded using topological stabilizer codes and error correction is done using local measurements and local dynamics. Quantum noise poses a practical barrier to developing quantum memories. This thesis explores two types of models for suppressing noise. One model suppresses thermalizing noise energetically by engineering a Hamiltonian with a high energy barrier between code states. Thermalizing dynamics are modeled phenomenologically as a Markovian quantum master equation with only local generators. The second model suppresses stochastic noise with a cellular automaton that performs error correction using syndrome measurements and a local update rule. Several ways of visualizing and thinking about stabilizer codes are presented in order to design ones that have a high energy barrier: the non-local Ising model, the quasi-particle graph, and the theory of welded stabilizer codes. I develop the theory of welded stabilizer codes and use it to construct a code with the highest known energy barrier in 3-d for spin Hamiltonians: the welded solid code. Although the welded solid code is not fully self-correcting, it has some self-correcting properties: its memory lifetime increases with system size up to a temperature-dependent maximum. One strategy for increasing the energy barrier is to mediate an interaction with an external system. I prove a no-go theorem for a class of Hamiltonians where the interaction terms are local, of bounded strength, and commute with the stabilizer group. Under these conditions the energy barrier can only be increased by a multiplicative constant. I develop a cellular automaton to perform error correction on a state encoded using the toric code. The numerical evidence indicates that while there is no threshold, the model can extend the memory lifetime significantly. While of less theoretical importance, this could be practical for real implementations of quantum memories. Numerical evidence also suggests that the cellular automaton could function as a decoder with a soft threshold.
Network analysis for the visualization and analysis of qualitative data.
Pokorny, Jennifer J; Norman, Alex; Zanesco, Anthony P; Bauer-Wu, Susan; Sahdra, Baljinder K; Saron, Clifford D
2018-03-01
We present a novel manner in which to visualize the coding of qualitative data that enables representation and analysis of connections between codes using graph theory and network analysis. Network graphs are created from codes applied to a transcript or audio file using the code names and their chronological location. The resulting network is a representation of the coding data that characterizes the interrelations of codes. This approach enables quantification of qualitative codes using network analysis and facilitates examination of associations of network indices with other quantitative variables using common statistical procedures. Here, as a proof of concept, we applied this method to a set of interview transcripts that had been coded in 2 different ways and the resultant network graphs were examined. The creation of network graphs allows researchers an opportunity to view and share their qualitative data in an innovative way that may provide new insights and enhance transparency of the analytical process by which they reach their conclusions. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
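A minimal version of the construction (nodes are code names, directed edges link chronologically adjacent code applications, and edge weights count transitions) can be sketched with networkx; the code sequence below is hypothetical.

    import networkx as nx

    # Chronologically ordered codes applied to one transcript (hypothetical data).
    codes = ["stress", "coping", "family", "coping", "stress", "work", "family"]

    G = nx.DiGraph()
    for a, b in zip(codes, codes[1:]):    # link codes that follow one another
        w = G.edges[a, b]["weight"] + 1 if G.has_edge(a, b) else 1
        G.add_edge(a, b, weight=w)

    # Network indices of the code graph can enter ordinary statistical analyses.
    print(nx.degree_centrality(G))
    print(sorted(G.edges(data="weight")))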
SATCOM antenna siting study on a P-3C using the NEC-BSC V3.1
NASA Technical Reports Server (NTRS)
Bensman, D.; Marhefka, R. J.
1990-01-01
The location of a UHF SATCOM antenna on a P-3C aircraft is studied using the NEC-Basic Scattering Code V3.1 (NEC-BSC3). The NEC-BSC3 is a computer code based on the uniform theory of diffraction. The code is first validated for this application using scale model measurements. In general, the comparisons are good except in 10-degree regions near the nose and tail of the aircraft. Patterns for various antenna locations are analyzed to achieve a prescribed performance.
NASA Technical Reports Server (NTRS)
Laughlin, Daniel
2008-01-01
Persistent Immersive Synthetic Environments (PISE) are not just connection points, they are meeting places. They are the new public squares, village centers, malt shops, malls and pubs all rolled into one. They come with a sense of 'thereness' that engages the mind like a real place does. It all starts as real code. The code defines "objects." The objects exist in computer space, known as the "grid." The objects and space combine to create a "place." A "world" is created. Before long, the grid and code become obscure, and the "world" maintains focus.
Interleaved concatenated codes: New perspectives on approaching the Shannon limit
Viterbi, A. J.; Viterbi, A. M.; Sindhushayana, N. T.
1997-01-01
The last few years have witnessed a significant decrease in the gap between the Shannon channel capacity limit and what is practically achievable. Progress has resulted from novel extensions of previously known coding techniques involving interleaved concatenated codes. A considerable body of simulation results is now available, supported by an important but limited theoretical basis. This paper presents a computational technique which further ties simulation results to the known theory and reveals a considerable reduction in the complexity required to approach the Shannon limit. PMID:11038568
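Interleaving is what lets the concatenated stages cooperate: it disperses channel error bursts so that the outer decoder sees isolated, correctable errors. A minimal block-interleaver sketch (turbo-style interleavers are pseudo-random rather than rectangular):

    import numpy as np

    def interleave(symbols, rows, cols):
        # Write row-wise, read column-wise.
        return np.asarray(symbols).reshape(rows, cols).T.ravel()

    def deinterleave(symbols, rows, cols):
        return np.asarray(symbols).reshape(cols, rows).T.ravel()

    data = np.arange(12)        # stand-in for coded symbols
    tx = interleave(data, 3, 4)
    tx[2:5] = -1                # a burst of channel errors hits 3 adjacent symbols
    rx = deinterleave(tx, 3, 4)
    print(rx)                   # the burst is scattered: no two corrupted symbols adjacent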
Modeling of the EAST ICRF antenna with ICANT Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Qin Chengming; Zhao Yanping; Colas, L.
2007-09-28
A Resonant Double Loop (RDL) antenna for the ion-cyclotron range of frequencies (ICRF) on the Experimental Advanced Superconducting Tokamak (EAST) is under construction. The new antenna is analyzed using the antenna coupling code ICANT, which self-consistently determines the surface currents on all antenna parts. In this work, the new ICRF antenna is modeled with this code to assess the near fields in front of the antenna and to analyze its coupling capabilities. Moreover, the antenna reactive radiated power computed by ICANT shows good agreement with that deduced from transmission line (TL) theory.
Modeling of the EAST ICRF antenna with ICANT Code
NASA Astrophysics Data System (ADS)
Qin, Chengming; Zhao, Yanping; Colas, L.; Heuraux, S.
2007-09-01
A Resonant Double Loop (RDL) antenna for the ion-cyclotron range of frequencies (ICRF) on the Experimental Advanced Superconducting Tokamak (EAST) is under construction. The new antenna is analyzed using the antenna coupling code ICANT, which self-consistently determines the surface currents on all antenna parts. In this work, the new ICRF antenna is modeled with this code to assess the near fields in front of the antenna and to analyze its coupling capabilities. Moreover, the antenna reactive radiated power computed by ICANT shows good agreement with that deduced from transmission line (TL) theory.
Functional dissociation of stimulus intensity encoding and predictive coding of pain in the insula
Geuter, Stephan; Boll, Sabrina; Eippert, Falk; Büchel, Christian
2017-01-01
The computational principles by which the brain creates a painful experience from nociception are still unknown. Classic theories suggest that cortical regions reflect either stimulus intensity or additive effects of intensity and expectations. By contrast, predictive coding theories provide a unified framework explaining how perception is shaped by the integration of beliefs about the world with mismatches resulting from the comparison of these beliefs against sensory input. Using functional magnetic resonance imaging during a probabilistic heat pain paradigm, we investigated which computations underlie pain perception. Skin conductance, pupil dilation, and anterior insula responses to cued pain stimuli strictly followed the response patterns hypothesized by the predictive coding model, whereas the posterior insula encoded stimulus intensity. This novel functional dissociation of pain processing within the insula, together with previously observed alterations in chronic pain, offers a novel interpretation of aberrant pain processing as disturbed weighting of predictions and prediction errors. DOI: http://dx.doi.org/10.7554/eLife.24770.001 PMID:28524817
Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction
Watanabe, Eiji; Kitaoka, Akiyoshi; Sakamoto, Kiwako; Yasugi, Masaki; Tanaka, Kenta
2018-01-01
The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one basis of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research. PMID:29599739
Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction.
Watanabe, Eiji; Kitaoka, Akiyoshi; Sakamoto, Kiwako; Yasugi, Masaki; Tanaka, Kenta
2018-01-01
The cerebral cortex predicts visual motion to adapt human behavior to surrounding objects moving in real time. Although the underlying mechanisms are still unknown, predictive coding is one of the leading theories. Predictive coding assumes that the brain's internal models (which are acquired through learning) predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. In the past year, deep neural networks based on predictive coding were reported for a video prediction machine called PredNet. If the theory substantially reproduces the visual information processing of the cerebral cortex, then PredNet can be expected to represent the human visual perception of motion. In this study, PredNet was trained with natural scene videos of the self-motion of the viewer, and the motion prediction ability of the obtained computer model was verified using unlearned videos. We found that the computer model accurately predicted the magnitude and direction of motion of a rotating propeller in unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception. While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion. This research supports the exciting idea that the mechanism assumed by the predictive coding theory is one basis of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.
Performance optimization of Qbox and WEST on Intel Knights Landing
NASA Astrophysics Data System (ADS)
Zheng, Huihuo; Knight, Christopher; Galli, Giulia; Govoni, Marco; Gygi, Francois
We present the optimization of the electronic structure codes Qbox and WEST targeting the Intel® Xeon Phi™ processor, codenamed Knights Landing (KNL). Qbox is an ab-initio molecular dynamics code based on plane wave density functional theory (DFT), and WEST is a post-DFT code for excited state calculations within many-body perturbation theory. Both Qbox and WEST employ highly scalable algorithms which enable accurate large-scale electronic structure calculations on leadership class supercomputer platforms beyond 100,000 cores, such as Mira and Theta at the Argonne Leadership Computing Facility. In this work, features of the KNL architecture (e.g. hierarchical memory) are explored to achieve higher performance in key algorithms of the Qbox and WEST codes and to develop a road-map for further development targeting next-generation computing architectures. In particular, the optimizations of the Qbox and WEST codes on the KNL platform will target efficient large-scale electronic structure calculations of nanostructured materials exhibiting complex structures and prediction of their electronic and thermal properties for use in solar and thermal energy conversion devices. This work was supported by MICCoM, as part of the Computational Materials Science Program funded by the U.S. DOE, Office of Science, BES, MSE Division. This research used resources of the ALCF, which is a DOE Office of Science User Facility under Contract DE-AC02-06CH11357.
Momeni, Ali; Rouhi, Kasra; Rajabalipanah, Hamid; Abdolali, Ali
2018-04-18
Inspired by information theory, a new concept of re-programmable encrypted graphene-based coding metasurfaces was investigated at terahertz frequencies. A channel-coding function was proposed to convolutionally record an arbitrary information message onto unrecognizable but recoverable parity beams generated by a phase-encrypted coding metasurface. A single graphene-based reflective cell with dual-mode biasing voltages was designed to act as the "0" and "1" meta-atoms, providing broadband opposite reflection phases. By exploiting graphene tunability, the proposed scheme enabled an unprecedented degree of freedom in the real-time mapping of information messages onto multiple parity beams which could not be damaged, altered, or reverse-engineered. Various encryption types such as mirroring, anomalous reflection, multi-beam generation, and scattering diffusion can be dynamically attained via our multifunctional metasurface. Besides, contrary to conventional time-consuming and optimization-based methods, this paper offers a fast, straightforward, and efficient design of diffusion metasurfaces of arbitrarily large size. Rigorous full-wave simulations corroborated the results: the phase-encrypted metasurfaces exhibited a polarization-insensitive reflectivity of less than -10 dB over a broadband frequency range from 1 THz to 1.7 THz. This work reveals new opportunities for the extension of re-programmable THz coding metasurfaces and may be of interest for reflection-type security systems, computational imaging, and camouflage technology.
Keogh, Alison; Tully, Mark A; Matthews, James; Hurley, Deirdre A
2015-12-01
Medical Research Council (MRC) guidelines recommend applying theory within complex interventions to explain how behaviour change occurs. Guidelines endorse self-management of chronic low back pain (CLBP) and osteoarthritis (OA), but evidence for its effectiveness is weak. This literature review aimed to determine the use of behaviour change theory and techniques within randomised controlled trials of group-based self-management programmes for chronic musculoskeletal pain, specifically CLBP and OA. A two-phase search strategy of electronic databases was used to identify systematic reviews and studies relevant to this area. Articles were coded for their use of behaviour change theory, and the number of behaviour change techniques (BCTs) was identified using the 93-item BCT Taxonomy (v1). 25 articles of 22 studies met the inclusion criteria, of which only three reported having based their intervention on theory, all using Social Cognitive Theory. A total of 33 BCTs were coded across all articles, the most commonly identified techniques being 'instruction on how to perform the behaviour', 'demonstration of the behaviour', 'behavioural practice', 'credible source', 'graded tasks' and 'body changes'. Results demonstrate that theoretically driven research within group-based self-management programmes for chronic musculoskeletal pain is lacking, or is poorly reported. Future research that follows recommended guidelines regarding the use of theory in study design and reporting is warranted. Copyright © 2015 Elsevier Ltd. All rights reserved.
Nonadiabatic Dynamics for Electrons at Second-Order: Real-Time TDDFT and OSCF2.
Nguyen, Triet S; Parkhill, John
2015-07-14
We develop a new model to simulate nonradiative relaxation and dephasing by combining real-time Hartree-Fock and density functional theory (DFT) with our recent open-systems theory of electronic dynamics. The approach has some key advantages: it has been systematically derived and properly relaxes noninteracting electrons to a Fermi-Dirac distribution. This paper combines the new dissipation theory with an atomistic, all-electron quantum chemistry code and an atom-centered model of the thermal environment. The environment is represented nonempirically and is dependent on molecular structure in a nonlocal way. A production quality, O(N^3) closed-shell implementation of our theory applicable to realistic molecular systems is presented, including timing information. This scaling implies that the added cost of our nonadiabatic relaxation model, time-dependent open self-consistent field at second order (OSCF2), is computationally inexpensive, relative to adiabatic propagation of real-time time-dependent Hartree-Fock (TDHF) or time-dependent density functional theory (TDDFT). Details of the implementation and numerical algorithm, including factorization and efficiency, are discussed. We demonstrate that OSCF2 approaches the stationary self-consistent field (SCF) ground state when the gap is large relative to k_BT. The code is used to calculate linear-response spectra including the effects of bath dynamics. Finally, we show how our theory of finite-temperature relaxation can be used to correct ground-state DFT calculations.
Beer, M; Nohria, N
2000-01-01
Today's fast-paced economy demands that businesses change or die. But few companies manage corporate transformations as well as they would like. The brutal fact is that about 70% of all change initiatives fail. In this article, authors Michael Beer and Nitin Nohria describe two archetypes--or theories--of corporate transformation that may help executives crack the code of change. Theory E is change based on economic value: shareholder value is the only legitimate measure of success, and change often involves heavy use of economic incentives, layoffs, downsizing, and restructuring. Theory O is change based on organizational capability: the goal is to build and strengthen corporate culture. Most companies focus purely on one theory or the other, or haphazardly use a mix of both, the authors say. Combining E and O is directionally correct, they contend, but it requires a careful, conscious integration plan. Beer and Nohria present the examples of two companies, Scott Paper and Champion International, that used a purely E or purely O strategy to create change--and met with limited levels of success. They contrast those corporate transformations with that of UK-based retailer ASDA, which has successfully embraced the paradox between the opposing theories of change and integrated E and O. The lesson from ASDA? To thrive and adapt in the new economy, companies must make sure the E and O theories of business change are in sync at their own organizations.
Electromagnetic Dissociation Cross Sections using Weisskopf-Ewing Theory
NASA Technical Reports Server (NTRS)
Adamczyk, Anne M.; Norbury, John W.
2011-01-01
It is important that accurate estimates of crew exposure to radiation are obtained for future long-term space missions. Presently, several space radiation transport codes exist to predict the radiation environment, all of which take as input particle interaction cross sections that describe the nuclear interactions between the particles and the shielding material. The space radiation transport code HZETRN uses the nuclear fragmentation model NUCFRG2 to calculate Electromagnetic Dissociation (EMD) cross sections. Currently, NUCFRG2 employs energy-independent branching ratios to calculate these cross sections. Using Weisskopf-Ewing (WE) theory to calculate branching ratios, however, is more advantageous than the method currently employed in NUCFRG2. The WE theory can describe not only neutron and proton emission, as in the energy-independent branching ratio formalism used in NUCFRG2, but also deuteron, triton, helion, and alpha particle emission. These particles can contribute significantly to total exposure estimates. In this work, photonuclear cross sections are calculated using WE theory and the energy-independent branching ratios used in NUCFRG2 and then compared to experimental data. It is found that WE theory gives comparable, and in most cases better, agreement with the data than the energy-independent branching ratios. Furthermore, EMD cross sections for single neutron, proton, and alpha particle removal are calculated using WE theory and an energy-independent branching ratio used in NUCFRG2 and compared to experimental data.
ERIC Educational Resources Information Center
Namaghi, Seyyed Ali Ostovar; Moghaddam, Mohammad Reza Saboor; Tajzad, Maryam
2014-01-01
The purpose of this study is to explore language teachers' perspectives on Iranian third grade senior high school EFL textbook, which is prescribed by the Ministry of Education. In data collection and analysis, the researchers used theoretical sampling and the coding schemes presented in grounded theory. Final analysis yielded "Negative…
Applying Modern Stage Theory to Mauritania: A Prescription to Encourage Entrepreneurship
2014-12-01
Keywords: entrepreneurship, stage theory, development, Africa, factor-driven, trade freedom, business freedom. Naval Postgraduate School thesis, 77 pages; author: Jennifer M. Warren, December 2014.
Left-Right Coding of Past and Future in Language: The Mental Timeline during Sentence Processing
ERIC Educational Resources Information Center
Ulrich, Rolf; Maienborn, Claudia
2010-01-01
The metaphoric mapping theory suggests that abstract concepts, like time, are represented in terms of concrete dimensions such as space. This theory receives support from several lines of research ranging from psychophysics to linguistics and cultural studies; especially strong support comes from recent response time studies. These studies have…
NASA Technical Reports Server (NTRS)
Cassenti, B. N.
1983-01-01
The results of a 10-month research and development program for nonlinear structural modeling with advanced time-temperature constitutive relationships are presented. The implementation of the theory in the MARC nonlinear finite element code is discussed, and instructions for the computational application of the theory are provided.
Attachment, Parent-Child Discourse and Theory-of-Mind Development
ERIC Educational Resources Information Center
Ontai, Lenna L.; Thompson, Ross A.
2008-01-01
This study investigated the relations among attachment, mother-child discourse, and theory of mind in a sample of 76 four-year-old children (mean age = 4.48 years; 36 boys). Mother-child conversations about a past event were coded for maternal use of elaborative discourse and mothers' references to mental states. Mothers completed the attachment…
[Introduction to grounded theory].
Wang, Shou-Yu; Windsor, Carol; Yates, Patsy
2012-02-01
Grounded theory, first developed by Glaser and Strauss in the 1960s, was introduced into nursing education as a distinct research methodology in the 1970s. The theory is grounded in a critique of the dominant contemporary approach to social inquiry, which imposed "enduring" theoretical propositions onto study data. Rather than starting from a set theoretical framework, grounded theory relies on researchers distinguishing meaningful constructs from generated data and then identifying an appropriate theory. Grounded theory is thus particularly useful in investigating complex issues and behaviours not previously addressed and concepts and relationships in particular populations or places that are still undeveloped or weakly connected. Grounded theory data analysis processes include open, axial and selective coding levels. The purpose of this article was to explore the grounded theory research process and provide an initial understanding of this methodology.
A novel construction method of QC-LDPC codes based on CRT for optical communications
NASA Astrophysics Data System (ADS)
Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu
2016-05-01
A novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the Chinese remainder theorem (CRT). The method can not only increase the code length without reducing the girth, but also greatly enhance the code rate, so it is easy to construct a high-rate code. The simulation results show that at a bit error rate (BER) of 10^-7, the net coding gain (NCG) of the regular QC-LDPC(4851, 4546) code is 2.06 dB, 1.36 dB, 0.53 dB and 0.31 dB higher, respectively, than those of the classic RS(255, 239) code in ITU-T G.975, the LDPC(32640, 30592) code in ITU-T G.975.1, the QC-LDPC(3664, 3436) code constructed by the improved combining construction method based on CRT, and the irregular QC-LDPC(3843, 3603) code constructed by the construction method based on the Galois field (GF(q)) multiplicative group. Furthermore, all five codes have the same code rate of 0.937. Therefore, the regular QC-LDPC(4851, 4546) code constructed by the proposed construction method has excellent error-correction performance, and can be more suitable for optical transmission systems.
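To make the role of the Chinese remainder theorem concrete, the sketch below (Python, illustrative only; not the authors' exact construction) shows how CRT lifts the shift exponents of two small circulant permutation matrices with coprime sizes to a single exponent of a larger circulant, which is the basic mechanism that lets such constructions grow the code length without shrinking the girth.

```python
from math import prod

def crt(residues, moduli):
    """Solve x = r_i (mod m_i) for pairwise-coprime moduli m_i."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # pow(Mi, -1, m): modular inverse
    return x % M

# Shift exponents 2 (mod 5) and 3 (mod 7) of two small circulants combine
# into a single shift exponent for a circulant of size 35.
print(crt([2, 3], [5, 7]))   # -> 17; 17 % 5 == 2 and 17 % 7 == 3
```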
NASA Technical Reports Server (NTRS)
Bade, W. L.; Yos, J. M.
1975-01-01
The present, third volume of the final report is a programmer's manual for the code. It provides a listing of the FORTRAN 4 source program; a complete glossary of FORTRAN symbols; a discussion of the purpose and method of operation of each subroutine (including mathematical analyses of special algorithms); and a discussion of the operation of the code on IBM/360 and UNIVAC 1108 systems, including required control cards and the overlay structure used to accommodate the code to the limited core size of the 1108. In addition, similar information is provided to document the programming of the NOZFIT code, which is employed to set up nozzle profile curvefits for use in NATA.
Potential flow theory and operation guide for the panel code PMARC
NASA Technical Reports Server (NTRS)
Ashby, Dale L.; Dudley, Michael R.; Iguchi, Steve K.; Browne, Lindsey; Katz, Joseph
1991-01-01
The theoretical basis for PMARC, a low-order potential-flow panel code for modeling complex three-dimensional geometries, is outlined. Several of the advanced features currently included in the code, such as internal flow modeling, a simple jet model, and a time-stepping wake model, are discussed in some detail. The code is written using adjustable size arrays so that it can be easily redimensioned for the size problem being solved and the computer hardware being used. An overview of the program input is presented, with a detailed description of the input available in the appendices. Finally, PMARC results for a generic wing/body configuration are compared with experimental data to demonstrate the accuracy of the code. The input file for this test case is given in the appendices.
The Helicopter Antenna Radiation Prediction Code (HARP)
NASA Technical Reports Server (NTRS)
Klevenow, F. T.; Lynch, B. G.; Newman, E. H.; Rojas, R. G.; Scheick, J. T.; Shamansky, H. T.; Sze, K. Y.
1990-01-01
The first nine months effort in the development of a user oriented computer code, referred to as the HARP code, for analyzing the radiation from helicopter antennas is described. The HARP code uses modern computer graphics to aid in the description and display of the helicopter geometry. At low frequencies the helicopter is modeled by polygonal plates, and the method of moments is used to compute the desired patterns. At high frequencies the helicopter is modeled by a composite ellipsoid and flat plates, and computations are made using the geometrical theory of diffraction. The HARP code will provide a user friendly interface, employing modern computer graphics, to aid the user to describe the helicopter geometry, select the method of computation, construct the desired high or low frequency model, and display the results.
NASA Astrophysics Data System (ADS)
Leukhin, Anatolii N.
2005-08-01
An algebraic solution of the 'complex' problem of synthesis of phase-coded (PC) sequences with zero side lobes of the cyclic autocorrelation function (ACF) is proposed. It is shown that the solution of the synthesis problem is connected with the existence of difference sets for a given code dimension. The problem of estimating the number of possible code combinations for a given code dimension is solved. It is pointed out that the problem of synthesis of PC sequences is related to fundamental problems of discrete mathematics and, first of all, to a number of combinatorial problems which, like the integer factorisation problem, can be solved by algebraic methods using the theory of Galois fields and groups.
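As a concrete numerical check of the defining property, the sketch below verifies zero cyclic-ACF side lobes for a Zadoff-Chu sequence, one well-known polyphase family with this property; the paper's algebraic construction via difference sets is more general, and this example is only an illustration.

```python
import numpy as np

def cyclic_acf(s):
    """Cyclic autocorrelation via the Wiener-Khinchin relation."""
    S = np.fft.fft(s)
    return np.fft.ifft(S * np.conj(S))

N, u = 13, 1                     # odd length, root index coprime to N
n = np.arange(N)
zc = np.exp(-1j * np.pi * u * n * (n + 1) / N)   # Zadoff-Chu phase code

acf = np.abs(cyclic_acf(zc))
print(np.round(acf, 10))         # N at zero lag, ~0 at every other lag
```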
Black, Nicola; Mullan, Barbara; Sharpe, Louise
2016-09-01
The current aim was to examine the effectiveness of behaviour change techniques (BCTs), theory and other characteristics in increasing the effectiveness of computer-delivered interventions (CDIs) to reduce alcohol consumption. Included were randomised studies with a primary aim of reducing alcohol consumption, which compared self-directed CDIs to assessment-only control groups. CDIs were coded for the use of 42 BCTs from an alcohol-specific taxonomy, the use of theory according to a theory coding scheme, and general characteristics such as the length of the CDI. Effectiveness of CDIs was assessed using random-effects meta-analysis, and the association between the moderators and effect size was assessed using univariate and multivariate meta-regression. Ninety-three CDIs were included in at least one analysis and produced small, significant effects on five outcomes (d+ = 0.07-0.15). Larger effects occurred with some personal contact, provision of normative information or feedback on performance, prompting commitment or goal review, the social norms approach, and in samples with more women. Smaller effects occurred when information on the consequences of alcohol consumption was provided. These findings can be used to inform both intervention and theory development. Intervention developers should focus on including specific, effective techniques rather than many techniques or more elaborate approaches.
The PLUTO code for astrophysical gasdynamics.
NASA Astrophysics Data System (ADS)
Mignone, A.
Present numerical codes appeal to a consolidated theory based on finite-difference and Godunov-type schemes. In this context we have developed a versatile numerical code, PLUTO, suitable for the solution of high-Mach-number flows in 1, 2 and 3 spatial dimensions and different systems of coordinates. Different hydrodynamic modules and algorithms may be independently selected to properly describe Newtonian, relativistic, MHD, or relativistic MHD fluids. The modular structure exploits a general framework for integrating a system of conservation laws, built on modern Godunov-type shock-capturing schemes. The code is freely distributed under the GNU public license and is available for download to the astrophysical community at the URL http://plutocode.to.astro.it.
NASA Technical Reports Server (NTRS)
Hartenstein, Richard G., Jr.
1985-01-01
Computer codes have been developed to analyze antennas on aircraft and in the presence of scatterers. The purpose of this study is to use these codes to develop accurate computer models of various aircraft and antenna systems. The antenna systems analyzed are a P-3B L-Band antenna, an A-7E UHF relay pod antenna, and traffic advisory antenna system installed on a Bell Long Ranger helicopter. Computer results are compared to measured ones with good agreement. These codes can be used in the design stage of an antenna system to determine the optimum antenna location and save valuable time and costly flight hours.
Cooperative optimization and their application in LDPC codes
NASA Astrophysics Data System (ADS)
Chen, Ke; Rong, Jian; Zhong, Xiaochun
2008-10-01
Cooperative optimization is a new approach to finding global optima of complicated functions of many variables. The proposed algorithm belongs to the class of message-passing algorithms and has solid theoretical foundations. It can achieve good coding gains over the sum-product algorithm for LDPC codes. For the (6561, 4096) LDPC code, the proposed algorithm achieves a 2.0 dB gain over the sum-product algorithm at a BER of 4×10^-7. The decoding complexity of the proposed algorithm is lower than that of the sum-product algorithm; furthermore, it achieves a much lower error floor once Eb/N0 exceeds 1.8 dB.
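For readers unfamiliar with iterative LDPC decoding, the sketch below shows a minimal bit-flipping decoder, a much simpler relative of the message-passing decoders (sum-product, cooperative optimization) compared in the abstract; the parity-check matrix and flipping rule are illustrative stand-ins, not the codes or algorithm from the paper.

```python
import numpy as np

def bit_flip_decode(H, r, max_iter=50):
    """Iteratively flip the bit involved in the most failed parity checks."""
    r = r.copy()
    for _ in range(max_iter):
        syndrome = H @ r % 2
        if not syndrome.any():
            return r                  # all parity checks satisfied
        fails = syndrome @ H          # per-bit count of failed checks
        r[np.argmax(fails)] ^= 1      # flip the most suspicious bit
    return r

# Toy parity-check matrix (the (7,4) Hamming code, standing in for an LDPC H).
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
c = np.zeros(7, dtype=int)            # the all-zero codeword
r = c.copy(); r[2] ^= 1               # one channel error
print(bit_flip_decode(H, r))          # recovers the all-zero codeword
```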
NASA Astrophysics Data System (ADS)
Okamoto, Kazuhisa; Nonaka, Chiho
2017-06-01
We construct a new relativistic viscous hydrodynamics code optimized in the Milne coordinates. We split the conservation equations into an ideal part and a viscous part using the Strang splitting method. In the code a Riemann solver based on the two-shock approximation is utilized for the ideal part and the Piecewise Exact Solution (PES) method is applied for the viscous part. We check the validity of our numerical calculations by comparing against analytical solutions: the viscous Bjorken flow and the Israel-Stewart theory in the Gubser flow regime. Using the code, we discuss the possible development of the Kelvin-Helmholtz instability in high-energy heavy-ion collisions.
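The Strang splitting pattern referred to above can be illustrated on a toy linear system: advance one operator a half step, the other a full step, then the first a half step again, which is second-order accurate in the step size. The two matrices below are arbitrary stand-ins for the ideal and viscous parts; the real solvers (two-shock Riemann solver, PES) are far more elaborate.

```python
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [-1.0, 0.0]])    # "ideal" part: rotation generator
B = np.array([[-0.3, 0.0], [0.0, -0.1]])   # "viscous" part: anisotropic damping

dt, nsteps = 0.05, 200
half_A = expm(A * dt / 2)                  # precompute the sub-step propagators
full_B = expm(B * dt)

x = np.array([1.0, 0.0])
for _ in range(nsteps):
    x = half_A @ (full_B @ (half_A @ x))   # Strang: A/2, B, A/2

exact = expm((A + B) * dt * nsteps) @ np.array([1.0, 0.0])
print(x, exact)                            # agree to O(dt^2) per unit time
```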
Two Upper Bounds for the Weighted Path Length of Binary Trees. Report No. UIUCDCS-R-73-565.
ERIC Educational Resources Information Center
Pradels, Jean Louis
Rooted binary trees with weighted nodes are structures encountered in many areas, such as coding theory, searching and sorting, information storage and retrieval. The path length is a meaningful quantity which gives indications about the expected time of a search or the length of a code, for example. In this paper, two sharp bounds for the total…
Review of finite fields: Applications to discrete Fourier transforms and Reed-Solomon coding
NASA Technical Reports Server (NTRS)
Wong, J. S. L.; Truong, T. K.; Benjauthrit, B.; Mulhall, B. D. L.; Reed, I. S.
1977-01-01
An attempt is made to provide a step-by-step approach to the subject of finite fields. Rigorous proofs and highly theoretical materials are avoided. The simple concepts of groups, rings, and fields are discussed and developed more or less heuristically. Examples are used liberally to illustrate the meaning of definitions and theories. Applications include discrete Fourier transforms and Reed-Solomon coding.
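A small worked example in the spirit of the review: a discrete Fourier transform over the prime field GF(17), where all arithmetic is exact and the transform inverts without rounding error. The field, primitive element, and length are chosen for illustration only.

```python
# Arithmetic over GF(17): 3 is a primitive element (order 16), so a length-16
# DFT exists over the field and inverts exactly, with no rounding error.
p, alpha, n = 17, 3, 16

def ff_dft(x):
    return [sum(xj * pow(alpha, j * k, p) for j, xj in enumerate(x)) % p
            for k in range(n)]

def ff_idft(X):
    inv_n = pow(n, -1, p)                 # modular inverse of the length
    inv_alpha = pow(alpha, -1, p)
    return [(inv_n * sum(Xk * pow(inv_alpha, k * j, p)
                         for k, Xk in enumerate(X))) % p
            for j in range(n)]

x = list(range(16))
assert ff_idft(ff_dft(x)) == x            # exact round trip over the field
```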
A Dual Coding Theoretical Model of Decoding in Reading: Subsuming the LaBerge and Samuels Model
ERIC Educational Resources Information Center
Sadoski, Mark; McTigue, Erin M.; Paivio, Allan
2012-01-01
In this article we present a detailed Dual Coding Theory (DCT) model of decoding. The DCT model reinterprets and subsumes The LaBerge and Samuels (1974) model of the reading process which has served well to account for decoding behaviors and the processes that underlie them. However, the LaBerge and Samuels model has had little to say about…
NASA Technical Reports Server (NTRS)
Srokowski, A. J.
1978-01-01
The problem of obtaining accurate estimates of suction requirements on swept laminar flow control wings was discussed. A fast accurate computer code developed to predict suction requirements by integrating disturbance amplification rates was described. Assumptions and approximations used in the present computer code are examined in light of flow conditions on the swept wing which may limit their validity.
NASA Astrophysics Data System (ADS)
Cui, Tie Jun; Wu, Rui Yuan; Wu, Wei; Shi, Chuan Bo; Li, Yun Bo
2017-10-01
We propose fast and accurate designs for large-scale and low-profile transmission-type anisotropic coding metasurfaces with multiple functions at millimeter-wave frequencies based on the antenna-array method. The numerical simulation of an anisotropic coding metasurface with a size of 30λ × 30λ by the proposed method takes only 20 min, a task that commercial software cannot complete because of its huge memory usage on personal computers. To inspect the performance of coding metasurfaces in the millimeter-wave band, the working frequency is chosen as 60 GHz. Based on convolution operations and holographic theory, the proposed multifunctional anisotropic coding metasurface exhibits different effects under y-polarized and x-polarized incidences. This study extends the frequency range of coding metasurfaces, filling the gap between the microwave and terahertz bands and implying promising applications in millimeter-wave communication and imaging.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitzpatrick, Richard
2007-09-24
Dr. Fitzpatrick has written an MHD code in order to investigate the interaction of tearing modes with flow and external magnetic perturbations, which has been successfully benchmarked against both linear and nonlinear theory and used to investigate error-field penetration in flowing plasmas. The same code was used to investigate the so-called Taylor problem. He employed the University of Chicago's FLASH code to further investigate the Taylor problem, discovering a new aspect of the problem. Dr. Fitzpatrick has written a 2-D Hall MHD code and used it to investigate the collisionless Taylor problem. Dr. Waelbroeck has performed an investigation of the scaling of the error-field penetration threshold in collisionless plasmas. Paul Watson and Dr. Fitzpatrick have written a fully-implicit extended-MHD code using the PETSC framework. Five publications have resulted from this grant work.
2017-05-04
Equilibrium Structures and Absorption Spectra for SixOy-nH2O Molecular Clusters using Density Functional Theory. L. Huang, S.G. Lambrakos, and L. Massa, Naval Research Laboratory, Washington, DC 20375-5320, NRL/MR/6390--17-9723. ...and time-dependent density functional theory (TD-DFT). The size of the clusters considered is relatively large compared to those considered in
The Development of the Theory and Doctrine of Operational Art in the American Army, 1920-1940
1988-03-22
By Major Michael R. Matheny (Armor). Subject terms: Operational Art, American Army.
Direct measurement of the image displacement instability in a linear induction accelerator
NASA Astrophysics Data System (ADS)
Burris-Mog, T. J.; Ekdahl, C. A.; Moir, D. C.
2017-06-01
The image displacement instability (IDI) has been measured on the 20 MeV Axis I of the dual axis radiographic hydrodynamic test facility and compared to theory. A 0.23 kA electron beam was accelerated across 64 gaps in a low solenoid focusing field, and the position of the beam centroid was measured out to 34.3 meters downstream of the cathode. One beam dynamics code was used to model the IDI from first principles, while another code characterized the effects of the resistive wall instability and the beam break-up (BBU) instability. Although the BBU instability was not found to influence the IDI, it appears that the IDI influences the BBU. Because the BBU theory does not fully account for the dependence on beam position for coupling to cavity transverse magnetic modes, the effect of the IDI is missing from the BBU theory. This becomes of particular concern to users of linear induction accelerators operating at or near low magnetic guide-field tunes.
Is Trauma Memory Special? Trauma Narrative Fragmentation in PTSD: Effects of Treatment and Response.
Bedard-Gilligan, Michele; Zoellner, Lori A; Feeny, Norah C
2017-03-01
Seminal theories posit that fragmented trauma memories are critical to posttraumatic stress disorder (PTSD; van der Kolk & Fisler, 1995; Brewin, 2014) and that elaboration of the trauma narrative is necessary for recovery (e.g., Foa, Huppert, & Cahill, 2006). According to fragmentation theories, trauma narrative changes, particularly for those receiving trauma-focused treatment, should accompany symptom reduction. Trauma and control narratives in 77 men and women with chronic PTSD were examined pre- and post-treatment, comparing prolonged exposure (PE) and sertraline. Utilizing self-report, rater coding, and objective coding of narrative content, fragmentation was compared across narrative types (trauma, negative, positive) by treatment modality and response, controlling for potential confounds. Although sensory components increased with PE (d = 0.23-0.44), there were no consistent differences in fragmentation from pre- to post-treatment between PE and sertraline or between treatment responders and non-responders. Contrary to theories, changes in fragmentation may not be a crucial mechanism underlying PTSD therapeutic recovery.
Provably secure identity-based identification and signature schemes from code assumptions.
Song, Bo; Zhao, Yiming
2017-01-01
Code-based cryptography is one of the few alternatives supposed to be secure in a post-quantum world. Meanwhile, identity-based identification and signature (IBI/IBS) schemes are two of the most fundamental cryptographic primitives, so several code-based IBI/IBS schemes have been proposed. However, as research on coding theory has deepened, the security reductions and efficiency of such schemes have been invalidated and challenged. In this paper, we construct provably secure IBI/IBS schemes from code assumptions against impersonation under active and concurrent attacks through a provably secure code-based signature technique proposed by Preetha, Vasant and Rangan (PVR signature), and a security enhancement Or-proof technique. We also present the parallel-PVR technique to decrease parameter values while maintaining the standard security level. Compared to other code-based IBI/IBS schemes, our schemes achieve not only preferable public parameter size, private key size, communication cost and signature length due to better parameter choices, but also provable security.
Vollmer Dahlke, Deborah; Fair, Kayla; Hong, Y Alicia; Beaudoin, Christopher E; Pulczinski, Jairus; Ory, Marcia G
2015-03-27
Thousands of mobile health apps are now available for use on mobile phones for a variety of uses and conditions, including cancer survivorship. Many of these apps appear to deliver health behavior interventions but may fail to consider design considerations based in human computer interface and health behavior change theories. This study is designed to assess the presence of and manner in which health behavior change and health communication theories are applied in mobile phone cancer survivorship apps. The research team selected a set of criteria-based health apps for mobile phones and used qualitative coding methods to assess each app's application of health behavior change and communication theories. Each app was assessed using a coding scheme derived from the taxonomy of 26 health behavior change techniques by Abraham and Michie, with a few important changes based on the characteristics of mHealth apps that are specific to information processing and human computer interaction, such as control theory and feedback systems. A total of 68 mobile phone apps and games built on the iOS and Android platforms were coded, with 65 being unique. Using Cohen's kappa, the inter-rater reliability for the iOS apps was 86.1 (P<.001) and for the Android apps, 77.4 (P<.001). For the most part, the scores for inclusion of theory-based health behavior change characteristics in the iOS platform cancer survivorship apps were consistently higher than those of the Android platform apps. For personalization and tailoring, 67% of the iOS apps (24/36) had these elements as compared to 38% of the Android apps (12/32). In the area of prompting for intention formation, 67% of the iOS apps (34/36) indicated these elements as compared to 16% (5/32) of the Android apps. Mobile apps are rapidly emerging as a way to deliver health behavior change interventions that can be tailored or personalized for individuals. As these apps and games continue to evolve and include interactive and adaptive sensors and other forms of dynamic feedback, their content and interventional elements need to be grounded in human computer interface design and health behavior and communication theory and practice.
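For readers unfamiliar with the inter-rater statistic reported above, the sketch below computes Cohen's kappa for two hypothetical raters' binary codes; the data are made up and the original study's rating sheets are not reproduced here.

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters coding the same items."""
    n = len(r1)
    po = sum(a == b for a, b in zip(r1, r2)) / n               # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    pe = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / n**2  # chance agreement
    return (po - pe) / (1 - pe)

rater1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # made-up presence/absence codes
rater2 = [1, 1, 0, 0, 0, 1, 1, 0, 1, 1]
print(round(cohens_kappa(rater1, rater2), 3))   # -> 0.783
```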
Laser program annual report, 1979
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, L.W.; Strack, J.R.
1980-03-01
This volume contains four sections that cover the areas of target design, target fabrication, diagnostics, and experiments. Section 3 reports on target design activities, plasma theory and simulation, code development, and atomic theory. Section 4 presents the accomplishments of the target fabrication group, and Section 5 presents results of diagnostic developments and applications for the year. The results of laser-target experiments are presented. (MOW)
Graphic Novels as Great Books: A Grounded Theory Study of Faculty Teaching Graphic Novels
ERIC Educational Resources Information Center
Evans-Boniecki, Jeannie
2013-01-01
This Glaserian grounded theory study, through conceptual coding of interviews and course syllabi, aimed at exploring the motivations and aspirations university professors had when they offered courses dedicated to the study of graphic novels. As a result, the emergence of the graphic novel as a vital literary influence in 21st-century academia was…
ERIC Educational Resources Information Center
Vigliocco, Gabriella; Kousta, Stavroula; Vinson, David; Andrews, Mark; Del Campo, Elena
2013-01-01
In Kousta, Vigliocco, Vinson, Andrews, and Del Campo (2011), we presented an embodied theory of semantic representation, which crucially included abstract concepts as internally embodied via affective states. Paivio (2013) took issue with our treatment of dual coding theory, our reliance on data from lexical decision, and our theoretical proposal.…
Using Combinatorica/Mathematica for Student Projects in Random Graph Theory
ERIC Educational Resources Information Center
Pfaff, Thomas J.; Zaret, Michele
2006-01-01
We give an example of a student project that experimentally explores a topic in random graph theory. We use the "Combinatorica" package in "Mathematica" to estimate the minimum number of edges needed in a random graph to have a 50 percent chance that the graph is connected. We provide the "Mathematica" code and compare it to the known theoretical…
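A rough re-creation of the experiment in Python (the article itself uses Mathematica's Combinatorica package, and the networkx package is assumed here): sample G(n, m) random graphs and raise m until at least half of the samples are connected. The sample sizes and the comparison against the (n/2)·ln n threshold are illustrative choices.

```python
import math
import networkx as nx

def p_connected(n, m, trials=200):
    """Fraction of G(n, m) samples that come out connected."""
    hits = sum(nx.is_connected(nx.gnm_random_graph(n, m)) for _ in range(trials))
    return hits / trials

n, m = 30, 30
while p_connected(n, m) < 0.5:
    m += 1
print(m, (n / 2) * math.log(n))   # empirical threshold vs the (n/2) ln n estimate
```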
ERIC Educational Resources Information Center
Einsiedler, Wolfgang
1996-01-01
Asks whether theories of knowledge representation provide a basis for the development of theories of knowledge structuring in instruction. Discusses codes of knowledge, surface versus deep structures, semantic networks, and multiple memory systems. Reviews research on teaching, external representation of cognitive structures, hierarchical…
NASA Astrophysics Data System (ADS)
Semenov, Alexander; Babikov, Dmitri
2013-11-01
We formulated the mixed quantum/classical theory for rotationally and vibrationally inelastic scattering in the diatomic-molecule + atom system. Two versions of the theory are presented: the first in the space-fixed and the second in the body-fixed reference frame. The first version is easy to derive and the resultant equations of motion are transparent, but the state-to-state transition matrix is complex-valued and dense. Such calculations may be computationally demanding for heavier molecules and/or higher temperatures, when the number of accessible channels becomes large. In contrast, the second version of the theory requires some tedious derivations and the final equations of motion are rather complicated (not particularly intuitive). However, the state-to-state transitions are driven by real-valued sparse matrices of much smaller size. Thus, this formulation is the method of choice from the computational point of view, while the space-fixed formulation can serve as a test of the body-fixed equations of motion and the code. Rigorous numerical tests were carried out for a model system to ensure that all equations, matrices, and computer codes in both formulations are correct.
Feynman rules for the Standard Model Effective Field Theory in R ξ -gauges
NASA Astrophysics Data System (ADS)
Dedes, A.; Materkowska, W.; Paraskevas, M.; Rosiek, J.; Suxho, K.
2017-06-01
We assume that New Physics effects are parametrized within the Standard Model Effective Field Theory (SMEFT) written in a complete basis of gauge invariant operators up to dimension 6, commonly referred to as "Warsaw basis". We discuss all steps necessary to obtain a consistent transition to the spontaneously broken theory and several other important aspects, including the BRST-invariance of the SMEFT action for linear R ξ -gauges. The final theory is expressed in a basis characterized by SM-like propagators for all physical and unphysical fields. The effect of the non-renormalizable operators appears explicitly in triple or higher multiplicity vertices. In this mass basis we derive the complete set of Feynman rules, without resorting to any simplifying assumptions such as baryon-, lepton-number or CP conservation. As it turns out, for most SMEFT vertices the expressions are reasonably short, with a noticeable exception of those involving 4, 5 and 6 gluons. We have also supplemented our set of Feynman rules, given in an appendix here, with a publicly available Mathematica code working with the FeynRules package and producing output which can be integrated with other symbolic algebra or numerical codes for automatic SMEFT amplitude calculations.
COLAcode: COmoving Lagrangian Acceleration code
NASA Astrophysics Data System (ADS)
Tassev, Svetlin V.
2016-02-01
COLAcode is a serial particle mesh-based N-body code illustrating the COLA (COmoving Lagrangian Acceleration) method; it solves for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). It differs from standard N-body code by trading accuracy at small-scales to gain computational speed without sacrificing accuracy at large scales. This is useful for generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing; such catalogs are needed to perform detailed error analysis for ongoing and future surveys of LSS.
Zhang, Zhi-Hai; Gao, Ling-Xiao; Guo, Yuan-Jun; Wang, Wei; Mo, Xiang-Xia
2012-12-01
The selection of the coding template is essential in the application of digital micromirror spectrometers. The theoretically optimal H-matrix coding is not widely used because it is acyclic, complex to encode, and difficult to implement, while the noise improvement of the best practical S-matrix is slightly inferior to that of the H-matrix. We therefore designed a new type of complementary S-matrix. A study of its noise-improvement theory proves that the algorithm combines the advantages of both the H-matrix and the S-matrix. Experiments showed that the SNR can be increased by a factor of 2.05 compared with the standard S-matrix template.
Anderson, Michael L; Chemero, Tony
2013-06-01
Clark appears to be moving toward epistemic internalism, which he once rightly rejected. This results from a double over-interpretation of predictive coding's significance. First, Clark argues that predictive coding offers a Grand Unified Theory (GUT) of brain function. Second, he over-reads its epistemic import, perhaps even conflating causal and epistemic mediators. We argue instead for a plurality of neurofunctional principles.
NASA Technical Reports Server (NTRS)
Rice, R. F.
1974-01-01
End-to-end system considerations involving channel coding and data compression are reported which could drastically improve the efficiency in communicating pictorial information from future planetary spacecraft. In addition to presenting new and potentially significant system considerations, this report attempts to fill a need for a comprehensive tutorial which makes much of this very subject accessible to readers whose disciplines lie outside of communication theory.
ERIC Educational Resources Information Center
Rice, Bart F.; Wilde, Carroll O.
It is noted that with the prominence of computers in today's technological society, digital communication systems have become widely used in a variety of applications. Some of the problems that arise in digital communications systems are described. This unit presents the problem of correcting errors in such systems. Error correcting codes are…
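A standard worked example of such an error-correcting code is the Hamming(7,4) code, sketched below in Python; it is a textbook illustration of single-error correction by syndrome decoding, not material from the unit described above.

```python
import numpy as np

# Parity-check matrix of the Hamming(7,4) code: column i is the 3-bit
# binary representation of i+1, so the syndrome reads out the error position.
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

def correct(r):
    """Correct at most one flipped bit in a received 7-bit word."""
    s = H @ r % 2
    pos = int("".join(str(b) for b in s), 2)   # 0 means no error detected
    if pos:
        r = r.copy()
        r[pos - 1] ^= 1
    return r

c = np.array([1, 1, 1, 0, 0, 0, 0])   # a valid codeword: H @ c % 2 == 0
r = c.copy()
r[4] ^= 1                             # channel flips bit 5
assert (correct(r) == c).all()        # the single error is repaired
```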
Wireless Network Security Using Randomness
2012-06-19
U.S. Patent 8,204,224 B2, date of patent Jun. 19, 2012. Cited publications include: Shannon, C.E., "Communication Theory of Secrecy Systems," Bell System Technical Journal; "Hamming code," Wikipedia, http://en.wikipedia.org/wiki/Hamming_code (printed Sep. 21, 2010).
Governing sexual behaviour through humanitarian codes of conduct.
Matti, Stephanie
2015-10-01
Since 2001, there has been a growing consensus that sexual exploitation and abuse of intended beneficiaries by humanitarian workers is a real and widespread problem that requires governance. Codes of conduct have been promoted as a key mechanism for governing the sexual behaviour of humanitarian workers and, ultimately, preventing sexual exploitation and abuse (PSEA). This article presents a systematic study of PSEA codes of conduct adopted by humanitarian non-governmental organisations (NGOs) and how they govern the sexual behaviour of humanitarian workers. It draws on Foucault's analytics of governance and speech act theory to examine the findings of a survey of references to codes of conduct made on the websites of 100 humanitarian NGOs, and to analyse some features of the organisation-specific PSEA codes identified. © 2015 The Author(s). Disasters © Overseas Development Institute, 2015.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fried, L.E.
1994-06-24
CHEETAH is an effort to bring the TIGER thermochemical code into the 1990s. A wide variety of improvements have been made in Version 1.0, and a host of others will be implemented in the future. In CHEETAH 1.0 I have improved the robustness and ease of use of TIGER. All of TIGER's solvers have been replaced by new algorithms. I find that CHEETAH solves a wider variety of problems with no user intervention (e.g., no guesses for the C-J state) than TIGER did. CHEETAH has been made simpler to use than TIGER; typical use of the code occurs with the new standard run command. I hope that CHEETAH makes the use of thermochemical codes more attractive to practical explosive formulators. In the future I plan to improve the underlying science in CHEETAH. More accurate equations of state will be used in the gas and the condensed phase. A kinetics capability will be added to the code to predict reaction-zone thickness. CHEETAH is currently a numerical implementation of C-J theory; it will become an implementation of ZND theory. Further ease-of-use features will eventually be added; an automatic formulator that adjusts concentrations to match desired properties is planned.
Verification of the ideal magnetohydrodynamic response at rational surfaces in the VMEC code
Lazerson, Samuel A.; Loizu, Joaquim; Hirshman, Steven; ...
2016-01-13
The VMEC nonlinear ideal MHD equilibrium code [S. P. Hirshman and J. C. Whitson, Phys. Fluids 26, 3553 (1983)] is compared against analytic linear ideal MHD theory in a screw-pinch-like configuration. The focus of the analysis is to verify the ideal MHD response at magnetic surfaces whose rotational transform (ι) is resonant with spectral values of the perturbed boundary harmonics. A large aspect ratio, circular cross section, zero-beta equilibrium is considered. This equilibrium possesses a rational surface with safety factor q = 2 at a normalized flux value of 0.5. A small resonant boundary perturbation is introduced, exciting a response at the resonant rational surface. The code is found to capture the plasma response as predicted by a newly developed analytic theory that ensures the existence of nested flux surfaces by allowing for a jump in rotational transform (ι = 1/q). The VMEC code satisfactorily reproduces these theoretical results without the necessity of an explicit transform discontinuity (Δι) at the rational surface. It is found that the response across the rational surfaces depends upon both radial grid resolution and local shear (dι/dΦ, where ι is the rotational transform and Φ the enclosed toroidal flux). Calculations of an implicit Δι suggest that it does not arise from numerical artifacts (attributed to radial finite differences in VMEC) or from the existence conditions for flux surfaces predicted by linear theory (minimum values of Δι). Scans of the rotational transform profile indicate that for experimentally relevant levels of transform shear the response becomes increasingly localised. Furthermore, careful examination of a large experimental tokamak equilibrium with applied resonant fields indicates that this shielding response is present, suggesting the phenomenon is not limited to this verification exercise.
A Theory of False Cognitive Expectancies in Airline Pilots
NASA Astrophysics Data System (ADS)
Cortes, Antonio I.
The Theory of False Cognitive Expectancies was developed by studying high reliability flight operations. Airline pilots depend extensively on cognitive expectancies to perceive, understand, and predict actions and events. Out of 1,363 incident reports submitted by airline pilots to the National Aeronautics and Space Administration Aviation Safety Reporting System over a year's time, 110 reports were found to contain evidence of 127 false cognitive expectancies in pilots. A comprehensive taxonomy was developed with six categories of interest. The dataset of 127 false expectancies was used to initially code tentative taxon values for each category. Intermediate coding through constant comparative analysis completed the taxonomy. The taxonomy was used for the advanced coding of chronological context-dependent visualizations of expectancy factors, known as strands, which depict the major factors in the creation and propagation of each expectancy. Strands were mapped into common networks to detect highly represented expectancy processes. Theoretical integration established 11 sources of false expectancies, the most common expectancy errors, and those conspicuous factors worthy of future study. The most prevalent source of false cognitive expectancies within the dataset was determined to be unconscious individual modeling based on past events. Integrative analyses also revealed relationships between expectancies and flight deck automation, unresolved discrepancies, and levels of situation awareness. Particularly noteworthy were the findings that false expectancies can combine in three possible permutations to diminish situation awareness and examples of how false expectancies can be unwittingly transmitted from one person to another. The theory resulting from this research can enhance the error coding process used during aircraft line oriented safety audits, lays the foundation for developing expectancy management training programs, and will allow researchers to proffer hypotheses for human testing using flight simulators.
Effects of visual and verbal interference tasks on olfactory memory: the role of task complexity.
Annett, J M; Leslie, J C
1996-08-01
Recent studies have demonstrated that visual and verbal suppression tasks interfere with olfactory memory in a manner which is partially consistent with a dual coding interpretation. However, it has been suggested that total task complexity rather than modality specificity of the suppression tasks might account for the observed pattern of results. This study addressed the issue of whether or not the level of difficulty and complexity of suppression tasks could explain the apparent modality effects noted in earlier experiments. A total of 608 participants were each allocated to one of 19 experimental conditions involving interference tasks which varied suppression type (visual or verbal), nature of complexity (single, double or mixed) and level of difficulty (easy, optimal or difficult) and presented with 13 target odours. Either recognition of the odours or free recall of the odour names was tested on one occasion, either within 15 minutes of presentation or one week later. Both recognition and recall performance showed an overall effect for suppression nature, suppression level and time of testing with no effect for suppression type. The results lend only limited support to Paivio's (1986) dual coding theory, but have a number of characteristics which suggest that an adequate account of olfactory memory may be broadly similar to current theories of face and object recognition. All of these phenomena might be dealt with by an appropriately modified version of dual coding theory.
Adaptive Core Simulation Employing Discrete Inverse Theory - Part II: Numerical Experiments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abdel-Khalik, Hany S.; Turinsky, Paul J.
2005-07-15
Use of adaptive simulation is intended to improve the fidelity and robustness of important core attribute predictions such as core power distribution, thermal margins, and core reactivity. Adaptive simulation utilizes a selected set of past and current reactor measurements of reactor observables, i.e., in-core instrumentation readings, to adapt the simulation in a meaningful way. The companion paper, "Adaptive Core Simulation Employing Discrete Inverse Theory - Part I: Theory," describes in detail the theoretical background of the proposed adaptive techniques. This paper, Part II, demonstrates several computational experiments conducted to assess the fidelity and robustness of the proposed techniques. The intent is to check the ability of the adapted core simulator model to predict future core observables that are not included in the adaption or core observables that are recorded at core conditions that differ from those at which adaption is completed. Also, this paper demonstrates successful utilization of an efficient sensitivity analysis approach to calculate the sensitivity information required to perform the adaption for millions of input core parameters. Finally, this paper illustrates a useful application for adaptive simulation - reducing the inconsistencies between two different core simulator code systems, where the multitudes of input data to one code are adjusted to enhance the agreement between both codes for important core attributes, i.e., core reactivity and power distribution. Also demonstrated is the robustness of such an application.
Improvement of gross theory of beta-decay for application to nuclear data
NASA Astrophysics Data System (ADS)
Koura, Hiroyuki; Yoshida, Tadashi; Tachibana, Takahiro; Chiba, Satoshi
2017-09-01
A theoretical study of β decay and delayed neutrons has been carried out with a global β-decay model, the gross theory. The gross theory is based on a consideration of the sum rule of the β-strength function and gives reasonable results for β-decay rates and delayed neutrons over the entire nuclear mass region. In a fissioning nucleus, neutrons known as delayed neutrons are produced by β decay of neutron-rich fission fragments of actinides. The average number of delayed neutrons is estimated as the sum of the β-delayed neutron-emission probabilities multiplied by the cumulative fission yield of each nucleus. Such behavior is important for operating nuclear reactors, and if new high-burn-up reactors are adopted, the properties of minor actinides will play an important role in the system; however, these data have not been sufficient. We re-analyze and improve the gross theory. For example, we consider the parity of neutrons and protons at the Fermi surface and treat a suppression of the allowed transitions within the framework of the gross theory. With the improved gross theory, previously underestimated half-lives in the neutron-rich indium isotopes and the neighboring region increase and consequently follow the experimental trend. The ability to reproduce (and predict) β-decay rates and delayed-neutron emission probabilities is discussed. In this work, we describe the development of a computer code implementing the gross theory of β-decay, including the improvements. Once preparation is finished, the code can be released to the nuclear data community.
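The averaging described above (summing β-delayed neutron-emission probabilities weighted by cumulative fission yields) reduces to a one-line computation once the nuclear data are in hand. The sketch below uses placeholder numbers for three well-known delayed-neutron precursors; they are illustrative assumptions, not evaluated data.

```python
# Placeholder cumulative yields and emission probabilities (Pn) for three
# delayed-neutron precursors; illustrative numbers, not evaluated data.
precursors = {
    "Br-87": {"cum_yield": 0.020, "Pn": 0.026},
    "I-137": {"cum_yield": 0.031, "Pn": 0.071},
    "Rb-93": {"cum_yield": 0.035, "Pn": 0.014},
}

nu_d = sum(p["cum_yield"] * p["Pn"] for p in precursors.values())
print(f"average delayed neutrons per fission ~ {nu_d:.4f}")
```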
Huang, Xuan-Yi; Yen, Wen-Jiuan; Liu, Shwu-Jiuan; Lin, Chouh-Jiuan
2008-03-01
The aim was to develop a practice theory that can be used to guide the direction of community nursing practice to help clients with schizophrenia and those who care for them. Substantive grounded theory was developed through use of grounded theory method of Strauss and Corbin. Two groups of participants in Taiwan were selected using theoretical sampling: one group consisted of community mental health nurses and the other group was clients with schizophrenia and those who cared for them. The number of participants in each group was determined by theoretical saturation. Semi-structured one-to-one in-depth interviews and unstructured non-participant observation were utilized for data collection. Data analysis involved three stages: open, axial and selective coding. During the process of coding and analysis, both inductive and deductive thinking were utilized and the constant comparative analysis process continued until data saturation occurred. To establish trustworthiness, the four criteria of credibility, transferability, dependability and confirmability were followed along with field trial, audit trial, member check and peer debriefing for reliability and validity. A substantive grounded theory, the role of community mental health nurses caring for people with schizophrenia in Taiwan, was developed through utilization of grounded theory method of Strauss and Corbin. In this paper, results and discussion focus on causal conditions, context, intervening conditions, consequences and phenomenon. The theory is the first to contribute knowledge about the field of mental health home visiting services in Taiwan to provide guidance for the delivery of quality care to assist people in the community with schizophrenia and their carers.
On entanglement-assisted quantum codes achieving the entanglement-assisted Griesmer bound
NASA Astrophysics Data System (ADS)
Li, Ruihu; Li, Xueliang; Guo, Luobin
2015-12-01
The theory of entanglement-assisted quantum error-correcting codes (EAQECCs) is a generalization of the standard stabilizer formalism. Any quaternary (or binary) linear code can be used to construct EAQECCs under the entanglement-assisted (EA) formalism. We derive an EA-Griesmer bound for linear EAQECCs, which is a quantum analog of the Griesmer bound for classical codes. This EA-Griesmer bound is tighter than known bounds for EAQECCs in the literature. For a given quaternary linear code C, we show that the parameters of the EAQECC that is EA-stabilized by the dual of C can be determined by a zero-radical quaternary code induced from C, and a necessary condition under which a linear EAQECC may achieve the EA-Griesmer bound is also presented. We construct four families of optimal EAQECCs and then show the necessary condition for existence of EAQECCs is also sufficient for some low-dimensional linear EAQECCs. The four families of optimal EAQECCs are degenerate codes and go beyond earlier constructions. What is more, except four codes, our [[n,k,d_{ea};c
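For reference, the classical Griesmer bound that the EA-Griesmer bound generalizes states that any [n, k, d]_q linear code must satisfy:

```latex
% Classical Griesmer bound for an [n, k, d]_q linear code; the paper's
% EA-Griesmer bound is its entanglement-assisted quantum analog.
n \;\ge\; \sum_{i=0}^{k-1} \left\lceil \frac{d}{q^{i}} \right\rceil
```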
Webber, C J
2001-05-01
This article shows analytically that single-cell learning rules that give rise to oriented and localized receptive fields, when their synaptic weights are randomly and independently initialized according to a plausible assumption of zero prior information, will generate visual codes that are invariant under two-dimensional translations, rotations, and scale magnifications, provided that the statistics of their training images are sufficiently invariant under these transformations. Such codes span different image locations, orientations, and size scales with equal economy. Thus, single-cell rules could account for the spatial scaling property of the cortical simple-cell code. This prediction is tested computationally by training with natural scenes; it is demonstrated that a single-cell learning rule can give rise to simple-cell receptive fields spanning the full range of orientations, image locations, and spatial frequencies (except at the extreme high and low frequencies at which the scale invariance of the statistics of digitally sampled images must ultimately break down, because of the image boundary and the finite pixel resolution). Thus, no constraint on completeness, or any other coupling between cells, is necessary to induce the visual code to span wide ranges of locations, orientations, and size scales. This prediction is made using the theory of spontaneous symmetry breaking, which we have previously shown can also explain the data-driven self-organization of a wide variety of transformation invariances in neurons' responses, such as the translation invariance of complex cell response.
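A minimal sketch of the kind of single-cell learning rule the abstract analyzes is Oja's rule, shown below on synthetic data with one dominant input direction; the paper's specific rule and natural-scene training set may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
v = np.ones(16) / 4.0                          # dominant input direction, |v| = 1
X = (rng.standard_normal((5000, 1)) * 3.0) @ v[None, :] \
    + 0.3 * rng.standard_normal((5000, 16))    # synthetic stand-in for image patches

w = 0.1 * rng.standard_normal(16)              # random init: zero prior information
eta = 0.002
for x in X:
    y = w @ x                                  # cell's response
    w += eta * y * (x - y * w)                 # Oja's rule: Hebbian term with decay

print(abs(w @ v) / np.linalg.norm(w))          # ~1: w aligns with v up to sign
```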
Circular codes revisited: a statistical approach.
Gonzalez, D L; Giannerini, S; Rosa, R
2011-04-21
In 1996 Arquès and Michel [1996. A complementary circular code in the protein coding genes. J. Theor. Biol. 182, 45-58] discovered the existence of a common circular code in eukaryote and prokaryote genomes. Since then, circular code theory has attracted great interest and has undergone rapid development. In this paper we discuss theoretical issues related to the synchronization properties of coding sequences and circular codes, with particular emphasis on the problem of retrieving and maintaining the reading frame. Motivated by this theoretical discussion, we adopt a rigorous statistical approach to address several questions. First, we investigate the covering capability of the whole class of 216 self-complementary, C(3) maximal codes with respect to a large set of coding sequences. The results indicate that, on average, the code proposed by Arquès and Michel has the best covering capability, but that there is still great variability among sequences. Second, we focus on this code and explore the role played by the proportions of the bases by means of a hierarchy of permutation tests. The results show the existence of a kind of optimization mechanism whereby coding sequences are tailored so as to maximize or minimize the coverage of circular codes on specific reading frames. This optimization clearly relates the function of circular codes to reading frame synchronization. Copyright © 2011 Elsevier Ltd. All rights reserved.
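The covering capability studied above can be illustrated in a few lines of Python: for each reading frame, count the fraction of codons that belong to a given trinucleotide code. The codon set below is a hypothetical stand-in, not the Arquès-Michel code, and the paper's statistics are considerably more elaborate:

    def frame_coverage(seq, code):
        # Fraction of codons in each of the three reading frames of seq
        # that belong to the trinucleotide set `code`.
        cov = []
        for frame in range(3):
            codons = [seq[i:i + 3] for i in range(frame, len(seq) - 2, 3)]
            cov.append(sum(c in code for c in codons) / len(codons))
        return cov

    toy_code = {"GAA", "GAC", "GAG"}  # placeholder codon set
    print(frame_coverage("GAAGACGAGGAAGAC", toy_code))  # [1.0, 0.0, 0.0]

A frame whose coverage stands out sharply from the other two is exactly what makes reading-frame retrieval possible.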
NASA Technical Reports Server (NTRS)
Costello, Daniel J., Jr.; Courturier, Servanne; Levy, Yannick; Mills, Diane G.; Perez, Lance C.; Wang, Fu-Quan
1993-01-01
In his seminal 1948 paper 'The Mathematical Theory of Communication,' Claude E. Shannon derived the 'channel coding theorem,' which gives an explicit upper bound, called the channel capacity, on the rate at which information can be transmitted reliably over a given communication channel. Shannon's result was an existence theorem and did not give specific codes to achieve the bound. Some skeptics have claimed that the dramatic performance improvements predicted by Shannon are not achievable in practice. The advances made in the area of coded modulation in the past decade have made communications engineers optimistic about the possibility of achieving, or at least coming close to, channel capacity. Here we consider that possibility in the light of current research results.
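As a concrete instance of the bound in question: for a binary symmetric channel with crossover probability p, the capacity is C = 1 - H(p), where H is the binary entropy function. A short sketch:

    import math

    def bsc_capacity(p):
        # Capacity in bits per channel use of a binary symmetric channel.
        if p in (0.0, 1.0):
            return 1.0
        h = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
        return 1.0 - h

    print(bsc_capacity(0.11))  # ~0.5 bits per use

Reliable transmission at rates up to this value is what the coded-modulation work discussed above tries to approach.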
Okamoto, Kazuhisa; Nonaka, Chiho
2017-06-09
Here, we construct a new relativistic viscous hydrodynamics code optimized in Milne coordinates. We split the conservation equations into an ideal part and a viscous part using the Strang splitting method. In the code, a Riemann solver based on the two-shock approximation is utilized for the ideal part, and the Piecewise Exact Solution (PES) method is applied for the viscous part. Furthermore, we check the validity of our numerical calculations by comparison with analytical solutions: viscous Bjorken flow and the Israel-Stewart theory in the Gubser flow regime. Using the code, we discuss the possible development of the Kelvin-Helmholtz instability in high-energy heavy-ion collisions.
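Strang splitting, referenced above, advances du/dt = A(u) + B(u) by a half step of A, a full step of B, and another half step of A, retaining second-order accuracy. A generic Python sketch of that scheme (the sub-steps below are toy stand-ins for the Riemann-solver and PES updates, not the authors' solver):

    import math

    def strang_step(u, dt, step_A, step_B):
        # One Strang-split step for du/dt = A(u) + B(u).
        u = step_A(u, 0.5 * dt)
        u = step_B(u, dt)
        return step_A(u, 0.5 * dt)

    # Toy check with commuting linear operators A: du/dt = -u, B: du/dt = -2u.
    step_A = lambda u, dt: u * math.exp(-dt)
    step_B = lambda u, dt: u * math.exp(-2.0 * dt)
    print(strang_step(1.0, 0.1, step_A, step_B), math.exp(-0.3))  # both ~0.7408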
Experimental Validation of an Ion Beam Optics Code with a Visualized Ion Thruster
NASA Astrophysics Data System (ADS)
Nakayama, Yoshinori; Nakano, Masakatsu
To validate an ion beam optics code, the behavior of ion beam optics was experimentally observed and evaluated with a two-dimensional visualized ion thruster (VIT). Since the observed beam focus positions, sheath positions and measured ion beam currents were in good agreement with the numerical results, it was confirmed that the numerical model of this code was appropriate. In addition, it was also confirmed that the beam focus position moved along the center axis of the grid hole with the applied grid potentials, which differs from the conventional understanding. VIT operations may be useful not only for the validation of ion beam optics codes but also for a fundamental and intuitive understanding of Child-law sheath theory.
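The Child law mentioned above gives the space-charge-limited current density across a planar gap, J = (4*eps0/9) * sqrt(2q/m) * V^(3/2) / d^2. A small sketch with illustrative (not experimental) numbers:

    import math

    EPS0 = 8.854e-12   # vacuum permittivity, F/m
    Q_E = 1.602e-19    # elementary charge, C

    def child_langmuir_j(V, d, m_ion):
        # Space-charge-limited current density (A/m^2) for a gap of
        # d meters at V volts, for ions of mass m_ion (kg).
        return (4.0 * EPS0 / 9.0) * math.sqrt(2.0 * Q_E / m_ion) * V**1.5 / d**2

    # Xenon ion (~2.18e-25 kg), 1 mm grid gap, 1500 V: a few hundred A/m^2.
    print(child_langmuir_j(1500.0, 1e-3, 2.18e-25))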
In the Rearview Mirror: Social Skill Development in Deaf Youth, 1990-2015.
Cawthon, Stephanie W; Fink, Bentley; Schoffstall, Sarah; Wendel, Erica
2018-01-01
Social skills are a vehicle by which individuals negotiate important relationships. The present article presents historical data on how social skills in deaf students were conceptualized and studied empirically during the period 1990-2015. Using a structured literature review approach, the researchers coded 266 articles for theoretical frameworks used and constructs studied. The vast majority of articles did not explicitly align with a specific theoretical framework. Of the 37 that did, most focused on socioemotional and cognitive frameworks, while a minority drew from frameworks focusing on attitudes, developmental theories, or ecological systems theory. In addition, 315 social-skill constructs were coded across the data set; the majority focused on socioemotional functioning. Trends in findings across the past quarter century and implications for research and practice are examined.
Locating relationship and communication issues among stressors associated with breast cancer.
Weber, Kirsten M; Solomon, Denise Haunani
2008-11-01
This article clarifies how the social contexts in which breast cancer survivors live can contribute to the stress they experience because of the disease. Guided by Solomon and Knobloch's (2004) relational turbulence model and Petronio's (2002) communication privacy management theory, this study explores personal relationship and communication boundary issues within stressors that are associated with the diagnosis, treatment, and early survivorship of breast cancer. A qualitative analysis of discourse posted on breast cancer discussion boards and weblogs using the constant comparative method and open-coding techniques revealed 12 sources of stress. Using axial coding methods and probing these topics for underlying relationship and communication issues yielded 5 themes. The discussion highlights the implications of the findings for the theories that guided this investigation and for breast cancer survivorship more generally.
Automated generation of lattice QCD Feynman rules
NASA Astrophysics Data System (ADS)
Hart, A.; von Hippel, G. M.; Horgan, R. R.; Müller, E. H.
2009-12-01
The derivation of the Feynman rules for lattice perturbation theory from actions and operators is complicated, especially for highly improved actions such as HISQ. This task is, however, both important and particularly suitable for automation. We describe a suite of software to generate and evaluate Feynman rules for a wide range of lattice field theories with gluons and (relativistic and/or heavy) quarks. Our programs are capable of dealing with actions as complicated as (m)NRQCD and HISQ. Automated differentiation methods are used to calculate also the derivatives of Feynman diagrams.
Program summary
Program title: HiPPy, HPsrc
Catalogue identifier: AEDX_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDX_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: GPLv2 (see Additional comments below)
No. of lines in distributed program, including test data, etc.: 513 426
No. of bytes in distributed program, including test data, etc.: 4 893 707
Distribution format: tar.gz
Programming language: Python, Fortran95
Computer: HiPPy: single-processor workstations. HPsrc: single-processor workstations and MPI-enabled multi-processor systems
Operating system: HiPPy: any for which Python v2.5.x is available. HPsrc: any for which a standards-compliant Fortran95 compiler is available
Has the code been vectorised or parallelised?: Yes
RAM: Problem specific, typically less than 1 GB for either code
Classification: 4.4, 11.5
Nature of problem: Derivation and use of perturbative Feynman rules for complicated lattice QCD actions.
Solution method: An automated expansion method implemented in Python (HiPPy) and code to use the expansions to generate Feynman rules in Fortran95 (HPsrc).
Restrictions: No general restrictions. Specific restrictions are discussed in the text.
Additional comments: The HiPPy and HPsrc codes are released under the second version of the GNU General Public Licence (GPL v2). Therefore anyone is free to use or modify the code for their own calculations. As part of the licensing, we ask that any publications including results from the use of this code, or of modifications of it, cite Refs. [1,2] as well as this paper. Finally, we also ask that details of these publications, as well as of any bugs or required or useful improvements of this core code, be communicated to us.
Running time: Very problem specific, depending on the complexity of the Feynman rules and the number of integration points. Typically between a few minutes and several weeks. The installation tests provided with the program code take only a few seconds to run.
References:
[1] A. Hart, G.M. von Hippel, R.R. Horgan, L.C. Storoni, Automatically generating Feynman rules for improved lattice field theories, J. Comput. Phys. 209 (2005) 340-353, doi:10.1016/j.jcp.2005.03.010, arXiv:hep-lat/0411026.
[2] M. Lüscher, P. Weisz, Efficient numerical techniques for perturbative lattice gauge theory computations, Nucl. Phys. B 266 (1986) 309, doi:10.1016/0550-3213(86)90094-5.
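The automated differentiation mentioned in the abstract can be illustrated with a minimal forward-mode (dual-number) sketch; this shows only the underlying idea, not the HiPPy/HPsrc implementation:

    class Dual:
        # Forward-mode automatic differentiation: each value carries
        # its derivative through arithmetic operations.
        def __init__(self, val, der=0.0):
            self.val, self.der = val, der

        def __add__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val + o.val, self.der + o.der)
        __radd__ = __add__

        def __mul__(self, o):
            o = o if isinstance(o, Dual) else Dual(o)
            return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
        __rmul__ = __mul__

    x = Dual(3.0, 1.0)      # seed dx/dx = 1
    f = x * x + 2.0 * x     # f(x) = x^2 + 2x
    print(f.val, f.der)     # 15.0 8.0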
Argument structure and the representation of abstract semantics.
Rodríguez-Ferreiro, Javier; Andreu, Llorenç; Sanz-Torrent, Mònica
2014-01-01
According to the dual coding theory, differences in the ease of retrieval between concrete and abstract words are related to the exclusive dependence of abstract semantics on linguistic information. Argument structure can be considered a measure of the complexity of the linguistic contexts that accompany a verb. If the retrieval of abstract verbs relies more heavily on the linguistic codes they are associated with, we would expect a larger effect of argument structure for the processing of abstract verbs. In this study, sets of length- and frequency-matched verbs, including 40 intransitive verbs, 40 transitive verbs taking simple complements, and 40 transitive verbs taking sentential complements, were presented in separate lexical and grammatical decision tasks. Half of the verbs were concrete and half were abstract. Similar results were obtained in the two tasks, with significant effects of imageability and transitivity. However, the interaction between these two variables was not significant. These results conflict with hypotheses assuming a stronger reliance of abstract semantics on linguistic codes. In contrast, our data are in line with theories that link ease of retrieval with the availability and robustness of semantic information.
Fidelity of the Integrated Force Method Solution
NASA Technical Reports Server (NTRS)
Hopkins, Dale; Halford, Gary; Coroneos, Rula; Patnaik, Surya
2002-01-01
The theory of strain compatibility in solid mechanics had remained incomplete since St. Venant's 'strain formulation' in 1876. We have addressed the compatibility condition both in the continuum and in discrete systems. This has led to the formulation of the Integrated Force Method (IFM). A dual Integrated Force Method, with displacement as the primal variable, has also been formulated. A modest finite element code (IFM/Analyzers) based on the IFM theory has been developed. For a set of standard test problems, the IFM results were compared with stiffness-method solutions and the MSC/Nastran code; for these problems IFM outperformed the existing methods. The superior IFM performance is attributed to the simultaneous enforcement of the equilibrium equations and the compatibility conditions. The MSC/Nastran organization expressed reluctance to accept the high-fidelity IFM solutions. This report discusses the solutions to the examples; no inaccuracy was detected in the IFM solutions. A stiffness-method code can, with a small programming effort, be improved to reap the many IFM benefits when implemented with the IFMD elements. Dr. Halford conducted a peer review of the Integrated Force Method; the reviewers' responses are included.
GW/Bethe-Salpeter calculations for charged and model systems from real-space DFT
NASA Astrophysics Data System (ADS)
Strubbe, David A.
GW and Bethe-Salpeter (GW/BSE) calculations use mean-field input from density-functional theory (DFT) calculations to compute excited states of a condensed-matter system. Many parts of a GW/BSE calculation are efficiently performed in a plane-wave basis, and extensive effort has gone into optimizing and parallelizing plane-wave GW/BSE codes for large-scale computations. Most straightforwardly, plane-wave DFT can be used as a starting point, but real-space DFT is also an attractive starting point: it is systematically convergeable like plane waves, can take advantage of efficient domain parallelization for large systems, and is well suited physically for finite and especially charged systems. The flexibility of a real-space grid also allows convenient calculations on non-atomic model systems. I will discuss the interfacing of a real-space (TD)DFT code (Octopus, www.tddft.org/programs/octopus) with a plane-wave GW/BSE code (BerkeleyGW, www.berkeleygw.org), consider performance issues and accuracy, and present some applications to simple and paradigmatic systems that illuminate fundamental properties of these approximations in many-body perturbation theory.
Novice to expert practice via postprofessional athletic training education: a grounded theory.
Neibert, Peter J
2009-01-01
To discover the theoretic constructs that confirm, disconfirm, or extend the principles and their applications appropriate for National Athletic Trainers' Association (NATA)-accredited postprofessional athletic training education programs. Interviews at the 2003 NATA Annual Meeting & Clinical Symposia. Qualitative study using grounded theory procedures. Thirteen interviews were conducted with postprofessional graduates. Participants were purposefully selected based on theoretic sampling and availability. The transcribed interviews were analyzed using open coding, axial coding, and selective coding procedures. Member checks, reflective journaling, and triangulation were used to ensure trustworthiness. The participants' comments confirmed and extended the current principles of postprofessional athletic training education programs and offered additional suggestions for more effective practical applications. The emergence of this central category of novice to expert practice is a paramount finding. The tightly woven fabric of the 10 processes, when interlaced with one another, provides a strong tapestry supporting novice to expert practice via postprofessional athletic training education. The emergence of this theoretic position pushes postprofessional graduate athletic training education forward to the future for further investigation into the theoretic constructs of novice to expert practice.
Controlled encoding strategies in memory tests in lithium patients.
Opgenoorth, E; Karlick-Bolten, E
1986-03-01
The "levels of processing" theory (Craik and Lockhart) and "dual coding" theory (Paivio) provide new aspects for clinical memory research work. Therefore, an incidental learning paradigm on the basis of these two theoretical approaches was chosen to test aspects of memory performances with lithium therapy. Results of two experiments, with controlled non-semantic processing (rating experiment "comparison of size") and additive semantic processing (rating "living--non-living") indicate a slight reduction in recall (Fig. 1) and recognition performance (Fig. 2) in lithium patients. Effects on encoding strategies are of equal quality in patients and healthy subjects (Tab. 1, 2) but performance differs between both groups: poorer systematic benefit from within code repetitions ("word-word" items, "picture-picture" items) and dual coding (repeated variable item presentation "picture-word") is obtained. The less efficient encoding strategies in the speeded task are discussed with respect to cognitive rigidity and slowing of performance by emotional states. This investigation of so-called "memory deficits" with lithium is an attempt to explore impairments at an early stage of processing; the characterization of the perceptual cognitive analysis seems useful for further clinical research work on this topic.
Uncertainties in (E)UV model atmosphere fluxes
NASA Astrophysics Data System (ADS)
Rauch, T.
2008-04-01
Context: During a comparison of synthetic spectra calculated with two NLTE model atmosphere codes, TMAP and TLUSTY, we encountered systematic differences in the EUV fluxes due to the treatment of level dissolution by pressure ionization. Aims: In the case of Sirius B, we demonstrate an uncertainty in modeling the EUV flux reliably, in order to challenge theoreticians to improve the theory of level dissolution. Methods: We calculated synthetic spectra for hot, compact stars using state-of-the-art NLTE model-atmosphere techniques. Results: Systematic differences may occur due to a code-specific cutoff frequency of the H I Lyman bound-free opacity. This is the case for TMAP and TLUSTY. Both codes predict the same flux level at wavelengths below about 1500 Å for stars with effective temperatures (T_eff) below about 30 000 K only if the same cutoff frequency is chosen. Conclusions: The theory of level dissolution in high-density plasmas, which is available for hydrogen only, should be generalized to all species. In particular, the cutoff frequencies for the bound-free opacities should be defined in order to make predictions of UV fluxes more reliable.
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; Darden, Christine M.
1988-01-01
Extensive correlations of computer code results with experimental data are employed to illustrate the use of linearized theory attached flow methods for the estimation and optimization of the aerodynamic performance of simple hinged flap systems. Use of attached flow methods is based on the premise that high levels of aerodynamic efficiency require a flow that is as nearly attached as circumstances permit. A variety of swept wing configurations are considered ranging from fighters to supersonic transports, all with leading- and trailing-edge flaps for enhancement of subsonic aerodynamic efficiency. The results indicate that linearized theory attached flow computer code methods provide a rational basis for the estimation and optimization of flap system aerodynamic performance at subsonic speeds. The analysis also indicates that vortex flap design is not an opposing approach but is closely related to attached flow design concepts. The successful vortex flap design actually suppresses the formation of detached vortices to produce a small vortex which is restricted almost entirely to the leading edge flap itself.
NASA Technical Reports Server (NTRS)
Carlson, Harry W.; Darden, Christine M.; Mann, Michael J.
1990-01-01
Extensive correlations of computer code results with experimental data are employed to illustrate the use of a linearized theory, attached flow method for the estimation and optimization of the longitudinal aerodynamic performance of wing-canard and wing-horizontal tail configurations which may employ simple hinged flap systems. Use of an attached flow method is based on the premise that high levels of aerodynamic efficiency require a flow that is as nearly attached as circumstances permit. The results indicate that linearized theory, attached flow, computer code methods (modified to include estimated attainable leading-edge thrust and an approximate representation of vortex forces) provide a rational basis for the estimation and optimization of aerodynamic performance at subsonic speeds below the drag rise Mach number. Generally, good prediction of aerodynamic performance, as measured by the suction parameter, can be expected for near optimum combinations of canard or horizontal tail incidence and leading- and trailing-edge flap deflections at a given lift coefficient (conditions which tend to produce a predominantly attached flow).
ABINIT: Plane-Wave-Based Density-Functional Theory on High Performance Computers
NASA Astrophysics Data System (ADS)
Torrent, Marc
2014-03-01
For several years, a continuous effort has been made to adapt electronic structure codes based on density functional theory to future computing architectures. Among these codes, ABINIT is based on a plane-wave description of the wave functions, which allows systems of any kind to be treated. Porting such a code to petascale architectures poses difficulties related to the many-body nature of the DFT equations. To improve the performance of ABINIT - especially for standard LDA/GGA ground-state and response-function calculations - several strategies have been followed. A full multi-level MPI parallelisation scheme has been implemented, exploiting all possible levels and distributing both computation and memory; it increases the number of distributed processes and could not have been achieved without a strong restructuring of the code. The core algorithm used to solve the eigenproblem ("Locally Optimal Blocked Conjugate Gradient"), a Blocked-Davidson-like algorithm, is based on a distribution of processes combining plane waves and bands. In addition to the distributed-memory parallelization, a full hybrid scheme has been implemented, using standard shared-memory directives (openMP/openACC) or porting some time-consuming code sections to graphics processing units (GPU). As no simple performance model exists, the complexity of use has increased: the code efficiency strongly depends on the distribution of processes among the numerous levels. ABINIT is able to predict the performance of several process distributions and automatically choose the most favourable one. On the other hand, a substantial effort has been made to analyse the performance of the code on petascale architectures, showing which sections of code have to be improved; they are all related to matrix algebra (diagonalisation, orthogonalisation). The different strategies employed to improve the code's scalability will be described. They are based on the exploration of new diagonalization algorithms, as well as the use of external optimized libraries. Part of this work has been supported by the European PRACE project (Partnership for Advanced Computing in Europe) in the framework of its work package 8.
Sawatsky, Adam P; Ratelle, John T; Bonnes, Sara L; Egginton, Jason S; Beckman, Thomas J
2017-02-02
Existing theories of self-directed learning (SDL) have emphasized the importance of process, personal, and contextual factors. Previous medical education research has largely focused on the process of SDL. We explored the experience with and perception of SDL among internal medicine residents to gain understanding of the personal and contextual factors of SDL in graduate medical education. Using a constructivist grounded theory approach, we conducted 7 focus group interviews with 46 internal medicine residents at an academic medical center. We processed the data by using open coding and writing analytic memos. Team members organized open codes to create axial codes, which were applied to all transcripts. Guided by a previous model of SDL, we developed a theoretical model that was revised through constant comparison with new data as they were collected, and we refined the theory until it had adequate explanatory power and was appropriately grounded in the experiences of residents. We developed a theoretical model of SDL to explain the process, personal, and contextual factors affecting SDL during residency training. The process of SDL began with a trigger that uncovered a knowledge gap. Residents progressed to formulating learning objectives, using resources, applying knowledge, and evaluating learning. Personal factors included motivations, individual characteristics, and the change in approach to SDL over time. Contextual factors included the need for external guidance, the influence of residency program structure and culture, and the presence of contextual barriers. We developed a theoretical model of SDL in medical education that can be used to promote and assess resident SDL through understanding the process, person, and context of SDL.
VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shapiro, A.; Huria, H.C.; Cho, K.W.
1991-12-01
VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules, are discussed. The PC code structure is also given. Version 2 included several enhancements not present in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation distributed with the code, to assist in developing the input for the Input Processor.
Application of grammar-based codes for lossless compression of digital mammograms
NASA Astrophysics Data System (ADS)
Li, Xiaoli; Krishnan, Srithar; Ma, Ngok-Wah
2006-01-01
A grammar-based lossless source coding theory and its implementation were proposed in 1999 and 2000, respectively, by Yang and Kieffer. The code first transforms the original data sequence into an irreducible context-free grammar, which is then compressed using arithmetic coding. In studying grammar-based coding for mammography applications, we encountered two issues: processing time and the limited number of single-character grammar variables G. For the first issue, we identified a feature that simplifies the search for matching subsequences in the irreducible grammar transform; using this observation, an extended grammar code technique is proposed that significantly reduces the processing time of the grammar code. For the second issue, we propose using double-character symbols to increase the number of grammar variables. Under the condition that all the G variables have the same probability of being used, our analysis shows that the double- and single-character approaches have the same compression rates. Using the proposed methods, we show that the grammar code can outperform three other schemes - Lempel-Ziv-Welch (LZW), arithmetic, and Huffman coding - in compression ratio, and has error-tolerance capabilities similar to those of LZW coding under similar circumstances.
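For comparison purposes, the LZW baseline named above can be written in a few lines; this is the textbook algorithm, not the grammar code itself:

    def lzw_compress(data):
        # Textbook LZW: grow a dictionary of byte strings and emit
        # the index of the longest known prefix at each step.
        table = {bytes([i]): i for i in range(256)}
        w, out = b"", []
        for byte in data:
            wc = w + bytes([byte])
            if wc in table:
                w = wc
            else:
                out.append(table[w])
                table[wc] = len(table)
                w = bytes([byte])
        if w:
            out.append(table[w])
        return out

    print(lzw_compress(b"ABABABA"))  # [65, 66, 256, 258]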
On the Effects of Social Class on Language Use: A Fresh Look at Bernstein's Theory
ERIC Educational Resources Information Center
Aliakbari, Mohammad; Allahmoradi, Nazal
2014-01-01
Basil Bernstein (1971) introduced the notion of the restricted and the elaborated code, claiming that working-class speakers have access only to the former, while middle-class members have access to both. In an attempt to test this theory in the Iranian context and to investigate the effect of social class on the quality of students' language use, we examined the…
NASA Technical Reports Server (NTRS)
Hanson, D. B.; Mccolgan, C. J.; Ladden, R. M.; Klatte, R. J.
1991-01-01
Results are presented from the program for the generation of a computer prediction code for the noise of advanced single-rotation turboprops (prop-fans) such as the SR3 model. The code is based on a linearized theory developed at Hamilton Standard in which aerodynamics and acoustics are treated as a unified process. Both steady and unsteady blade loading are treated. Capabilities include prediction of steady airload distributions and the associated aerodynamic performance, unsteady blade pressure response to gust interaction or blade vibration, noise fields associated with thickness and with steady and unsteady loading, and wake velocity fields associated with steady loading. The code was developed on the Hamilton Standard IBM computer and has now been installed on the Cray XMP at NASA-Lewis. The work had its genesis in the frequency-domain acoustic theory developed at Hamilton Standard in the late 1970s. It was found that the method used for near-field noise predictions could be adapted as a lifting-surface theory for aerodynamic work via the pressure potential technique that had been used for both wings and ducted turbomachinery. In the first realization of the theory for propellers, the blade loading was represented in a quasi-vortex-lattice form; this was upgraded to true lifting-surface loading. Originally, it was believed that a purely linear approach for both aerodynamics and noise would be adequate. However, two sources of nonlinearity in the steady aerodynamics became apparent and were found to be significant at takeoff conditions. The first is related to the fact that the steady axial induced velocity may be of the same order of magnitude as the flight speed; the second is the formation of leading-edge vortices, which increase lift and redistribute loading. The discovery and properties of prop-fan leading-edge vortices were reported in two papers. The Unified AeroAcoustic Program (UAAP) capabilities are demonstrated and the theory verified by comparison of the predictions with data from tests at NASA-Lewis. Steady aerodynamic performance, unsteady blade loading, wakes, noise, and wing and boundary layer shielding are examined.
Modeling Close-In Airblast from ANFO Cylindrical and Box-Shaped Charges
2010-10-01
Eulerian hydrodynamics code [1]. The Jones-Wilkins-Lee (JWL) equation of state (EOS) [2] of the reacted ANFO was computed using the Cheetah ... thermodynamics code [3]. Cheetah first calculates the detonation state from Chapman-Jouguet (C-J) theory and then models the adiabatic expansion from ... success modeling a large range of ANFO charge sizes using the Cheetah-generated EOS along with the Ignition and Growth (IG) reactive flow model [6
1982-11-01
Return Difference Feedback Design for Robust Uncertainty Tolerance in Sto...(U). University of Southern California, Los Angeles, Dept. of Electrical Engineering. Subject terms: systems theory; control; feedback; automatic control.
Theory of Coding Informational Simulation.
1981-04-06
reach the value of several thousands; a single-progression representation of this value is little attractive due to its unwieldiness. Here we approached a ... the moment when the contents of the location counter must be changed to the larger or smaller side. The value and direction of the change are assigned by the ... the register of transition is formed by the algebraic addition of the contents of the location counter and the value of a change in the code of the latter (step
Projectile and Lab Frame Differential Cross Sections for Electromagnetic Dissociation
NASA Technical Reports Server (NTRS)
Norbury, John W.; Adamczyk, Anne; Dick, Frank
2008-01-01
Differential cross sections for electromagnetic dissociation in nuclear collisions are calculated for the first time. In order to be useful for three-dimensional transport codes, these cross sections have been calculated in both the projectile and lab frames. The formulas for these cross sections are such that they can be immediately used in space radiation transport codes. Only a limited amount of data exists, but the comparison between theory and experiment is good.
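One kinematic ingredient of such frame conversions is the Lorentz boost of a particle's energy from the projectile frame to the lab frame, E_lab = gamma * (E' + beta * p' * cos(theta')). A hedged sketch in natural units (c = 1; the paper's full transformation of the differential cross section also involves the angle and solid-angle Jacobians, which are not shown):

    import math

    def to_lab_energy(E_prime, p_prime, theta_prime, gamma):
        # Boost a projectile-frame energy (MeV) into the lab frame.
        beta = math.sqrt(1.0 - 1.0 / gamma**2)
        return gamma * (E_prime + beta * p_prime * math.cos(theta_prime))

    print(to_lab_energy(10.0, 8.0, 0.3, gamma=2.0))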
Cognitive Evaluation Theory: An Experimental Test of Processes and Outcomes.
1981-06-26
Office of Naval Research, Organizational Effectiveness Research Program (Code 452). Report date: June 26, 1981. ... framework for explaining the detrimental effects of performance-contingent rewards on intrinsically motivated behaviors. A review of the literature ... Supported in part by the Organizational Effectiveness Research Program, Office of Naval Research (Code 452), under Contract No. N0014-79-C-0750, NR 170-892 with Dr
Probabilistic Simulation for Nanocomposite Characterization
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Coroneos, Rula M.
2007-01-01
A unique probabilistic theory is described for predicting the properties of nanocomposites. The simulation is based on composite micromechanics, with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code, which is used to simulate the uniaxial strength properties of a mononanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. The results show smooth distributions.
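A probabilistic simulation of this kind can be caricatured in a few lines: sample the constituent properties from distributions and propagate them through a micromechanics relation. The toy rule-of-mixtures model and all numbers below are placeholders, not the code's actual micromechanics equations:

    import random, statistics

    def simulate_strengths(n_samples=10000):
        # Monte Carlo propagation of random constituent properties
        # through a toy rule-of-mixtures strength model.
        strengths = []
        for _ in range(n_samples):
            vf = random.gauss(0.60, 0.03)        # fiber volume fraction
            sf = random.gauss(3.5e9, 0.2e9)      # fiber strength, Pa
            sm = random.gauss(80e6, 8e6)         # matrix strength, Pa
            strengths.append(vf * sf + (1.0 - vf) * sm)
        return statistics.mean(strengths), statistics.stdev(strengths)

    print(simulate_strengths())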
2008-12-01
multiconductor transmission line theory. The per-unit capacitance, inductance, and characteristic impedance matrices generated from the companion LAPLACE ... code based on the Method of Moments application, by meshing different sections of the multiconductor cable for capacitance and inductance matrices [21 ... conductors held together in four pairs and residing in the cable jacket. Each of the eight conductors was also designed with the per-unit-length resistance
An efficient HZETRN (a galactic cosmic ray transport code)
NASA Technical Reports Server (NTRS)
Shinn, Judy L.; Wilson, John W.
1992-01-01
An accurate and efficient engineering code is needed for analyzing shielding requirements against high-energy galactic heavy ions. HZETRN is a deterministic code developed at Langley Research Center that is constantly being improved in both its physics and its numerical computation and is targeted for such use. One problem area connected with the space-marching technique used in this code is the propagation of the local truncation error. By improving the numerical algorithms for interpolation, integration, and the grid-distribution formula, the efficiency of the code is increased by a factor of eight as the number of energy grid points is reduced. Numerical accuracy of better than 2 percent for a shield thickness of 150 g/cm^2 is found when a 45-point energy grid is used. The propagating step size, which is related to the perturbation theory, is also reevaluated.
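The grid-distribution idea can be sketched generically: a logarithmically spaced energy grid concentrates points where the spectrum varies fastest, and integrals over the grid are accumulated trapezoidally. The 45-point grid below mirrors the abstract; the power-law spectrum is a placeholder:

    import numpy as np

    grid = np.logspace(np.log10(0.1), np.log10(5.0e4), num=45)  # MeV/nucleon
    flux = grid ** -2.7                                         # toy spectrum
    # Trapezoidal integration over the nonuniform grid.
    fluence = (0.5 * (flux[1:] + flux[:-1]) * np.diff(grid)).sum()
    print(fluence)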
Wing Weight Optimization Under Aeroelastic Loads Subject to Stress Constraints
NASA Technical Reports Server (NTRS)
Kapania, Rakesh K.; Issac, J.; Macmurdy, D.; Guruswamy, Guru P.
1997-01-01
A minimum-weight optimization of a wing under aeroelastic loads subject to stress constraints is carried out. The loads for the optimization are based on aeroelastic trim. The design variables are the thicknesses of the wing skins and planform variables. The composite plate structural model incorporates first-order shear deformation theory; the wing deflections are expressed using Chebyshev polynomials, and a Rayleigh-Ritz procedure is adopted for the structural formulation. The aerodynamic pressures provided by the aerodynamic code at a discrete number of grid points are represented as a bilinear distribution on the composite plate code to solve for the deflections and stresses in the wing. The lifting-surface aerodynamic code FAST is presently being used to generate the pressure distribution over the wing. The envisioned ENSAERO/Plate is an aeroelastic analysis code which combines ENSAERO version 3.0 (for analysis of wing-body configurations) with the composite plate code.
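Expressing the deflection in a Chebyshev basis, as in the Rayleigh-Ritz formulation above, amounts to evaluating a Chebyshev series at the plate coordinates; a sketch with placeholder Ritz coefficients:

    import numpy as np

    coeffs = [0.0, 1.2e-3, -4.0e-4, 5.0e-5]         # hypothetical Ritz coefficients
    x = np.linspace(-1.0, 1.0, 5)                   # normalized spanwise coordinate
    w = np.polynomial.chebyshev.chebval(x, coeffs)  # deflection w(x)
    print(w)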
Structures of Neural Correlation and How They Favor Coding
Franke, Felix; Fiscella, Michele; Sevelev, Maksim; Roska, Botond; Hierlemann, Andreas; da Silveira, Rava Azeredo
2017-01-01
The neural representation of information suffers from “noise”: the trial-to-trial variability in the response of neurons. The impact of correlated noise upon population coding has been debated, but a direct connection between theory and experiment remains tenuous. Here, we substantiate this connection and propose a refined theoretical picture. Using simultaneous recordings from a population of direction-selective retinal ganglion cells, we demonstrate that coding benefits from noise correlations. The effect is appreciable already in small populations, yet it is a collective phenomenon. Furthermore, the stimulus-dependent structure of correlation is key. We develop simple functional models that capture the stimulus-dependent statistics. We then use them to quantify the performance of population coding, which depends upon interplays of feature sensitivities and noise correlations in the population. Because favorable structures of correlation emerge robustly in circuits with noisy, nonlinear elements, they will arise and benefit coding beyond the confines of the retina. PMID:26796692
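The standard yardstick behind such claims is the linear Fisher information, I = f'^T C^{-1} f', where f' holds the tuning-curve slopes and C the noise covariance. A two-neuron sketch (toy numbers, not the paper's data) showing how a positive noise correlation can increase the information carried by oppositely tuned cells:

    import numpy as np

    fprime = np.array([1.0, -1.0])                # opposite tuning slopes
    C_ind = np.eye(2)                             # independent noise
    C_corr = np.array([[1.0, 0.5], [0.5, 1.0]])   # correlated noise

    info = lambda C: fprime @ np.linalg.solve(C, fprime)
    print(info(C_ind), info(C_corr))  # 2.0 vs 4.0: correlation helps here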
NASA Technical Reports Server (NTRS)
Spiers, Gary D.
1994-01-01
Section 1 details the theory used to build the lidar model, provides results of using the model to evaluate AEOLUS instrument designs, and provides snapshots of the visual appearance of the coded model. Appendix A contains a Fortran program to calculate various forms of the refractive index structure function; this program was used to determine the refractive index structure function used in the main lidar simulation code. Appendix B contains a memo on the optimization of the lidar telescope geometry for a line-scan geometry. Appendix C contains the code for the main lidar simulation and brief instructions on running the code. Appendix D contains a Fortran code to calculate the maximum permissible exposure for the eye from the ANSI Z136.1-1992 eye safety standard. Appendix E contains a paper on the eye safety analysis of a space-based coherent lidar, presented at the 7th Coherent Laser Radar Applications and Technology Conference, Paris, France, 19-23 July 1993.
CMCpy: Genetic Code-Message Coevolution Models in Python
Becich, Peter J.; Stark, Brian P.; Bhat, Harish S.; Ardell, David H.
2013-01-01
Code-message coevolution (CMC) models represent coevolution of a genetic code and a population of protein-coding genes (“messages”). Formally, CMC models are sets of quasispecies coupled together for fitness through a shared genetic code. Although CMC models display plausible explanations for the origin of multiple genetic code traits by natural selection, useful modern implementations of CMC models are not currently available. To meet this need we present CMCpy, an object-oriented Python API and command-line executable front-end that can reproduce all published results of CMC models. CMCpy implements multiple solvers for leading eigenpairs of quasispecies models. We also present novel analytical results that extend and generalize applications of perturbation theory to quasispecies models and pioneer the application of a homotopy method for quasispecies with non-unique maximally fit genotypes. Our results therefore facilitate the computational and analytical study of a variety of evolutionary systems. CMCpy is free open-source software available from http://pypi.python.org/pypi/CMCpy/. PMID:23532367
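At the core of such models is the quasispecies matrix W = Q diag(f), whose leading eigenpair gives the mean fitness and the mutation-selection equilibrium. A toy numerical sketch of that eigenpair computation (all numbers are placeholders; CMCpy couples several such quasispecies through a shared genetic code):

    import numpy as np

    f = np.array([1.0, 0.6, 0.4])          # genotype fitnesses
    Q = np.array([[0.90, 0.05, 0.05],      # row-stochastic mutation matrix
                  [0.05, 0.90, 0.05],
                  [0.05, 0.05, 0.90]])
    W = Q @ np.diag(f)

    vals, vecs = np.linalg.eig(W)
    lead = np.argmax(vals.real)
    p = np.abs(vecs[:, lead].real)
    print(vals[lead].real, p / p.sum())    # mean fitness, equilibrium frequencies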
The structure of affective action representations: temporal binding of affective response codes.
Eder, Andreas B; Müsseler, Jochen; Hommel, Bernhard
2012-01-01
Two experiments examined the hypothesis that preparing an action with a specific affective connotation involves the binding of this action to an affective code reflecting this connotation. This integration into an action plan should lead to a temporary occupation of the affective code, which should impair the concurrent representation of affectively congruent events, such as the planning of another action with the same valence. This hypothesis was tested with a dual-task setup that required a speeded choice between approach- and avoidance-type lever movements after having planned and before having executed an evaluative button press. In line with the code-occupation hypothesis, slower lever movements were observed when the lever movement was affectively compatible with the prepared evaluative button press than when the two actions were affectively incompatible. Lever movements related to approach and avoidance and evaluative button presses thus seem to share a code that represents affective meaning. A model of affective action control that is based on the theory of event coding is discussed.
TEA: A Code Calculating Thermochemical Equilibrium Abundances
NASA Astrophysics Data System (ADS)
Blecic, Jasmina; Harrington, Joseph; Bowman, M. Oliver
2016-07-01
We present an open-source Thermochemical Equilibrium Abundances (TEA) code that calculates the abundances of gaseous molecular species. The code is based on the methodology of White et al. and Eriksson. It applies Gibbs free-energy minimization using an iterative, Lagrangian optimization scheme. Given elemental abundances, TEA calculates molecular abundances for a particular temperature and pressure or a list of temperature-pressure pairs. We tested the code against the method of Burrows & Sharp, the free thermochemical equilibrium code Chemical Equilibrium with Applications (CEA), and the example given by Burrows & Sharp. Using their thermodynamic data, TEA reproduces their final abundances, but with higher precision. We also applied the TEA abundance calculations to models of several hot-Jupiter exoplanets, producing expected results. TEA is written in Python in a modular format. There is a start guide, a user manual, and a code document in addition to this theory paper. TEA is available under a reproducible-research, open-source license via https://github.com/dzesmin/TEA.
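The Gibbs minimization that TEA performs can be caricatured with SciPy on a toy one-element system: minimize G(n) = sum_i n_i (mu0_i + ln(n_i/n_tot)) subject to element balance. The standard-state potentials below are placeholders, not TEA's thermodynamic data, and TEA's own Lagrangian iteration is more sophisticated:

    import numpy as np
    from scipy.optimize import minimize

    mu0 = np.array([10.0, -5.0])   # dimensionless potentials for H, H2
    A = np.array([[1, 2]])         # H atoms contributed by each species
    b = np.array([1.0])            # total elemental H abundance

    def gibbs(n):
        n = np.clip(n, 1e-12, None)
        return float(np.sum(n * (mu0 + np.log(n / n.sum()))))

    res = minimize(gibbs, x0=np.array([0.5, 0.25]),
                   bounds=[(1e-12, None)] * 2,
                   constraints={"type": "eq", "fun": lambda n: A @ n - b})
    print(res.x)  # equilibrium mole numbers of H and H2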
NASA Astrophysics Data System (ADS)
Lourderaj, Upakarasamy; Sun, Rui; Kohale, Swapnil C.; Barnes, George L.; de Jong, Wibe A.; Windus, Theresa L.; Hase, William L.
2014-03-01
The interface between VENUS and NWChem, and the resulting software package for direct dynamics simulations, are described. The coupling of the two codes is tight: they are compiled and linked together and act as one executable, with data passed between them through routine calls. The advantages of this type of coupling are discussed. The interface has been designed to interfere as little as possible with the core code of both VENUS and NWChem. VENUS propagates the direct dynamics trajectories and therefore drives the overall execution of VENUS/NWChem. VENUS has remained an essentially sequential code, which uses the highly parallel structure of NWChem. Subroutines of the interface that accomplish the data transmission and communication between the two programs are described. Recent examples of the use of VENUS/NWChem for direct dynamics simulations are summarized.
Four-Dimensional Continuum Gyrokinetic Code: Neoclassical Simulation of Fusion Edge Plasmas
NASA Astrophysics Data System (ADS)
Xu, X. Q.
2005-10-01
We are developing a continuum gyrokinetic code, TEMPEST, to simulate edge plasmas. Our code represents velocity space via a grid in equilibrium energy and magnetic moment variables, and configuration space via poloidal magnetic flux and poloidal angle. The geometry is that of a fully diverted tokamak (single or double null) and so includes boundary conditions for both closed magnetic flux surfaces and open field lines. The 4-dimensional code includes kinetic electrons and ions, and electrostatic field-solver options, and simulates neoclassical transport. The present implementation is a Method of Lines approach where spatial finite-differences (higher order upwinding) and implicit time advancement are used. We present results of initial verification and validation studies: transition from collisional to collisionless limits of parallel end-loss in the scrape-off layer, self-consistent electric field, and the effect of the real X-point geometry and edge plasma conditions on the standard neoclassical theory, including a comparison of our 4D code with other kinetic neoclassical codes and experiments.
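The Method of Lines mentioned above discretizes space first, leaving a large ODE system that an implicit integrator advances in time. A generic one-dimensional sketch (upwind advection with SciPy's implicit BDF integrator, not the TEMPEST equations):

    import numpy as np
    from scipy.integrate import solve_ivp

    N, v = 100, 1.0
    dx = 1.0 / N

    def rhs(t, u):
        # First-order upwind differences with a periodic boundary.
        du = np.empty_like(u)
        du[1:] = -v * (u[1:] - u[:-1]) / dx
        du[0] = -v * (u[0] - u[-1]) / dx
        return du

    u0 = np.exp(-100.0 * (np.linspace(0.0, 1.0, N) - 0.5) ** 2)
    sol = solve_ivp(rhs, (0.0, 0.2), u0, method="BDF")  # implicit time advance
    print(sol.y[:, -1].max())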
NASA Astrophysics Data System (ADS)
Marcolongo, Juan P.; Zeida, Ari; Semelak, Jonathan A.; Foglia, Nicolás O.; Morzan, Uriel N.; Estrin, Dario A.; González Lebrero, Mariano C.; Scherlis, Damián A.
2018-03-01
In this work we present current advances in the development and application of LIO, a lab-made code designed for density functional theory calculations on graphics processing units (GPUs) that can be coupled with different classical molecular dynamics engines. The code has been thoroughly optimized to perform efficient molecular dynamics simulations at the QM/MM DFT level, allowing for exhaustive sampling of the configurational space. Selected examples are presented for the description of chemical reactivity in terms of free energy profiles, and for the computation of optical properties, such as vibrational and electronic spectra, in solvent and protein environments.
NASA Technical Reports Server (NTRS)
Zimmerman, Martin L.
1995-01-01
This manual explains the theory and operation of the finite-difference time-domain code FDTD-ANT developed by Analex Corporation at the NASA Lewis Research Center in Cleveland, Ohio. This code can be used for solving electromagnetic problems that are electrically small or medium (on the order of 1 to 50 cubic wavelengths). Calculated parameters include transmission line impedance, relative effective permittivity, antenna input impedance, and far-field patterns in both the time and frequency domains. The maximum problem size may be adjusted according to the computer used. This code has been run on DEC VAX and 486 PCs and on workstations such as the Sun Sparc and the IBM RS/6000.
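The core of any such FDTD code is the leapfrog Yee update, in which E and H live on staggered grids and update each other in turn. A one-dimensional sketch in normalized units (a toy grid, not FDTD-ANT itself):

    import numpy as np

    N, steps = 200, 400
    ez, hy = np.zeros(N), np.zeros(N - 1)
    for n in range(steps):
        hy += 0.5 * (ez[1:] - ez[:-1])                   # H update (Courant number 0.5)
        ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])             # E update in the interior
        ez[N // 2] += np.exp(-((n - 30.0) / 10.0) ** 2)  # soft Gaussian source
    print(ez.max())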
Low-complexity video encoding method for wireless image transmission in capsule endoscope.
Takizawa, Kenichi; Hamaguchi, Kiyoshi
2010-01-01
This paper presents a low-complexity video encoding method applicable to wireless image transmission in capsule endoscopes. The encoding method is based on Wyner-Ziv theory, in which information that would conventionally be exploited at the transmitter is instead used as side information at the receiver. Complex processes in video encoding, such as motion-vector estimation, are therefore moved to the receiver side, which has a larger-capacity battery. As a result, the encoding process reduces to decimating the coded original data through channel coding. We provide a performance evaluation of a low-density parity-check (LDPC) coding method in the AWGN channel.
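The Wyner-Ziv idea can be sketched with syndrome coding: the encoder transmits only the syndrome H x of its block, and the decoder combines it with correlated side information to recover x. A toy instance with the (7,4) Hamming parity-check matrix, assuming the side information differs from x in at most one bit (a stand-in for the paper's LDPC construction):

    import numpy as np

    H = np.array([[1, 0, 1, 0, 1, 0, 1],   # (7,4) Hamming parity checks;
                  [0, 1, 1, 0, 0, 1, 1],   # column i is the binary code of i+1
                  [0, 0, 0, 1, 1, 1, 1]])

    def encode(x):                 # transmitter: send 3 bits instead of 7
        return H @ x % 2

    def decode(syndrome, y):       # receiver: correct side info y toward x
        s = (syndrome - H @ y) % 2
        z = y.copy()
        if s.any():
            pos = int("".join(map(str, s[::-1])), 2) - 1
            z[pos] ^= 1
        return z

    x = np.array([1, 0, 1, 1, 0, 0, 1])
    y = x.copy(); y[4] ^= 1        # side information: one bit flipped
    print(decode(encode(x), y))    # recovers x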
Towards a theory of intention: An application of quantum mechanics within psychotherapy
NASA Astrophysics Data System (ADS)
Van Wyck, Jennifer
This study used grounded theory methodology to analyze and code three fields of research: psychoneuroimmunology, psychokinesis, and guided imagery. The works of Tiller (2001, 2007) and Dyer (2004) were used as a validity check for the grounded theory results and provided further input into a final theory of intention. It was found that intention requires three elements to be most successful in producing targeted outcomes: consciousness, thought, and emotion. The emotional aspect of intention had previously been mentioned but never incorporated into earlier theories of intention, and it appears to be a new finding with potentially strong implications. The paper concludes with a discussion of how the theory of intention can inform practice in the field of psychotherapy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brandao, C. S. S.; De Araujo, J. C. N., E-mail: claudiosoriano.uesc@gmail.com, E-mail: jcarlos.dearaujo@inpe.br
2012-05-01
One way to probe alternative theories of gravitation is to study whether they can account for the structures of the universe. We therefore modified the well-known Gadget-2 code to probe alternative theories of gravitation through galactic dynamics. As an application, we simulate the evolution of spiral galaxies to probe alternative theories of gravitation whose weak-field limits have a Yukawa-like gravitational potential. These simulations show that galactic dynamics can be used to constrain the parameters associated with alternative theories of gravitation. It is worth stressing that the recipe given in this study can be applied to any other alternative theory of gravitation in which the superposition principle is valid.
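The weak-field limit referred to above replaces the Newtonian potential with phi(r) = -(G M / r) (1 + alpha e^(-r/lambda)) / (1 + alpha), so the radial acceleration picks up an extra Yukawa term. A sketch of the modified force law with placeholder parameter values:

    import numpy as np

    G, M = 4.3e-6, 1.0e11   # G in kpc (km/s)^2 / Msun; M in Msun

    def accel(r, alpha=0.1, lam=30.0):
        # Radial acceleration from the Yukawa-corrected potential;
        # alpha -> 0 recovers the Newtonian result.
        yuk = np.exp(-r / lam)
        return (G * M / r**2) * (1.0 + alpha * yuk * (1.0 + r / lam)) / (1.0 + alpha)

    print(accel(np.array([5.0, 10.0, 20.0])))  # (km/s)^2 per kpc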
A comparison of different methods to implement higher order derivatives of density functionals
DOE Office of Scientific and Technical Information (OSTI.GOV)
van Dam, Hubertus J.J.
Density functional theory is the dominant approach in electronic structure methods today. To calculate properties, higher-order derivatives of the density functionals are required. These derivatives might be implemented manually, by automatic differentiation, or by symbolic algebra programs; different authors have cited different reasons for using the method of their choice. This paper presents work in which all three approaches were used, and the strengths and weaknesses of each approach are considered. It is found that all three methods produce code that is sufficiently performant for practical applications, despite the fact that our symbolic-algebra-generated code and our automatic differentiation code still have scope for significant optimization. The automatic differentiation approach is the best option for producing readable and maintainable code.
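The symbolic-algebra route can be illustrated with SymPy on the simplest example, the LDA exchange energy density e_x = -(3/4)(3/pi)^(1/3) rho^(4/3), whose first and second derivatives give the exchange potential and kernel (an illustration of the approach, not the paper's generator):

    import sympy as sp

    rho = sp.symbols("rho", positive=True)
    Cx = sp.Rational(3, 4) * (3 / sp.pi) ** sp.Rational(1, 3)
    e_x = -Cx * rho ** sp.Rational(4, 3)   # LDA exchange energy density
    v_x = sp.diff(e_x, rho)                # first derivative: potential
    f_x = sp.diff(e_x, rho, 2)             # second derivative: kernel
    print(sp.simplify(v_x), sp.simplify(f_x))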
Anderson, John R; Betts, Shawn; Ferris, Jennifer L; Fincham, Jon M
2011-03-01
Students were taught an algorithm for solving a new class of mathematical problems. Occasionally in the sequence of problems they encountered exception problems that required them to extend the algorithm. Regular and exception problems were associated with different patterns of brain activation. Some regions showed a Cognitive pattern of being active only until the problem was solved, with no difference between regular and exception problems. Other regions showed a Metacognitive pattern of greater activity for exception problems and activity that extended into the post-solution period, particularly when an error was made. The Cognitive regions included some of the parietal and prefrontal regions associated with the triple-code theory (Dehaene, S., Piazza, M., Pinel, P., & Cohen, L. (2003). Three parietal circuits for number processing. Cognitive Neuropsychology, 20, 487-506) and with algebra equation solving in the ACT-R theory (Anderson, J. R. (2005). Human symbol manipulation within an integrated cognitive architecture. Cognitive Science, 29, 313-342). The Metacognitive regions included the superior prefrontal gyrus, the angular gyrus of the triple-code theory, and frontopolar regions.
Quantum steganography and quantum error-correction
NASA Astrophysics Data System (ADS)
Shaw, Bilal A.
Quantum error-correcting codes have been the cornerstone of research in quantum information science (QIS) for more than a decade. Without their conception, quantum computers would be a footnote in the history of science. When researchers embraced the idea that we live in a world where the effects of a noisy environment cannot completely be stripped away from the operations of a quantum computer, the natural way forward was to think about importing classical coding theory into the quantum arena to give birth to quantum error-correcting codes which could help in mitigating the debilitating effects of decoherence on quantum data. We first talk about the six-qubit quantum error-correcting code and show its connections to entanglement-assisted error-correcting coding theory and then to subsystem codes. This code bridges the gap between the five-qubit (perfect) and Steane codes. We discuss two methods to encode one qubit into six physical qubits. Each of the two examples corrects an arbitrary single-qubit error. The first example is a degenerate six-qubit quantum error-correcting code. We explicitly provide the stabilizer generators, encoding circuits, codewords, logical Pauli operators, and logical CNOT operator for this code. We also show how to convert this code into a non-trivial subsystem code that saturates the subsystem Singleton bound. We then prove that a six-qubit code without entanglement assistance cannot simultaneously possess a Calderbank-Shor-Steane (CSS) stabilizer and correct an arbitrary single-qubit error. A corollary of this result is that the Steane seven-qubit code is the smallest single-error correcting CSS code. Our second example is the construction of a non-degenerate six-qubit CSS entanglement-assisted code. This code uses one bit of entanglement (an ebit) shared between the sender (Alice) and the receiver (Bob) and corrects an arbitrary single-qubit error. The code we obtain is globally equivalent to the Steane seven-qubit code and thus corrects an arbitrary error on the receiver's half of the ebit as well. We prove that this code is the smallest code with a CSS structure that uses only one ebit and corrects an arbitrary single-qubit error on the sender's side. We discuss the advantages and disadvantages for each of the two codes. In the second half of this thesis we explore the yet uncharted and relatively undiscovered area of quantum steganography. Steganography is the process of hiding secret information by embedding it in an "innocent" message. We present protocols for hiding quantum information in a codeword of a quantum error-correcting code passing through a channel. Using either a shared classical secret key or shared entanglement Alice disguises her information as errors in the channel. Bob can retrieve the hidden information, but an eavesdropper (Eve) with the power to monitor the channel, but without the secret key, cannot distinguish the message from channel noise. We analyze how difficult it is for Eve to detect the presence of secret messages, and estimate rates of steganographic communication and secret key consumption for certain protocols. We also provide an example of how Alice hides quantum information in the perfect code when the underlying channel between Bob and her is the depolarizing channel. Using this scheme Alice can hide up to four stego-qubits.
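The five-qubit "perfect" code that appears in the steganographic protocol above is stabilized by XZZXI and its cyclic shifts. A quick numerical check that the four generators mutually commute (dense matrices are fine at five qubits):

    import numpy as np
    from functools import reduce

    I = np.eye(2)
    X = np.array([[0.0, 1.0], [1.0, 0.0]])
    Z = np.array([[1.0, 0.0], [0.0, -1.0]])
    P = {"I": I, "X": X, "Z": Z}

    def op(s):
        # Tensor product of single-qubit Paulis named by the string s.
        return reduce(np.kron, [P[c] for c in s])

    gens = ["XZZXI", "IXZZX", "XIXZZ", "ZXIXZ"]
    mats = [op(g) for g in gens]
    print(all(np.allclose(a @ b, b @ a) for a in mats for b in mats))  # True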
ERIC Educational Resources Information Center
Downs, Edward; Boyson, Aaron R.; Alley, Hannah; Bloom, Nikki R.
2011-01-01
Some institutions of higher learning have invested considerable resources to diffuse iPods and MP3 devices, though little is known about learning outcomes tied to their use. Dual-coding and multimedia learning theories guided the development of a typical college lecture so that it could be presented in a combination of audio and visual forms across…
High frequency scattering from a thin lossless dielectric slab. M.S. Thesis
NASA Technical Reports Server (NTRS)
Burgener, K. W.
1979-01-01
A solution for scattering from a thin dielectric slab is developed based on geometrical optics and the geometrical theory of diffraction, with the intention of developing a model for the windshield of a small private aircraft for incorporation in an aircraft antenna code. Results of the theory are compared with experimental measurements and moment-method calculations, showing good agreement. Application of the solution is also addressed.
ERIC Educational Resources Information Center
Rosenberg, Nancy S.
A group is viewed to be one of the simplest and most interesting algebraic structures. The theory of groups has been applied to many branches of mathematics as well as to crystallography, coding theory, quantum mechanics, and the physics of elementary particles. This material is designed to help the user: 1) understand what groups are and why they…
Parrish, Robert M; Burns, Lori A; Smith, Daniel G A; Simmonett, Andrew C; DePrince, A Eugene; Hohenstein, Edward G; Bozkaya, Uğur; Sokolov, Alexander Yu; Di Remigio, Roberto; Richard, Ryan M; Gonthier, Jérôme F; James, Andrew M; McAlexander, Harley R; Kumar, Ashutosh; Saitow, Masaaki; Wang, Xiao; Pritchard, Benjamin P; Verma, Prakash; Schaefer, Henry F; Patkowski, Konrad; King, Rollin A; Valeev, Edward F; Evangelista, Francesco A; Turney, Justin M; Crawford, T Daniel; Sherrill, C David
2017-07-11
Psi4 is an ab initio electronic structure program providing methods such as Hartree-Fock, density functional theory, configuration interaction, and coupled-cluster theory. The 1.1 release represents a major update meant to automate complex tasks, such as geometry optimization using complete-basis-set extrapolation or focal-point methods. Conversion of the top-level code to a Python module means that Psi4 can now be used in complex workflows alongside other Python tools. Several new features have been added with the aid of libraries providing easy access to techniques such as density fitting, Cholesky decomposition, and Laplace denominators. The build system has been completely rewritten to simplify interoperability with independent, reusable software components for quantum chemistry. Finally, a wide range of new theoretical methods and analyses have been added to the code base, including functional-group and open-shell symmetry adapted perturbation theory, density-fitted coupled cluster with frozen natural orbitals, orbital-optimized perturbation and coupled-cluster methods (e.g., OO-MP2 and OO-LCCD), density-fitted multiconfigurational self-consistent field, density cumulant functional theory, algebraic-diagrammatic construction excited states, improvements to the geometry optimizer, and the "X2C" approach to relativistic corrections, among many other improvements.
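As a concrete illustration of the Python-module workflow described above, a minimal sketch follows; the molecule, method, and basis are arbitrary choices, not taken from the release notes.

```python
# Driving Psi4 from Python: build a molecule, set options, request an energy.
import psi4

psi4.set_memory("500 MB")
h2o = psi4.geometry("""
O
H 1 0.96
H 1 0.96 2 104.5
""")
psi4.set_options({"basis": "cc-pvdz"})

# Single-point Hartree-Fock energy; other methods (mp2, ccsd, ...) are
# requested the same way by name.
print(psi4.energy("scf"))
```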
Multimedia Learning: Beyond Modality. Commentary.
ERIC Educational Resources Information Center
Reimann, P.
2003-01-01
Identifies and summarizes instructional messages in the articles in this theme issue and also identifies central theoretical issues, focusing on: (1) external representations; (2) dual coding theory; and (3) the effects of animations on learning. (SLD)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bozinoski, Radoslav; Winters, William
2016-01-01
The purpose of this report is to document the theoretical models utilized by the computer code NETFLOW. The report focuses on the theoretical models used to analyze high-Mach-number, fully compressible transonic flows in piping networks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ivanov, A. A., E-mail: aai@a5.kiam.ru; Martynov, A. A., E-mail: martynov@a5.kiam.ru; Medvedev, S. Yu., E-mail: medvedev@a5.kiam.ru
In the MHD tokamak plasma theory, the plasma pressure is usually assumed to be isotropic. However, plasma heating by neutral beam injection and RF heating can lead to a strong anisotropy of plasma parameters and to rotation of the plasma. The development of MHD equilibrium theory taking into account the plasma inertia and anisotropic pressure began a long time ago, but until now it has not been consistently applied in computational codes for engineering calculations of the plasma equilibrium and evolution in tokamaks. This paper contains a detailed derivation of the axisymmetric plasma equilibrium equation in the most general form (with arbitrary rotation and anisotropic pressure) and a description of the specialized version of the SPIDER code. An original method for calculating the equilibrium with anisotropic pressure and a prescribed rotational-transform profile is proposed. Examples of calculations and a discussion of the results are also presented.
A model for the accurate computation of the lateral scattering of protons in water
NASA Astrophysics Data System (ADS)
Bellinzona, E. V.; Ciocca, M.; Embriaco, A.; Ferrari, A.; Fontana, A.; Mairani, A.; Parodi, K.; Rotondi, A.; Sala, P.; Tessonnier, T.
2016-02-01
A pencil beam model for the calculation of the lateral scattering in water of protons for any therapeutic energy and depth is presented. It is based on the full Molière theory, taking into account the energy loss and the effects of mixtures and compounds. Concerning the electromagnetic part, the model has no free parameters and is in very good agreement with the FLUKA Monte Carlo (MC) code. The effects of the nuclear interactions are parametrized with a two-parameter tail function, adjusted to MC data calculated with FLUKA. The model, after convolution with the beam and the detector response, is in agreement with recent proton data in water from HIT. The model gives results with the same accuracy as the MC codes based on Molière theory, with a much shorter computing time.
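The full Molière theory used in the paper is too involved to reproduce here; as a rough orientation, the widely used Highland parameterization below gives the Gaussian-core multiple-scattering angle that such models refine. The radiation length of water is a standard value; everything else is generic.

```python
import math

def highland_theta0(p_mev, beta, thickness_cm, x0_cm=36.08):
    """Highland multiple-scattering angle (radians) for a proton (z = 1).

    p_mev: momentum in MeV/c; beta: v/c; x0_cm: radiation length (~36 cm
    for water). A Gaussian-core approximation, not the full Moliere theory.
    """
    t = thickness_cm / x0_cm
    return (13.6 / (beta * p_mev)) * math.sqrt(t) * (1 + 0.038 * math.log(t))

# e.g., a ~150 MeV proton (p ~ 550 MeV/c, beta ~ 0.5) after 10 cm of water
print(highland_theta0(550.0, 0.5, 10.0))   # ~0.025 rad
```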
Generalized Wall Function for Complex Turbulent Flows
NASA Technical Reports Server (NTRS)
Shih, Tsan-Hsing; Povinelli, Louis A.; Liu, Nan-Suey; Chen, Kuo-Huey
2000-01-01
A generalized wall function was proposed by Shih et al. (1999). It accounts for the effect of pressure gradients on the flow near the wall. Theory shows that the effect of pressure gradients on the flow in the inertial sublayer is very significant and that the standard wall function should be replaced by a generalized wall function. Since the theory is also valid for boundary layer flows approaching separation, the generalized wall function may be applied to complex turbulent flows with acceleration, deceleration, separation, and recirculation. This paper verifies the generalized wall function with numerical simulations of boundary layer flows with various adverse and favorable pressure gradients, including flows about to separate. Furthermore, a general procedure for implementing the generalized wall function in the National Combustion Code (NCC) is described; it can be applied to both structured and unstructured CFD codes.
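For reference, the standard (equilibrium) wall function that the generalized formulation replaces is sketched below; the pressure-gradient terms of Shih et al. are not reproduced, and the constants and crossover point are the usual textbook choices.

```python
import math

def u_plus_standard(y_plus, kappa=0.41, B=5.0):
    """Standard wall function: linear viscous sublayer, then the log law.

    The generalized wall function adds pressure-gradient contributions
    omitted here; y+ ~ 11 is an approximate sublayer/log-law crossover.
    """
    if y_plus < 11.0:
        return y_plus                      # viscous sublayer: u+ = y+
    return math.log(y_plus) / kappa + B    # inertial sublayer: log law

print(u_plus_standard(100.0))              # ~16.2
```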
Structural response of existing spatial truss roof construction based on Cosserat rod theory
NASA Astrophysics Data System (ADS)
Miśkiewicz, Mikołaj
2018-04-01
This paper presents the application of the Cosserat rod theory and a newly developed associated finite element code as tools that support expert engineering design practice. Mechanical principles of 3D spatially curved rods, the laws of dynamics (statics), and the principle of virtual work are discussed. The corresponding FEM approach, with interpolation and accumulation techniques for the state variables, is shown to enable the formulation of C0 Lagrangian rod elements with six degrees of freedom per node. Two test examples prove the correctness and suitability of the proposed formulation. Next, the developed FEM code is applied to assess the structural response of the spatial truss roof of the "Olivia" Sports Arena in Gdansk, Poland. The numerical results are compared with load test results. It is shown that the proposed FEM approach yields correct results.
A General Sparse Tensor Framework for Electronic Structure Theory
Manzer, Samuel; Epifanovsky, Evgeny; Krylov, Anna I.; ...
2017-01-24
Linear-scaling algorithms must be developed in order to extend the domain of applicability of electronic structure theory to molecules of any desired size. However, the increasing complexity of modern linear-scaling methods makes code development and maintenance a significant challenge. A major contributor to this difficulty is the lack of robust software abstractions for handling block-sparse tensor operations. We therefore report the development of a highly efficient symbolic block-sparse tensor library in order to provide access to high-level software constructs to treat such problems. Our implementation supports arbitrary multi-dimensional sparsity in all input and output tensors. We then avoid cumbersome machine-generated code by implementing all functionality as a high-level symbolic C++ language library and demonstrate that our implementation attains very high performance for linear-scaling sparse tensor contractions.
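The core idea of a block-sparse tensor library can be conveyed in a few lines: store only the nonzero blocks and contract only the block pairs whose inner indices match. The toy matrix multiply below (in Python rather than the library's C++) illustrates the concept, not the library's API.

```python
import numpy as np
from collections import defaultdict

def block_sparse_matmul(A, B):
    """Multiply block-sparse matrices stored as {(row, col): dense block}."""
    C = defaultdict(lambda: 0)
    for (i, k), a in A.items():
        for (k2, j), b in B.items():
            if k == k2:                 # contract only matching inner blocks
                C[(i, j)] = C[(i, j)] + a @ b
    return dict(C)

n = 4
A = {(0, 0): np.eye(n), (1, 2): np.ones((n, n))}
B = {(0, 1): np.eye(n), (2, 0): np.eye(n)}
print(sorted(block_sparse_matmul(A, B)))   # only blocks (0, 1) and (1, 0)
```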
Surfing a spike wave down the ventral stream.
VanRullen, Rufin; Thorpe, Simon J
2002-10-01
Numerous theories of neural processing, often motivated by experimental observations, have explored the computational properties of neural codes based on the absolute or relative timing of spikes in spike trains. Spiking neuron models and theories, however, as well as their experimental counterparts, have generally been limited to the simulation or observation of isolated neurons, isolated spike trains, or reduced neural populations. Such theories would therefore seem inappropriate to capture the properties of a neural code relying on temporal spike patterns distributed across large neuronal populations. Here we report a range of computer simulations and theoretical considerations that were designed to explore the possibilities of one such code and its relevance for visual processing. In a unified framework where the relation between stimulus saliency and relative spike timing plays the central role, we describe how the ventral stream of the visual system could process natural input scenes and extract meaningful information, both rapidly and reliably. The first wave of spikes generated in the retina in response to a visual stimulation carries information explicitly in its spatio-temporal structure: the most salient information is represented by the first spikes over the population. This spike wave, propagating through a hierarchy of visual areas, is regenerated at each processing stage, where its temporal structure can be modified by (i) the selectivity of the cortical neurons, (ii) lateral interactions, and (iii) top-down attentional influences from higher order cortical areas. The resulting model could account for the remarkable efficiency and rapidity of processing observed in the primate visual system.
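The central encoding idea, that the most salient inputs fire first so that information is carried by the rank order of first spikes across the population, can be sketched in a few lines; the saliency values and time scale below are arbitrary.

```python
import numpy as np

def first_spike_latencies(saliency, t_max=50.0):
    """Map saliencies in (0, 1] to first-spike times: stronger -> earlier."""
    return t_max * (1.0 - np.asarray(saliency, dtype=float))

saliency = np.array([0.9, 0.2, 0.6, 0.75])
latencies = first_spike_latencies(saliency)
print(np.argsort(latencies))   # [0 3 2 1]: the rank order carries the signal
```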
NASA Astrophysics Data System (ADS)
Kos, Agnieszka
The demands of national educational reforms require high school biology teachers to provide high quality instruction to students with and without special needs. The reforms, however, do not provide teachers with adequate teaching strategies to meet the needs of all students in the same context. The purpose of this grounded theory study was to understand high school biology teachers' perspectives, practices, and challenges in relation to teaching students with special needs. This approach was used to develop a substantive model for high school biology teachers who are challenged with teaching students with and without special needs. Data were collected via in-depth interviews with 15 high school teachers in a Midwestern school district. The data were analyzed using open coding, axial coding, and selective coding procedures in accordance with the grounded theory approach. Essential model components included skills and training for teachers, classroom management strategies, teaching strategies, and student skills. The emergent substantive theory indicated that teacher preparation and acquired skills greatly influence the effectiveness of inclusion implementation. Key findings also indicated the importance of using a variety of instructional strategies and classroom management strategies that address students' special needs and their learning styles. This study contributes to social change by providing a model for teaching students and effectively implementing inclusion in regular science classrooms. Following further study, this model may be used to support teacher professional development and improve teaching practices that in turn may improve science literacy supported by the national educational reforms.
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2014-01-01
Simulation codes often utilize finite-dimensional approximation, resulting in numerical error. Some examples include numerical methods utilizing grids and finite-dimensional basis functions, and particle methods using a finite number of particles. These same simulation codes also often contain sources of uncertainty, for example, uncertain parameters and fields associated with the imposition of initial and boundary data, and uncertain physical model parameters such as chemical reaction rates, mixture model parameters, material property parameters, etc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vondy, D.R.; Fowler, T.B.; Cunningham, G.W.
1979-07-01
User input data requirements are presented for certain special processors in a nuclear reactor computation system. These processors generally read data in formatted form and generate binary interface data files. Some data processing is done to convert from the user-oriented form to the interface file forms. The VENTURE diffusion theory neutronics code and other computation modules in this system use the interface data files which are generated.
On a Mathematical Theory of Coded Exposure
2014-08-01
formulae that give the MSE and SNR of the final crisp image. Assumes the Shannon-Whittaker framework, which i) requires a band-limited (with a fre... represents the ideal crisp image, i.e., the image that one would observe if there were no noise whatsoever, no motion, and a perfect optical system... discrete. In addition, the image obtained by a coded-exposure camera must undergo a deconvolution to obtain the final crisp image. Note that the
TAIR: A transonic airfoil analysis computer code
NASA Technical Reports Server (NTRS)
Dougherty, F. C.; Holst, T. L.; Grundy, K. L.; Thomas, S. D.
1981-01-01
The operation of the TAIR (Transonic AIRfoil) computer code, which uses a fast, fully implicit algorithm to solve the conservative full-potential equation for transonic flow fields about arbitrary airfoils, is described on two levels of sophistication: simplified operation and detailed operation. The program organization and theory are elaborated to simplify modification of TAIR for new applications. Examples with input and output are given for a wide range of cases, including incompressible, subcritical compressible, and transonic calculations.
Psychometric challenges and proposed solutions when scoring facial emotion expression codes.
Olderbak, Sally; Hildebrandt, Andrea; Pinkpank, Thomas; Sommer, Werner; Wilhelm, Oliver
2014-12-01
Coding of facial emotion expressions is increasingly performed by automated emotion expression scoring software; however, there is limited discussion of how best to score the resulting codes. We present a discussion of facial emotion expression theories and a review of contemporary emotion expression coding methodology. We highlight methodological challenges pertinent to scoring software-coded facial emotion expression codes and present important psychometric research questions centered on comparing competing scoring procedures for these codes. Then, on the basis of a time series data set collected to assess individual differences in facial emotion expression ability, we derive, apply, and evaluate several statistical procedures, including four scoring methods and four data treatments, to score software-coded emotion expression data. These scoring procedures are illustrated to inform analysis decisions pertaining to the scoring and data treatment of other emotion expression questions under different experimental circumstances. Overall, we found that applying loess smoothing and controlling for baseline facial emotion expression and facial plasticity are the recommended data treatments. When scoring facial emotion expression ability, the maximum score is preferred. Finally, we discuss the scoring methods and data treatments in the larger context of emotion expression research.
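The recommended pipeline, smoothing the frame-by-frame codes, correcting for a baseline segment, and taking the maximum as the ability score, can be sketched as follows; a moving average stands in for loess, and the segment lengths are arbitrary.

```python
import numpy as np

def score_expression(codes, baseline_frames=10, window=5):
    """Smooth per-frame emotion codes, subtract baseline, return max score."""
    x = np.asarray(codes, dtype=float)
    x = np.convolve(x, np.ones(window) / window, mode="same")  # ~loess
    x = x - x[:baseline_frames].mean()                         # baseline
    return x.max()                                             # max score
```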
Time Evolving Fission Chain Theory and Fast Neutron and Gamma-Ray Counting Distributions
Kim, K. S.; Nakae, L. F.; Prasad, M. K.; ...
2015-11-01
Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas are given for all correlated moments up to triple order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for the probabilities of time dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time tagged data for neutron and gamma-ray counting and from these data the counting distributions.
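The "remarkably simple Monte Carlo realization" can be conveyed by a toy branching-process version of a single chain; the fission probability and multiplicity distribution below are illustrative (and subcritical, so chains terminate), and the time dependence of the full model is omitted.

```python
import random

def fission_chain(p_fission=0.3, rng=random):
    """Follow one chain from a single starting neutron.

    Each neutron either induces a fission, adding a random number of new
    neutrons, or leaks. Returns (fissions, leaked neutrons) for the chain.
    """
    alive, fissions, leaked = 1, 0, 0
    while alive:
        alive -= 1
        if rng.random() < p_fission:
            fissions += 1
            alive += rng.choice([0, 1, 2, 3])   # toy multiplicity
        else:
            leaked += 1
    return fissions, leaked

# Sample moments of the leaked-neutron distribution over many chains:
leaks = [fission_chain()[1] for _ in range(10_000)]
print(sum(leaks) / len(leaks))
```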
H-division quarterly report, October--December 1977. [Lawrence Livermore Laboratory]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-02-10
The Theoretical EOS Group develops theoretical techniques for describing material properties under extreme conditions and constructs equation-of-state (EOS) tables for specific applications. Work this quarter concentrated on a Li equation of state, an equation of state for equilibrium plasma, improved ion corrections to the Thomas--Fermi--Kirzhnitz theory, and theoretical estimates of high-pressure melting in metals. The Experimental Physics Group investigates properties of materials at extreme conditions of pressure and temperature, and develops new experimental techniques. Effort this quarter concerned the following: parabolic projectile distortion in the two-stage light-gas gun, construction of a ballistic range for long-rod penetrators, thermodynamics and sound velocities in liquid metals, isobaric expansion measurements in Pt, and calculation of the velocity--mass profile of a jet produced by a shaped charge. Code development was concentrated on the PELE code, a multimaterial, multiphase, explicit finite-difference Eulerian code for pool suppression dynamics of a hypothetical loss-of-coolant accident in a nuclear reactor. Activities of the Fluid Dynamics Group were directed toward development of a code to compute the equations of state and transport properties of liquid metals (e.g. Li) and partially ionized dense plasmas, jet stability in the Li reactor system, and the study and problem application of fluid dynamic turbulence theory. 19 figures, 5 tables. (RWR)
Penningroth, Suzanna L; Scott, Walter D
2012-01-01
Two prominent theories of lifespan development, socioemotional selectivity theory and selection, optimization, and compensation theory, make similar predictions for differences in the goal representations of younger and older adults. Our purpose was to test whether the goals of younger and older adults differed in ways predicted by these two theories. Older adults and two groups of younger adults (college students and non-students) listed their current goals, which were then coded by independent raters. Observed age group differences in goals generally supported both theories. Specifically, when compared to younger adults, older adults reported more goals focused on maintenance/loss prevention, the present, emotion-focus and generativity, and social selection, and fewer goals focused on knowledge acquisition and the future. However, contrary to prediction, older adults also showed less goal focusing than younger adults, reporting goals from a broader set of life domains (e.g., health, property/possessions, friendship).
Modeling self on others: An import theory of subjectivity and selfhood.
Prinz, Wolfgang
2017-03-01
This paper outlines an Import Theory of subjectivity and selfhood. Import theory claims that subjectivity is initially perceived as a key feature of other minds before it is imported from other minds to one's own mind, whereby it lays the ground for mental selfhood. Import theory builds on perception-production matching, which in turn draws on both representational mechanisms and social practices. Representational mechanisms rely on common coding of perception and production. Social practices rely on action mirroring in dyadic interactions. The interplay between mechanisms and practices gives rise to modeling self on others. Individuals become intentional agents in virtue of perceiving others mirroring themselves. The outline of the theory is preceded by an introductory section that locates import theory in the broader context of competing approaches, and it is followed by a concluding section that assesses import theory in terms of empirical evidence and explanatory power. Copyright © 2017 Elsevier Inc. All rights reserved.
Fractonic line excitations: An inroad from three-dimensional elasticity theory
NASA Astrophysics Data System (ADS)
Pai, Shriya; Pretko, Michael
2018-06-01
We demonstrate the existence of a fundamentally new type of excitation, fractonic lines, which are linelike excitations with the restricted mobility properties of fractons. These excitations, described using an amalgamation of higher-form gauge theories with symmetric tensor gauge theories, are directly physically realized as the topological lattice defects of ordinary three-dimensional quantum crystals. Starting with the more familiar elasticity theory, we show how the theory maps onto a rank-4 tensor gauge theory, with phonons corresponding to gapless gauge modes and disclination defects corresponding to linelike charges. We derive flux conservation laws which lock these linelike excitations in place, analogous to the higher moment charge conservation laws of fracton theories. This way of encoding the mobility restrictions of lattice defects could shed light on melting transitions in three dimensions. This new type of extended object may also be a useful tool in the search for improved quantum error-correcting codes in three dimensions.
Neural Elements for Predictive Coding.
Shipp, Stewart
2016-01-01
Predictive coding theories of sensory brain function interpret the hierarchical construction of the cerebral cortex as a Bayesian, generative model capable of predicting the sensory data consistent with any given percept. Predictions are fed backward in the hierarchy and reciprocated by prediction error in the forward direction, acting to modify the representation of the outside world at increasing levels of abstraction, and so to optimize the nature of perception over a series of iterations. This accounts for many 'illusory' instances of perception where what is seen (heard, etc.) is unduly influenced by what is expected, based on past experience. This simple conception, the hierarchical exchange of prediction and prediction error, confronts a rich cortical microcircuitry that is yet to be fully documented. This article presents the view that, in the current state of theory and practice, it is profitable to begin a two-way exchange: that predictive coding theory can support an understanding of cortical microcircuit function, and prompt particular aspects of future investigation, whilst existing knowledge of microcircuitry can, in return, influence theoretical development. As an example, a neural inference arising from the earliest formulations of predictive coding is that the source populations of forward and backward pathways should be completely separate, given their functional distinction; this aspect of circuitry - that neurons with extrinsically bifurcating axons do not project in both directions - has only recently been confirmed. Here, the computational architecture prescribed by a generalized (free-energy) formulation of predictive coding is combined with the classic 'canonical microcircuit' and the laminar architecture of hierarchical extrinsic connectivity to produce a template schematic, that is further examined in the light of (a) updates in the microcircuitry of primate visual cortex, and (b) rapid technical advances made possible by transgenic neural engineering in the mouse. The exercise highlights a number of recurring themes, amongst them the consideration of interneuron diversity as a spur to theoretical development and the potential for specifying a pyramidal neuron's function by its individual 'connectome,' combining its extrinsic projection (forward, backward or subcortical) with evaluation of its intrinsic network (e.g., unidirectional versus bidirectional connections with other pyramidal neurons).
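The hierarchical exchange the article builds on, predictions passed backward and prediction errors passed forward, reduces in its simplest form to an iterative settling loop; the sketch below uses an identity prediction and arbitrary constants purely to show the structure.

```python
import numpy as np

def settle(inputs, lr=0.2, n_iter=50):
    """One-level predictive-coding loop: refine r until it explains the input."""
    inputs = np.asarray(inputs, dtype=float)
    r = np.zeros_like(inputs)        # higher-level representation
    for _ in range(n_iter):
        error = inputs - r           # forward signal: prediction error
        r = r + lr * error           # backward update: revise the prediction
    return r

print(settle([1.0, 0.5, -0.3]))      # converges toward the input
```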
Method development at Nordic School of Public Health NHV: Phenomenology and Grounded Theory.
Strandmark, Margaretha
2015-08-01
Qualitative methods such as phenomenology and grounded theory have been valuable tools in studying public health problems. This article describes and compares these methods. Phenomenology emphasises an inside perspective in the form of consciousness and subjectively lived experiences, whereas grounded theory emanates from the idea that interactions between people create new insights and knowledge. Fundamental aspects of phenomenology include the life world, consciousness, phenomenological reduction and essence. Significant elements in grounded theory are coding, categories and core categories, from which a theory is developed. There are differences between the methods in the philosophical approach, the naming of concepts and the systematic tools. Thus, the phenomenological method is appropriate when studying emotional and existential research problems, whereas grounded theory is better suited to investigating processes. © 2015 the Nordic Societies of Public Health.
VENTURE/PC manual: A multidimensional multigroup neutron diffusion code system. Version 3
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shapiro, A.; Huria, H.C.; Cho, K.W.
1991-12-01
VENTURE/PC is a recompilation of part of the Oak Ridge BOLD VENTURE code system, which will operate on an IBM PC or compatible computer. Neutron diffusion theory solutions are obtained for multidimensional, multigroup problems. This manual contains information associated with operating the code system. The purpose of the various modules used in the code system, and the input for these modules, are discussed. The PC code structure is also given. Version 2 included several enhancements not given in the original version of the code. In particular, flux iterations can be done in core rather than by reading and writing to disk, for problems which allow sufficient memory for such in-core iterations. This speeds up the iteration process. Version 3 does not include any of the special processors used in the previous versions. These special processors utilized formatted input for various elements of the code system. All such input data is now entered through the Input Processor, which produces standard interface files for the various modules in the code system. In addition, a Standard Interface File Handbook is included in the documentation which is distributed with the code, to assist in developing the input for the Input Processor.
Toward a Unified Sub-symbolic Computational Theory of Cognition
Butz, Martin V.
2016-01-01
This paper proposes how various disciplinary theories of cognition may be combined into a unifying, sub-symbolic, computational theory of cognition. The following theories are considered for integration: psychological theories, including the theory of event coding, event segmentation theory, the theory of anticipatory behavioral control, and concept development; artificial intelligence and machine learning theories, including reinforcement learning and generative artificial neural networks; and theories from theoretical and computational neuroscience, including predictive coding and free energy-based inference. In the light of such a potential unification, it is discussed how abstract cognitive, conceptualized knowledge and understanding may be learned from actively gathered sensorimotor experiences. The unification rests on the free energy-based inference principle, which essentially implies that the brain builds a predictive, generative model of its environment. Neural activity-oriented inference causes the continuous adaptation of the currently active predictive encodings. Neural structure-oriented inference causes the longer term adaptation of the developing generative model as a whole. Finally, active inference strives for maintaining internal homeostasis, causing goal-directed motor behavior. To learn abstract, hierarchical encodings, however, it is proposed that free energy-based inference needs to be enhanced with structural priors, which bias cognitive development toward the formation of particular, behaviorally suitable encoding structures. As a result, it is hypothesized how abstract concepts can develop from, and thus how they are structured by and grounded in, sensorimotor experiences. Moreover, it is sketched out how symbol-like thought can be generated by a temporarily active set of predictive encodings, which constitute a distributed neural attractor in the form of an interactive free-energy minimum. The activated, interactive network attractor essentially characterizes the semantics of a concept or a concept composition, such as an actual or imagined situation in our environment. Temporal successions of attractors then encode unfolding semantics, which may be generated by a behavioral or mental interaction with an actual or imagined situation in our environment. Implications, further predictions, possible verification, and falsifications, as well as potential enhancements into a fully spelled-out unified theory of cognition are discussed at the end of the paper.
One-Dimensional Czedli-Type Islands
ERIC Educational Resources Information Center
Horvath, Eszter K.; Mader, Attila; Tepavcevic, Andreja
2011-01-01
The notion of an island has surfaced in recent algebra and coding theory research. Discrete versions provide interesting combinatorial problems. This paper presents the one-dimensional case with finitely many heights, a topic convenient for student research.
Multigrid based First-Principles Molecular Dynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fattebert, Jean-Luc; Osei-Kuffuor, Daniel; Dunn, Ian
2017-06-01
MGmol is a First-Principles Molecular Dynamics code. It relies on the Born-Oppenheimer approximation and models the electronic structure using Density Functional Theory, either LDA or PBE. Norm-conserving pseudopotentials are used to model atomic cores.
NASA Astrophysics Data System (ADS)
Braun, Mario
2015-06-01
The Quartet Theory of Emotion [13] is the first emotion theory to include language as part of its four affect systems, assigning two functions to language in emotion processing: communication and regulation. Both are assumed to occur late in the emotion process, via the translation or reconfiguration of a pre-verbal emotion percept into a symbolic language code, which then gives rise to the conscious experience of an emotion and allows communication or regulation [14] of a felt emotion.
A discrete Fourier transform for virtual memory machines
NASA Technical Reports Server (NTRS)
Galant, David C.
1992-01-01
An algebraic theory of the Discrete Fourier Transform is developed in great detail. Examination of the details of the theory leads to a computationally efficient fast Fourier transform for use on computers with virtual memory. Such an algorithm is of great use on modern desktop machines. A FORTRAN-coded version of the algorithm is given for the case when the length of the sequence of numbers to be transformed is a power of two.
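The FORTRAN source itself is not reproduced in the abstract; for orientation, the textbook radix-2 algorithm that such implementations reorganize for memory locality looks like this.

```python
import cmath

def fft(x):
    """Recursive radix-2 FFT; len(x) must be a power of two."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    tw = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + tw[k] for k in range(n // 2)] +
            [even[k] - tw[k] for k in range(n // 2)])

print(fft([1, 0, 0, 0]))   # an impulse has a flat spectrum: [1, 1, 1, 1]
```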
2003-04-01
generally considered to be passive data. Instead, the genetic material should be capable of being algorithmic information, that is, program code or...
Chromatic aberration and the roles of double-opponent and color-luminance neurons in color vision.
Vladusich, Tony
2007-03-01
How does the visual cortex encode color? I summarize a theory in which cortical double-opponent color neurons perform a role in color constancy and a complementary set of color-luminance neurons function to selectively correct for color fringes induced by chromatic aberration in the eye. The theory may help to resolve an ongoing debate concerning the functional properties of cortical receptive fields involved in color coding.
Applying thematic analysis theory to practice: a researcher's experience.
Tuckett, Anthony G
2005-01-01
This article describes an experience of thematic analysis. In order to answer the question 'What does analysis look like in practice?' it describes in brief how the methodology of grounded theory, the epistemology of social constructionism, and the theoretical stance of symbolic interactionism inform analysis. Additionally, analysis is examined by evidencing the systematic processes--here termed organising, coding, writing, theorising, and reading--that led the researcher to develop a final thematic schema.
NASA Astrophysics Data System (ADS)
Couvreur, A.
2009-05-01
The theory of algebraic-geometric codes was developed in the early 1980s, following a paper by V. D. Goppa. Given a smooth projective algebraic curve X over a finite field, there are two different constructions of error-correcting codes. The first one, called "functional", uses some rational functions on X, and the second one, called "differential", involves some rational 1-forms on this curve. Hundreds of papers are devoted to the study of such codes. In addition, a generalization of the functional construction for algebraic varieties of arbitrary dimension was given by Y. Manin in a 1984 article. A few papers about such codes have been published, but nothing has been done concerning a generalization of the differential construction to the higher-dimensional case. In this thesis, we propose a differential construction of codes on algebraic surfaces. Afterwards, we study the properties of these codes, and particularly their relations with functional codes. A rather surprising fact is that a major difference from the case of curves appears: while for curves a differential code is always the orthogonal of a functional one, this assertion generally fails for surfaces. This last observation motivates the study of codes which are the orthogonal of some functional code on a surface. We prove that, under some condition on the surface, these codes can be realized as sums of differential codes. Moreover, we show that some answers to some open problems "à la Bertini" could give very interesting information on the parameters of these codes.
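The genus-zero case of the functional construction is the familiar Reed-Solomon code: evaluate all polynomials of bounded degree at distinct rational points of the line. The tiny sketch below uses an arbitrarily chosen prime field and parameters.

```python
P = 13   # a small prime field GF(13), chosen only for illustration

def rs_encode(message, points):
    """Functional (evaluation) encoding: the message gives the coefficients
    of a polynomial, and the codeword is its values at the chosen points."""
    return [sum(c * pow(x, i, P) for i, c in enumerate(message)) % P
            for x in points]

# An [n=7, k=3] Reed-Solomon codeword; minimum distance n - k + 1 = 5.
print(rs_encode([5, 2, 7], points=range(1, 8)))
```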
EMPIRE: A Reaction Model Code for Nuclear Astrophysics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palumbo, A., E-mail: apalumbo@bnl.gov; Herman, M.; Capote, R.
The correct modeling of abundances requires knowledge of nuclear cross sections for a variety of neutron-, charged-particle- and γ-induced reactions. These involve targets far from stability and are therefore difficult (or currently impossible) to measure. Nuclear reaction theory provides the only way to estimate the values of such cross sections. In this paper we present the application of the EMPIRE reaction code to nuclear astrophysics. Recent measurements are compared to the calculated cross sections, showing consistent agreement for n-, p- and α-induced reactions of astrophysical relevance.
1986-10-01
BUZO, and FEDERICO KUHLMANN, Universidad Nacional Autónoma de México, Facultad de Ingeniería, División de Estudios de Posgrado, P.O. Box 70-256, 04510... unsuccessful in this area for a long time. It was felt, e.g., in the voiceband modem industry, that the coding gains achievable by error-correction coding... without bandwidth expansion or data rate reduction, when compared to uncoded modulation. The concept was quickly adopted by industry, and is now becoming
Applications of Functional Analytic and Martingale Methods to Problems in Queueing Network Theory.
1983-05-14
Air Force Office of Scientific Research. Scientific Report on Air Force Grant #82-0167. Principal Investigator: Professor Walter A. Rosenkrantz. I. Publications: (1) Calculation of the Laplace transform... whether or not a protocol for accessing a communications channel is stable. In AFOSR 82-0167, Report No. 3, we showed that the SLOTTED ALOHA multi-access
NASA Technical Reports Server (NTRS)
Hofmann, R.
1980-01-01
The STEALTH code system, which solves large strain, nonlinear continuum mechanics problems, was rigorously structured in both overall design and programming standards. The design is based on the theoretical elements of analysis while the programming standards attempt to establish a parallelism between physical theory, programming structure, and documentation. These features have made it easy to maintain, modify, and transport the codes. It has also guaranteed users a high level of quality control and quality assurance.
1990-01-01
intrinsic side information generated by an appropriate coding scheme. In this paper, we give sufficient conditions on channel classes for which a... zero-redundancy case can be generalized to include the use of block channel coding of the permuted indices. An effective design method is introduced for... M. Naidjate, and C. R. P. Hartmann, Boston University, College of Engineering, 44 Cummington Street, Boston, MA 02215. A generalization of the zero
1990-01-01
generator, and a set of three rules that ensure the minimum distance property. The coding scheme and the interleaving device will be described. The BER... point Padé method which matches the c.d.f. around zero and infinity. The advantage of this technique is that only low-order moments are required. This is... accomplished by means of the intrinsic side information generated by an appropriate coding scheme. In this paper, we give sufficient conditions on channel
The journey from forensic to predictive materials science using density functional theory
Schultz, Peter A.
2017-09-12
Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more-powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. By contrast, the impact of first-principles electronic structure in physics and materials science had lagged, owing to the extra formal and computational demands of bulk calculations.
Fast Model Generalized Pseudopotential Theory Interatomic Potential Routine
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-03-18
MGPT is an unclassified source code for the fast evaluation and application of quantum-based MGPT interatomic potentials for metals. The present version of MGPT has been developed entirely at LLNL, but is specifically designed for implementation in the open-source molecular-dynamics code LAMMPS maintained by Sandia National Laboratories. Using MGPT in LAMMPS, with separate input potential data, one can perform large-scale atomistic simulations of the structural, thermodynamic, defect and mechanical properties of transition metals with quantum-mechanical realism.
Probabilistic Simulation for Nanocomposite Fracture
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2010-01-01
A unique probabilistic theory is described to predict the uniaxial strengths and fracture properties of nanocomposites. The simulation is based on composite micromechanics with progressive substructuring down to a nanoscale slice of a nanofiber where all the governing equations are formulated. These equations have been programmed in a computer code. That computer code is used to simulate uniaxial strengths and fracture of a nanofiber laminate. The results are presented graphically and discussed with respect to their practical significance. These results show smooth distributions from low probability to high.
Improved double-multiple streamtube model for the Darrieus-type vertical axis wind turbine
NASA Astrophysics Data System (ADS)
Berg, D. E.
Double streamtube codes model the curved-blade (Darrieus-type) vertical axis wind turbine (VAWT) as a double actuator-disk arrangement (one disk for each half of the rotor) and use conservation-of-momentum principles to determine the forces acting on the turbine blades and the turbine performance. Sandia National Laboratories developed a double multiple streamtube model for the VAWT which incorporates the effects of the incident wind boundary layer, nonuniform velocity between the upwind and downwind sections of the rotor, dynamic stall effects, and local blade Reynolds number variations. The theory underlying this VAWT model is described, as well as the code capabilities. Code results are compared with experimental data from two VAWTs and with the results from another double multiple streamtube code and a vortex filament code. The effects of neglecting dynamic stall and the horizontal wind velocity distribution are also illustrated.
A nonlocal electron conduction model for multidimensional radiation hydrodynamics codes
NASA Astrophysics Data System (ADS)
Schurtz, G. P.; Nicolaï, Ph. D.; Busquet, M.
2000-10-01
Numerical simulation of laser-driven Inertial Confinement Fusion (ICF) related experiments requires the use of large multidimensional hydro codes. Though these codes include detailed physics for numerous phenomena, they deal poorly with electron conduction, which is the leading energy transport mechanism in these systems. Electron heat flow has been known, since the work of Luciani, Mora, and Virmont (LMV) [Phys. Rev. Lett. 51, 1664 (1983)], to be a nonlocal process, which the local Spitzer-Harm theory, even flux-limited, is unable to account for. The present work aims at extending the original formula of LMV to two or three dimensions of space. This multidimensional extension leads to an equivalent transport equation suitable for easy implementation in a two-dimensional radiation-hydrodynamics code. Simulations are presented and compared with Fokker-Planck simulations in one and two dimensions of space.
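For contrast with the nonlocal model, the local, flux-limited Spitzer-Harm closure that such codes traditionally use can be written in a few lines; units are arbitrary and the conductivity constant and limiter fraction are illustrative.

```python
def heat_flux_local(T, dT_dx, n_e, kappa0=1.0, f=0.1):
    """Flux-limited Spitzer-Harm heat flux (arbitrary consistent units).

    q_sh ~ -kappa0 * T^(5/2) * dT/dx is the local conduction flux; it is
    clamped to a fraction f of the free-streaming flux ~ n_e * T^(3/2).
    """
    q_sh = -kappa0 * T**2.5 * dT_dx
    q_fs = f * n_e * T**1.5
    return max(-q_fs, min(q_sh, q_fs))   # limit magnitude, preserve sign
```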
The home care teaching and learning process in undergraduate health care degree courses.
Hermann, Ana Paula; Lacerda, Maria Ribeiro; Maftum, Mariluci Alves; Bernardino, Elizabeth; Mello, Ana Lúcia Schaefer Ferreira de
2017-07-01
Home care, one of the services provided by the health system, requires health practitioners who are capable of understanding its specificities. This study aimed to build a substantive theory that describes experiences of home care teaching and learning during undergraduate degree courses in nursing, pharmacy, medicine, nutrition, dentistry and occupational therapy. A qualitative analysis was performed using the grounded theory approach based on the results of 63 semistructured interviews conducted with final year students, professors who taught subjects related to home care, and recent graduates working with home care, all participants in the above courses. The data was analyzed in three stages - open coding, axial coding and selective coding - resulting in the phenomenon Experiences of home care teaching and learning during the undergraduate health care degree courses. Its causes were described in the category Articulating knowledge of home care, strategies in the category Experiencing the unique nature of home care, intervening conditions in the category Understanding the multidimensional characteristics of home care, consequences in the category Changing thinking about home care training, and context in the category Understanding home care in the health system. Home care contributes towards the decentralization of hospital care.
Novice to Expert Practice via Postprofessional Athletic Training Education: A Grounded Theory
Neibert, Peter J
2009-01-01
Objective: To discover the theoretic constructs that confirm, disconfirm, or extend the principles and their applications appropriate for National Athletic Trainers' Association (NATA)–accredited postprofessional athletic training education programs. Design: Interviews at the 2003 NATA Annual Meeting & Clinical Symposia. Setting: Qualitative study using grounded theory procedures. Patients and Other Participants: Thirteen interviews were conducted with postprofessional graduates. Participants were purposefully selected based on theoretic sampling and availability. Data Collection and Analysis: The transcribed interviews were analyzed using open coding, axial coding, and selective coding procedures. Member checks, reflective journaling, and triangulation were used to ensure trustworthiness. Results: The participants' comments confirmed and extended the current principles of postprofessional athletic training education programs and offered additional suggestions for more effective practical applications. Conclusions: The emergence of this central category of novice to expert practice is a paramount finding. The tightly woven fabric of the 10 processes, when interlaced with one another, provides a strong tapestry supporting novice to expert practice via postprofessional athletic training education. The emergence of this theoretic position pushes postprofessional graduate athletic training education forward to the future for further investigation into the theoretic constructs of novice to expert practice.
The Development of the Ducted Fan Noise Propagation and Radiation Code CDUCT-LaRC
NASA Technical Reports Server (NTRS)
Nark, Douglas M.; Farassat, F.; Pope, D. Stuart; Vatsa, Veer
2003-01-01
The development of the ducted fan noise propagation and radiation code CDUCT-LaRC at NASA Langley Research Center is described. This code calculates the propagation and radiation of given acoustic modes ahead of the fan face or aft of the exhaust guide vanes in the inlet or exhaust ducts, respectively. This paper gives a description of the modules comprising CDUCT-LaRC. The grid generation module provides automatic creation of numerical grids for complex (non-axisymmetric) geometries that include single or multiple pylons. Files for performing automatic inviscid mean flow calculations are also generated within this module. The duct propagation is based on the parabolic approximation theory of R. P. Dougherty. This theory allows the handling of complex internal geometries and the ability to study the effect of non-uniform (i.e. circumferentially and axially segmented) liners. Finally, the duct radiation module is based on the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface. Refraction of sound through the shear layer between the external flow and bypass duct flow is included. Results for benchmark annular ducts, as well as other geometries with pylons, are presented and compared with available analytical data.
A general panel sizing computer code and its application to composite structural panels
NASA Technical Reports Server (NTRS)
Anderson, M. S.; Stroud, W. J.
1978-01-01
A computer code for obtaining the dimensions of optimum (least mass) stiffened composite structural panels is described. The procedure, which is based on nonlinear mathematical programming and a rigorous buckling analysis, is applicable to general cross sections under general loading conditions causing buckling. A simplified method of accounting for bow-type imperfections is also included. Design studies in the form of structural efficiency charts for axial compression loading are made with the code for blade and hat stiffened panels. The effects on panel mass of imperfections, material strength limitations, and panel stiffness requirements are also examined. Comparisons with previously published experimental data show that accounting for imperfections improves correlation between theory and experiment.
An update on the BQCD Hybrid Monte Carlo program
NASA Astrophysics Data System (ADS)
Haar, Taylor Ryan; Nakamura, Yoshifumi; Stüben, Hinnerk
2018-03-01
We present an update of BQCD, our Hybrid Monte Carlo program for simulating lattice QCD. BQCD is one of the main production codes of the QCDSF collaboration and is used by CSSM and in some Japanese finite temperature and finite density projects. Since the first publication of the code at Lattice 2010, the program has been extended in various ways. New features of the code include: dynamical QED, action modification in order to compute matrix elements by using the Feynman-Hellmann theorem, more trace measurements (like Tr(D^-n) for K, c_SW and chemical potential reweighting), a more flexible integration scheme, polynomial filtering, term-splitting for RHMC, and a portable implementation of performance-critical parts employing SIMD.
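The Hybrid Monte Carlo kernel at the heart of BQCD, molecular-dynamics evolution by a leapfrog integrator followed by a Metropolis accept/reject step, is shown below for a toy one-dimensional action S(q) = q^2/2; the real code applies the same structure to lattice gauge and fermion fields.

```python
import numpy as np

def hmc_step(q, n_steps=10, eps=0.1, rng=np.random.default_rng()):
    """One HMC update for S(q) = q^2 / 2 (so dS/dq = q)."""
    grad = lambda x: x
    p = rng.normal()                      # refresh conjugate momentum
    q0, h0 = q, 0.5 * p**2 + 0.5 * q**2
    p -= 0.5 * eps * grad(q)              # leapfrog: initial half kick
    for _ in range(n_steps - 1):
        q += eps * p
        p -= eps * grad(q)
    q += eps * p
    p -= 0.5 * eps * grad(q)              # final half kick
    h1 = 0.5 * p**2 + 0.5 * q**2
    return q if rng.random() < np.exp(h0 - h1) else q0   # Metropolis test
```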
1981-01-01
Reference directions at the network ports. In either case, the port voltage may be related to the applied field on the segment by the constant...
Enrichment of Circular Code Motifs in the Genes of the Yeast Saccharomyces cerevisiae.
Michel, Christian J; Ngoune, Viviane Nguefack; Poch, Olivier; Ripp, Raymond; Thompson, Julie D
2017-12-03
A set X of 20 trinucleotides has been found to have the highest average occurrence in the reading frame, compared to the two shifted frames, of genes of bacteria, archaea, eukaryotes, plasmids and viruses. This set X has an interesting mathematical property, since X is a maximal C3 self-complementary trinucleotide circular code. Furthermore, any motif obtained from this circular code X has the capacity to retrieve, maintain and synchronize the original (reading) frame. Since 1996, the theory of circular codes in genes has mainly been developed by analysing the properties of the 20 trinucleotides of X, using combinatorics and statistical approaches. For the first time, we test this theory by analysing the X motifs, i.e., motifs from the circular code X, in the complete genome of the yeast Saccharomyces cerevisiae. Several properties of X motifs are identified by basic statistics (at the frequency level) and evaluated by comparison to R motifs, i.e., random motifs generated from 30 different random codes R. We first show that the frequency of X motifs is significantly greater than that of R motifs in the genome of S. cerevisiae. We then verify that no significant difference is observed between the frequencies of X and R motifs in the non-coding regions of S. cerevisiae, but that the occurrence number of X motifs is significantly higher than that of R motifs in the genes (protein-coding regions). This property holds for all cardinalities of X motifs (from 4 to 20) and for all 16 chromosomes. We further investigate the distribution of X motifs in the three frames of S. cerevisiae genes and show that they occur more frequently in the reading frame, regardless of their cardinality or length. Finally, the ratio of X genes, i.e., genes with at least one X motif, to non-X genes in the set of verified genes differs significantly from that observed in the set of putative or dubious genes with no experimental evidence. Taken together, these results represent the first evidence for a significant enrichment of X motifs in the genes of an extant organism. They raise two hypotheses: the X motifs may be evolutionary relics of the primitive codes used for translation, or they may continue to play a functional role in the complex processes of genome decoding and protein synthesis.
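A minimal sketch of the frame-counting idea is given below. The 20-trinucleotide set used is the commonly cited circular code X of Arquès and Michel; treat the list as an assumption to be checked against the paper, and note that the authors' actual analysis of X motifs (runs of consecutive X trinucleotides) is considerably more involved:

    # commonly cited maximal C3 self-complementary circular code X (assumed list)
    X = {"AAC","AAT","ACC","ATC","ATT","CAG","CTC","CTG","GAA","GAC",
         "GAG","GAT","GCC","GGC","GGT","GTA","GTC","GTT","TAC","TTC"}

    def frame_counts(seq):
        """Count trinucleotides of X read in each of the three frames of seq."""
        counts = [0, 0, 0]
        for frame in range(3):
            for i in range(frame, len(seq) - 2, 3):
                if seq[i:i+3] in X:
                    counts[frame] += 1
        return counts

    # toy usage: for a real gene, frame 0 (the reading frame) is expected to score highest
    print(frame_counts("ATGGACGTTAACCTGGAAATCTAA"))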
NASA Astrophysics Data System (ADS)
Kesler, Steven R.
The lifting line theory was first developed by Prandtl and was used primarily in the analysis of airplane wings. Though the theory is about one hundred years old, it is still used in initial calculations of wing lift. The question that guided this thesis was, "How closely does Prandtl's lifting line theory predict the thrust of a propeller?" To answer this question, an experiment was designed that measured the thrust of a propeller at different speeds, and the measured thrust was compared with the theoretical prediction. A walnut-wood ultralight propeller measuring 1.30 meters (51 inches) from tip to tip was chosen for the tests. In this thesis, Prandtl's lifting line theory was modified to account for the incoming velocity varying with the radial position of the airfoil section, and a working code was developed from the modified equation. A testing rig was built that allowed the propeller to be rotated at high speed while measuring thrust. During testing, the rotational speed of the propeller ranged from 13 to 43 rotations per second, and the measured thrust ranged from 16 to 33 newtons. The test data were then compared with the theoretical results from the lifting line code. A plot in Chapter 5 (the results section) shows theoretical versus measured thrust at different rotational speeds. The theory overpredicted the actual thrust of the propeller, with an error that grew with rotational speed: about 36% at low speeds, 84% at low to moderate speeds, and up to 195% at high speeds. Possible reasons for these errors are discussed.
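The modification described, letting the incoming velocity vary with radial station, can be sketched with a crude blade-element thrust estimate. The geometry values, the constant chord, the thin-airfoil lift slope, the fixed local angle of attack, and the neglect of induced velocities are all illustrative assumptions and not the thesis code:

    import numpy as np

    rho = 1.225              # air density [kg/m^3]
    V_inf = 5.0              # axial inflow speed [m/s] (assumed)
    omega = 2*np.pi*30.0     # 30 rev/s, within the tested 13-43 rev/s range
    R, R_hub = 0.65, 0.10    # tip and hub radii [m]; 0.65 m matches the 1.30 m diameter
    B = 2                    # number of blades (assumed)
    chord = 0.06             # constant chord [m] (assumed)
    a0 = 2*np.pi             # thin-airfoil lift-curve slope [1/rad]
    alpha = np.deg2rad(5.0)  # assumed constant local angle of attack

    r = np.linspace(R_hub, R, 200)
    V_local = np.sqrt(V_inf**2 + (omega*r)**2)          # resultant velocity per station
    phi = np.arctan2(V_inf, omega*r)                    # inflow angle
    dL = 0.5 * rho * V_local**2 * chord * (a0 * alpha)  # lift per unit span
    T = B * np.trapz(dL * np.cos(phi), r)               # axial component over the span
    print(f"estimated thrust: {T:.1f} N")

Neglecting the induced velocities overstates the local angle of attack, which is consistent with the kind of overprediction reported above.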
Agrell, U
1986-07-01
A general stochastic theory of cancer is outlined by applying to cancer the laws of quantum mechanics instead of the laws of traditional physics, especially with regard to the concept of cause. This theory is combined with evolutionary theory on the one hand and the mutation theory of aging/death of multicellular beings consisting of somatic cells on the other. The cancer theory centers on the phenomenon of DNA mutating randomly in quantal steps. Because of mutations in DNA in general, as well as in the particular DNA that codes for the DNA-repair systems, the body becomes permeated over time, via increasing losses of information in the DNA, with increasingly altered proteins; this is observed as the aging process. From this process of entropy the concept of the cancer cell is deduced: when the losses of information in a given cell and in the repair and immunological systems coincide at random, cancer, as a type of antigen, comes into existence. Here the concept of CONCORDANCE OF "BLURRING" is introduced; it occurs randomly approximately once among three times 60,000 billion cells, i.e., three human beings. The so-called "oncogenes" are integrated into this theory. It is proposed to test the theory using monozygotic twins who both suffer from cancer: by injecting monoclonally multiplied immunological systems, and possibly also repair systems, from the respective other twin, the proposition is that the cancer would be cured in both twins. If this critical experiment is successful, human beings suffering from cancer could be cured by the same procedure, using those systems from their relatives. This treatment would cure the cancer to the extent that there is genetic correspondence in the gene sections coding for these systems.
Bai, Jinbing; Swanson, Kristen M; Santacroce, Sheila J
2018-01-01
Parent interactions with their child can influence the child's pain and distress during painful procedures. Reliable and valid interaction analysis systems (IASs) are valuable tools for capturing these interactions, but the extent to which IASs are used in observational research of parent-child interactions in pediatric populations is unknown. The aim was to identify and evaluate studies that focus on assessing the psychometric properties of initial iterations/publications of observational coding systems of parent-child interactions during painful procedures. Computerized databases searched included PubMed, CINAHL, PsycINFO, Health and Psychosocial Instruments, and Scopus, covering from the inception of each database to January 2017. Studies were included if they reported use or psychometrics of parent-child IASs. The first assessment was whether the parent-child IASs were theory-based; next, using the Society of Pediatric Psychology Assessment Task Force criteria, IASs were assigned to one of three categories: well-established, approaching well-established, or promising. A total of 795 studies were identified through computerized searches. Eighteen studies were ultimately determined to be eligible for inclusion in the review, and 17 parent-child IASs were identified from these 18 studies. Among the 17 coding systems, 14 were suitable for use in children aged 3 years or older; two were theory-based; and 11 included verbal and nonverbal parent behaviors that promoted either child coping or child distress. Four IASs were assessed as well-established; seven approached well-established; and six were promising. Findings indicate a need for the development of theory-based parent-child IASs that consider both verbal and nonverbal parent behaviors during painful procedures. Findings also suggest a need for further testing of those parent-child IASs deemed "approaching well-established" or "promising". © 2017 World Institute of Pain.
NASA Astrophysics Data System (ADS)
Messitt, Donald G.
1999-11-01
The WIND code was employed to compute the hypersonic flow in the merged shock wave/boundary layer region near the leading edge of a sharp flat plate. Solutions were obtained at Mach numbers from 9.86 to 15.0 and free stream Reynolds numbers of 3,467 to 346,700 in⁻¹ (1.365 × 10⁵ to 1.365 × 10⁷ m⁻¹) for perfect gas conditions. The numerical results indicated a merged shock wave and viscous layer near the leading edge. The merged region grew in size with increasing free stream Mach number, proportional to M∞²/Re∞. Profiles of the static pressure in the merged region indicated a strong normal pressure gradient (∂p/∂y), which had been neglected in previous analyses based on the boundary layer equations. The shock wave near the leading edge was thick, as has been observed experimentally. Computed shock wave locations and surface pressures agreed well, within experimental error, for values of the rarefaction parameter χ/M∞² < 0.3. A preliminary analysis using kinetic theory indicated that rarefied flow effects become important above this value. In particular, the WIND solution agreed well in the transition region between the merged flow, which was predicted well by the theory of Li and Nagamatsu, and the downstream region where the strong interaction theory applied. Additional computations with the NPARC code, WIND's predecessor, demonstrated the ability of the code to compute hypersonic inlet flows at free stream Mach numbers up to 20. Good qualitative agreement with measured pressure data indicated that the code captured the important physical features of the shock wave-boundary layer interactions. The computed surface and pitot pressures fell within the combined experimental and numerical error bounds for most points. The calculations demonstrated the need for extremely fine grids when computing hypersonic interaction flows.
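The rarefaction criterion quoted above can be evaluated in a few lines. Here χ is taken to be the classical strong-interaction parameter M∞³√(C/Re_x) with Chapman-Rubesin constant C, so that χ/M∞² = M∞√(C/Re_x); the exact definitions used in the thesis should be checked, and all numbers below are assumed:

    import numpy as np

    M_inf = 12.0
    Re_per_m = 1.0e6   # unit Reynolds number [1/m] (assumed)
    C = 1.0            # Chapman-Rubesin constant, taken as unity for illustration

    x = np.array([0.001, 0.01, 0.1, 1.0])  # distance from the leading edge [m]
    Re_x = Re_per_m * x
    chi = M_inf**3 * np.sqrt(C / Re_x)     # strong-interaction parameter
    print(chi / M_inf**2)                  # rarefied-flow effects expected where > ~0.3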
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brower, Richard C.
This proposal is to develop the software and algorithmic infrastructure needed for the numerical study of quantum chromodynamics (QCD), and of theories that have been proposed to describe physics beyond the Standard Model (BSM) of high energy physics, on current and future computers. This infrastructure will enable users (1) to improve the accuracy of QCD calculations to the point where they no longer limit what can be learned from high-precision experiments that seek to test the Standard Model, and (2) to determine the predictions of BSM theories in order to understand which of them are consistent with the data that will soon be available from the LHC. Work will include the extension and optimizations of community codes for the next generation of leadership class computers, the IBM Blue Gene/Q and the Cray XE/XK, and for the dedicated hardware funded for our field by the Department of Energy. Members of our collaboration at Brookhaven National Laboratory and Columbia University worked on the design of the Blue Gene/Q, and have begun to develop software for it. Under this grant we will build upon their experience to produce high-efficiency production codes for this machine. Cray XE/XK computers with many thousands of GPU accelerators will soon be available, and the dedicated commodity clusters we obtain with DOE funding include growing numbers of GPUs. We will work with our partners in NVIDIA's Emerging Technology group to scale our existing software to thousands of GPUs, and to produce highly efficient production codes for these machines. Work under this grant will also include the development of new algorithms for the effective use of heterogeneous computers, and their integration into our codes. It will include improvements of Krylov solvers and the development of new multigrid methods in collaboration with members of the FASTMath SciDAC Institute, using their HYPRE framework, as well as work on improved symplectic integrators.
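Of the solver work mentioned, the Krylov baseline is the easiest to illustrate. Below is a textbook conjugate gradient for a symmetric positive-definite system, a stand-in for solvers applied to the lattice Dirac normal equations and in no way the production code:

    import numpy as np

    def cg(A, b, tol=1e-10, max_iter=1000):
        """Conjugate gradient: approximately solve A x = b for SPD A."""
        x = np.zeros_like(b)
        r = b - A @ x          # initial residual
        p = r.copy()           # initial search direction
        rs = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x

    # toy SPD test problem
    rng = np.random.default_rng(1)
    M = rng.normal(size=(50, 50))
    A = M @ M.T + 50.0 * np.eye(50)
    b = rng.normal(size=50)
    print(np.linalg.norm(A @ cg(A, b) - b))  # residual norm, near zero

Multigrid methods accelerate exactly this kind of iteration by attacking the low modes that make the Dirac operator ill-conditioned.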
Solution of the lossy nonlinear Tricomi equation with application to sonic boom focusing
NASA Astrophysics Data System (ADS)
Salamone, Joseph A., III
Sonic boom focusing theory has been augmented with new terms that account for mean flow effects in the direction of propagation and for atmospheric absorption and dispersion arising from molecular relaxation of oxygen and nitrogen. The newly derived model equation was implemented numerically in a computer code. The code was numerically validated against a spectral solution for nonlinear propagation of a sinusoid through a lossy homogeneous medium, and an additional numerical check verified the linear diffraction component of the calculations. The code was experimentally validated using measured sonic boom focusing data from the NASA-sponsored Superboom Caustic Analysis and Measurement Program (SCAMP) flight test, and it showed good agreement in both the numerical and experimental validations. The newly developed code was then applied to examine the focusing of a NASA low-boom demonstration vehicle concept, and the resulting pressure field was calculated for several supersonic climb profiles. The shaping designed into the signatures remained somewhat evident despite the effects of sonic boom focusing.
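The homogeneous-medium validation case described, nonlinear propagation of a sinusoid through a lossy medium, can be mimicked with a small split-step solver: a simple-wave distortion of the retarded-time grid for the nonlinear step (valid before shock formation) and a spectral multiply for thermoviscous absorption. All parameter values are assumptions, and this is a sketch of the validation idea rather than the dissertation's spectral solution:

    import numpy as np

    N = 1024
    f0 = 1000.0                          # source frequency [Hz] (assumed)
    T = 1.0 / f0
    tau = np.linspace(0.0, T, N, endpoint=False)
    p = 1.0e4 * np.sin(2*np.pi*f0*tau)   # initial waveform [Pa] (assumed amplitude)

    rho0, c0 = 1.2, 340.0                # air density [kg/m^3] and sound speed [m/s]
    beta = 1.2                           # coefficient of nonlinearity for air
    delta = 5.0e-5                       # sound diffusivity [m^2/s] (assumed)

    omega = 2*np.pi*np.fft.rfftfreq(N, d=T/N)
    dx = 0.02                            # marching step [m]
    for _ in range(25):                  # 0.5 m total, below the ~0.6 m shock distance
        # nonlinear sub-step: simple-wave distortion of the retarded-time grid
        tau_d = tau + beta * p / (rho0 * c0**3) * dx
        p = np.interp(tau, tau_d, p, period=T)
        # absorption sub-step: thermoviscous decay applied spectrally
        P = np.fft.rfft(p)
        P *= np.exp(-delta * omega**2 / (2.0 * c0**3) * dx)
        p = np.fft.irfft(P, n=N)
    print(p.max(), p.min())              # steepened, slightly attenuated waveform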
NASA Astrophysics Data System (ADS)
Menthe, R. W.; McColgan, C. J.; Ladden, R. M.
1991-05-01
The Unified AeroAcoustic Program (UAAP) code calculates the airloads on a single-rotation prop-fan, or propeller, and couples these airloads with an acoustic radiation theory to provide estimates of near-field or far-field noise levels. The steady airloads can also be used to calculate the nonuniform velocity components in the propeller wake. The airloads are calculated using a three-dimensional compressible panel method that accounts for multiple thin, cambered blades, which may be highly swept; these airloads may be either steady or unsteady. The acoustic model uses the blade thickness distribution and the steady or unsteady aerodynamic loads to calculate the acoustic radiation. The user's manual for the UAAP code is divided into five sections: general code description, input description, output description, system description, and error codes. The user must have access to the IMSL10 libraries (MATH and SFUN) for the numerous calls to Bessel functions and matrix inversion routines. For plotted output, users must modify the dummy calls to plotting routines included in the code to system-specific calls appropriate to the user's installation.
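Users without IMSL access may note that the two capabilities called out, Bessel functions and matrix inversion, have widely available open equivalents; the mapping below is a hypothetical porting aid, not part of the UAAP distribution:

    import numpy as np
    from scipy.special import jv, yv  # Bessel functions of the first and second kind

    # stand-ins for the IMSL MATH/SFUN calls (IMSL routine names are not reproduced here)
    x = 2.5
    print(jv(0, x), yv(1, x))         # J0(x) and Y1(x)

    A = np.array([[4.0, 1.0], [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    print(np.linalg.solve(A, b))      # solve A x = b rather than inverting A explicitly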