Science.gov

Sample records for investigators independently coded

  1. Correlation and independence in the neural code.

    PubMed

    Amari, Shun-ichi; Nakahara, Hiroyuki

    2006-06-01

    In neural population coding, the scheme used to decode a stimulus can differ from the stochastic scheme by which the stimulus is encoded. The stochastic fluctuations are not independent in general, but an independent approximation may be used for ease of decoding. How much information is lost by using this unfaithful model for decoding? The question has been debated (Nirenberg & Latham, 2003; Schneidman, Bialek, & Berry, 2003). We elucidate the Nirenberg-Latham loss from the point of view of information geometry. PMID:16764504
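
    A toy illustration, not from the paper: for two binary neurons with correlated noise, the Nirenberg-Latham loss can be computed by comparing the true posterior p(s|r) with the posterior of the independence-assuming ("unfaithful") model. Everything below (stimuli, distributions) is hypothetical; a minimal Python sketch:

        import numpy as np

        # Hypothetical encoding model p(r1, r2 | s) for two binary neurons and
        # two equiprobable stimuli; the response noise is deliberately correlated.
        p_r_given_s = {
            0: np.array([[0.40, 0.10],
                         [0.10, 0.40]]),   # p(r1, r2 | s=0): rows r1, cols r2
            1: np.array([[0.10, 0.40],
                         [0.40, 0.10]]),   # p(r1, r2 | s=1)
        }
        p_s = {0: 0.5, 1: 0.5}

        def independent_model(p):
            """Product of the marginals -- the 'unfaithful' decoding model."""
            return np.outer(p.sum(axis=1), p.sum(axis=0))

        p_ind = {s: independent_model(p) for s, p in p_r_given_s.items()}

        # Nirenberg-Latham loss: sum over (s, r) of p(s, r) * log2 of the ratio
        # between the true posterior p(s|r) and the independent-model posterior.
        delta_i = 0.0
        for r1 in range(2):
            for r2 in range(2):
                joint = {s: p_s[s] * p_r_given_s[s][r1, r2] for s in p_s}
                joint_ind = {s: p_s[s] * p_ind[s][r1, r2] for s in p_s}
                z, z_ind = sum(joint.values()), sum(joint_ind.values())
                for s in p_s:
                    delta_i += joint[s] * np.log2((joint[s] / z) / (joint_ind[s] / z_ind))

        print(f"Nirenberg-Latham loss: {delta_i:.3f} bits")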

  2. Evidence for modality-independent order coding in working memory.

    PubMed

    Depoorter, Ann; Vandierendonck, André

    2009-03-01

    The aim of the present study was to investigate the representation of serial order in working memory, more specifically whether serial order is coded by means of a modality-dependent or a modality-independent order code. This was investigated by means of a series of four experiments based on a dual-task methodology in which one short-term memory task was embedded between the presentation and recall of another short-term memory task. Two aspects were varied in these memory tasks--namely, the modality of the stimulus materials (verbal or visuo-spatial) and the presence of an order component in the task (an order or an item memory task). The results of this study showed impaired primary-task recognition performance when both the primary and the embedded task included an order component, irrespective of the modality of the stimulus materials. If one or both of the tasks did not contain an order component, less interference was found. The results of this study support the existence of a modality-independent order code. PMID:18609385

  3. The design of relatively machine-independent code generators

    NASA Technical Reports Server (NTRS)

    Noonan, R. E.

    1979-01-01

    Two complementary approaches were investigated. In the first approach, software design techniques were used to design the structure of a code generator for Halmat. The major result was the development of an intermediate code form known as 7UP. The second approach viewed the problem as one of providing a tool to the code generator programmer. The major result was the development of a non-procedural, problem-oriented language known as CGGL (Code Generator Generator Language).

  4. Independent rate and temporal coding in hippocampal pyramidal cells

    PubMed Central

    Huxter, John; Burgess, Neil; O’Keefe, John

    2009-01-01

    Hippocampal pyramidal cells use temporal [1] as well as rate [2] coding to signal spatial aspects of the animal’s environment or behaviour. The temporal code takes the form of a phase relationship to the concurrent cycle of the hippocampal EEG theta rhythm (Figure 1; [1]). These two codes could each represent a different variable [3,4]. However, this requires that rate and phase can vary independently, in contrast to recent suggestions [5,6] that they are tightly coupled, both reflecting the amplitude of the cell’s input. Here we show that the time of firing and firing rate are dissociable and can represent two independent variables: the animal’s location within the place field and its speed of movement through the field, respectively. Independent encoding of location together with actions and stimuli occurring there may help to explain the dual roles of the hippocampus in spatial and episodic memory [7,8], or a more general role in relational/declarative memory [9,10]. PMID:14574410
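
    A schematic simulation (ours, not the authors' analysis) of the claim: a model place cell whose firing rate depends only on running speed and whose theta firing phase depends only on position, so each variable is recoverable from "its" code alone. All numbers are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(0)
        n_runs = 200

        speed = rng.uniform(5.0, 50.0, n_runs)     # cm/s on each pass through the field
        position = rng.uniform(0.0, 1.0, n_runs)   # normalized position within the field

        rate = 2.0 + 0.5 * speed                                             # rate codes speed only
        phase = 2 * np.pi * (1.0 - position) + rng.normal(0.0, 0.2, n_runs)  # phase codes position only

        c = np.corrcoef
        print("rate  vs speed    r =", round(c(rate, speed)[0, 1], 3))      # ~1
        print("phase vs position r =", round(c(phase, position)[0, 1], 3))  # ~ -1
        print("rate  vs position r =", round(c(rate, position)[0, 1], 3))   # ~0
        print("phase vs speed    r =", round(c(phase, speed)[0, 1], 3))     # ~0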

  5. Benchmark testing and independent verification of the VS2DT computer code

    SciTech Connect

    McCord, J.T.; Goodrich, M.T.

    1994-11-01

    The finite difference flow and transport simulator VS2DT was benchmark tested against several other codes which solve the same equations (Richards equation for flow and the Advection-Dispersion equation for transport). The benchmark problems investigated transient two-dimensional flow in a heterogeneous soil profile with a localized water source at the ground surface. The VS2DT code performed as well as or better than all other codes when considering mass balance characteristics and computational speed. It was also rated highly relative to the other codes with regard to ease-of-use. Following the benchmark study, the code was verified against two analytical solutions, one for two-dimensional flow and one for two-dimensional transport. These independent verifications show reasonable agreement with the analytical solutions, and complement the one-dimensional verification problems published in the code's original documentation.

  6. Investigation of Near Shannon Limit Coding Schemes

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Kim, J.; Mo, Fan

    1999-01-01

    Turbo codes can deliver performance that is very close to the Shannon limit. This report investigates algorithms for convolutional turbo codes and block turbo codes; both coding schemes can achieve performance near the Shannon limit. The performance of the schemes is obtained using computer simulations. There are three sections in this report. The first section is an introduction that reviews the fundamentals of coding, block coding, and convolutional coding. In the second section, the basic concepts of convolutional turbo codes are introduced and the performance of turbo codes, especially high-rate turbo codes, is presented from the simulation results. After introducing the parameters that help turbo codes achieve such good performance, it is concluded that the output weight distribution should be the main consideration in designing turbo codes. Based on the output weight distribution, performance bounds for turbo codes are given. Then, the relationships between the output weight distribution and factors such as the generator polynomial, the interleaver and the puncturing pattern are examined, and a criterion for the best selection of system components is provided. The puncturing pattern algorithm is discussed in detail, and different puncturing patterns are compared for each high rate. For most of the high-rate codes, the puncturing pattern does not show any significant effect on the code performance if a pseudo-random interleaver is used in the system. For some special-rate codes with poor performance, an alternative puncturing algorithm is designed which restores their performance close to the Shannon limit. Finally, in section three, iterative decoding of block codes is treated: the method of building a trellis for block codes, the structure of the iterative decoding system, and the calculation of extrinsic values are discussed.
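
    As a concrete illustration of how puncturing raises the code rate (a generic textbook pattern, not one of the report's specific patterns): a rate-1/3 turbo encoder emits one systematic and two parity bits per input bit; deleting parity bits alternately yields rate 1/2.

        # Puncture a rate-1/3 turbo-coded stream (systematic bit + 2 parity bits
        # per input bit) to rate 1/2: always keep the systematic bit and
        # alternate which parity stream is transmitted.
        def puncture_rate_half(systematic, parity1, parity2):
            out = []
            for k, s in enumerate(systematic):
                out.append(s)
                out.append(parity1[k] if k % 2 == 0 else parity2[k])
            return out

        sys_bits = [1, 0, 1, 1]
        p1_bits = [0, 1, 1, 0]
        p2_bits = [1, 1, 0, 0]

        tx = puncture_rate_half(sys_bits, p1_bits, p2_bits)
        print(tx)                       # 8 coded bits for 4 information bits
        print(len(sys_bits) / len(tx))  # rate 0.5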

  7. Quantum image coding with a reference-frame-independent scheme

    NASA Astrophysics Data System (ADS)

    Chapeau-Blondeau, François; Belin, Etienne

    2016-07-01

    For binary images, or bit planes of non-binary images, we investigate the possibility of a quantum coding decodable by a receiver in the absence of reference frames shared with the emitter. Direct image coding with one qubit per pixel and non-aligned frames leads to decoding errors equivalent to a quantum bit-flip noise increasing with the misalignment. We show the feasibility of frame-invariant coding by using for each pixel a qubit pair prepared in one of two controlled entangled states. With just one common axis shared between the emitter and receiver, exact decoding for each pixel can be obtained by means of two two-outcome projective measurements operating separately on each qubit of the pair. With strictly no alignment information between the emitter and receiver, exact decoding can be obtained by means of a two-outcome projective measurement operating jointly on the qubit pair. In addition, the frame-invariant coding is shown much more resistant to quantum bit-flip noise compared to the direct non-invariant coding. For a cost per pixel of two (entangled) qubits instead of one, complete frame-invariant image coding and enhanced noise resistance are thus obtained.
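
    A sketch of the invariance such a scheme can exploit (our illustration; the paper's exact states and measurements may differ): the singlet state of a qubit pair is unchanged, up to phase, by any misalignment U applied identically to both qubits, so a joint two-outcome measurement {P_singlet, I - P_singlet} decodes a bit exactly without any shared frame.

        import numpy as np

        rng = np.random.default_rng(1)

        # Illustrative encoding: bit 0 -> singlet (antisymmetric) state,
        # bit 1 -> a symmetric (triplet) state.
        singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)   # (|01> - |10>)/sqrt(2)
        triplet = np.array([0, 1, 1, 0]) / np.sqrt(2)    # (|01> + |10>)/sqrt(2)
        p_singlet = np.outer(singlet, singlet.conj())    # projector onto the singlet

        def random_unitary(rng):
            """Haar-style random single-qubit unitary (QR decomposition trick)."""
            z = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
            q, r = np.linalg.qr(z)
            return q * (np.diag(r) / np.abs(np.diag(r)))

        # A frame misalignment acts as U (x) U on the pair; the singlet-projector
        # outcome is invariant, so the bit is decoded exactly every time.
        for _ in range(3):
            u = random_unitary(rng)
            uu = np.kron(u, u)
            for label, state in (("bit 0", singlet), ("bit 1", triplet)):
                psi = uu @ state
                prob = np.real(psi.conj() @ p_singlet @ psi)
                print(label, "-> P(singlet outcome) =", round(prob, 6))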

  8. Independent accident investigation: a modern safety tool.

    PubMed

    Stoop, John A

    2004-07-26

    Historically, safety has been subjected to a fragmented approach. In the past, every department has had its own responsibility towards safety, focusing either on working conditions, internal safety, external safety, rescue and emergency, public order or security. They each issued policy documents, which in their time were leading statements for elaboration and regulation. They also addressed safety issues with tools of various nature, often specifically developed within their domain. Due to a series of major accidents and disasters, the focus of attention is shifting from complying with quantitative risk standards towards intervention in primary operational processes, coping with systemic deficiencies and a more integrated assessment of safety in its societal context. In The Netherlands recognition of the importance of independent investigations has led to an expansion of this philosophy from the transport sector to other sectors. The philosophy now covers transport, industry, defense, natural disaster, environment and health and other major occurrences such as explosions, fires, and collapse of buildings or structures. In 2003 a multi-sector law will establish an independent safety board in The Netherlands. At a European level, mandatory investigation agencies are recognized as indispensable safety instruments for aviation, railways and the maritime sector, for which EU Directives are in place or being progressed [Transport accident and incident investigation in the European Union, European Transport Safety Council, ISBN 90-76024-10-3, Brussel, 2001]. Due to a series of major events, attention has been drawn to the consequences of disasters, highlighting the involvement of rescue and emergency services. They also have become subjected to investigative efforts, which, in turn, places demands on investigation methodology. This paper comments on an evolutionary development in safety thinking and of safety boards, highlighting some consequences for strategic

  9. Independent Population Coding of Speech with Sub-Millisecond Precision

    PubMed Central

    Garcia-Lazaro, Jose A.; Belliveau, Lucile A. C.

    2013-01-01

    To understand the strategies used by the brain to analyze complex environments, we must first characterize how the features of sensory stimuli are encoded in the spiking of neuronal populations. Characterizing a population code requires identifying the temporal precision of spiking and the extent to which spiking is correlated, both between cells and over time. In this study, we characterize the population code for speech in the gerbil inferior colliculus (IC), the hub of the auditory system where inputs from parallel brainstem pathways are integrated for transmission to the cortex. We find that IC spike trains can carry information about speech with sub-millisecond precision, and, consequently, that the temporal correlations imposed by refractoriness can play a significant role in shaping spike patterns. We also find that, in contrast to most other brain areas, the noise correlations between IC cells are extremely weak, indicating that spiking in the population is conditionally independent. These results demonstrate that the problem of understanding the population coding of speech can be reduced to the problem of understanding the stimulus-driven spiking of individual cells, suggesting that a comprehensive model of the subcortical processing of speech may be attainable in the near future. PMID:24305831
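
    A sketch (ours) of the noise-correlation measure at issue: subtract each cell's stimulus-driven mean response, then correlate the residual trial-to-trial fluctuations of the two cells. With independent noise, as reported for the IC, the value is near zero.

        import numpy as np

        rng = np.random.default_rng(2)
        n_trials, n_stimuli = 100, 8

        # Simulated spike counts (cells x trials x stimuli); the means are
        # stimulus-driven, the trial noise is independent across the two cells.
        mean_counts = rng.uniform(2.0, 20.0, size=(2, 1, n_stimuli))
        counts = rng.poisson(mean_counts, size=(2, n_trials, n_stimuli))

        # Noise correlation: correlate residuals around the per-stimulus means.
        residuals = counts - counts.mean(axis=1, keepdims=True)
        r = np.corrcoef(residuals[0].ravel(), residuals[1].ravel())[0, 1]
        print(f"noise correlation ~ {r:.3f}")   # near 0 for independent noise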

  10. Investigating the Simulink Auto-Coding Process

    NASA Technical Reports Server (NTRS)

    Gualdoni, Matthew J.

    2016-01-01

    the program; additionally, this is lost time that could be spent testing and analyzing the code. This is one of the more prominent issues with the auto-coding process, and while much information is available with regard to optimizing Simulink designs to produce efficient and reliable C++ code, not much research has been made public on how to reduce the code generation time. It is of interest to develop some insight as to what causes code generation times to be so significant, and determine if there are architecture guidelines or a desirable auto-coding configuration set to assist in streamlining this step of the design process for particular applications. To address the issue at hand, the Simulink coder was studied at a foundational level. For each different component type made available by the software, the features, auto-code generation time, and the format of the generated code were analyzed and documented. Tools were developed and documented to expedite these studies, particularly in the area of automating sequential builds to ensure accurate data was obtained. Next, the Ramses model was examined in an attempt to determine the composition and the types of technologies used in the model. This enabled the development of a model that uses similar technologies, but takes a fraction of the time to auto-code to reduce the turnaround time for experimentation. Lastly, the model was used to run a wide array of experiments and collect data to obtain knowledge about where to search for bottlenecks in the Ramses model. The resulting contributions of the overall effort consist of an experimental model for further investigation into the subject, as well as several automation tools to assist in analyzing the model, and a reference document offering insight to the auto-coding process, including documentation of the tools used in the model analysis, data illustrating some potential problem areas in the auto-coding process, and recommendations on areas or practices in the current

  11. Implementation of context independent code on a new array processor: The Super-65

    NASA Technical Reports Server (NTRS)

    Colbert, R. O.; Bowhill, S. A.

    1981-01-01

    The feasibility of rewriting standard uniprocessor programs into code which contains no context-dependent branches is explored. Context independent code (CIC) would contain no branches that might require different processing elements to branch different ways. In order to investigate the possibilities and restrictions of CIC, several programs were recoded into CIC and a four-element array processor was built. This processor (the Super-65) consisted of three 6502 microprocessors and the Apple II microcomputer. The results obtained were somewhat dependent upon the specific architecture of the Super-65 but within bounds, the throughput of the array processor was found to increase linearly with the number of processing elements (PEs). The slope of throughput versus PEs is highly dependent on the program and varied from 0.33 to 1.00 for the sample programs.
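
    A minimal sketch of the recoding idea (our example, not from the report): replace a data-dependent branch with arithmetic on 0/1 comparison results, so every processing element executes the identical instruction stream regardless of its data.

        # Branching version: different data would take different paths.
        def clip_branch(x, lo, hi):
            if x < lo:
                return lo
            if x > hi:
                return hi
            return x

        # Context-independent version: no branches; comparisons used as 0/1.
        def clip_cic(x, lo, hi):
            below = int(x < lo)
            above = int(x > hi)
            keep = 1 - below - above
            return below * lo + above * hi + keep * x

        for x in (-5, 3, 12):
            assert clip_branch(x, 0, 10) == clip_cic(x, 0, 10)
            print(x, "->", clip_cic(x, 0, 10))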

  12. Results of investigation of adaptive speech codes

    NASA Astrophysics Data System (ADS)

    Nekhayev, A. L.; Pertseva, V. A.; Sitnyakovskiy, I. V.

    1984-06-01

    A search for ways of increasing the effectiveness of speech signals in digital form led to the appearance of various methods of encoding that reduce the redundancy arising from specific properties of the speech signal. It is customary to divide speech codes into two large classes: codes of signal parameters (vocoders) and codes of the signal form (CSF). In telephony, preference is given to the second class of systems, which maintains naturalness of sound. The class of CSF expanded considerably because of the development of codes based on the frequency representation of a signal. The greatest interest attaches to such methods of encoding as pulse-code modulation (PCM), differential PCM (DPCM), and delta modulation (DM). However, developers of digital transmission systems find it difficult to form a complete picture of the applicability of specific types of codes. The best-known versions of the codes are evaluated by means of subjective-statistical measurements of their characteristics. The results obtained help developers to draw conclusions regarding the applicability of the codes considered in various communication systems.
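
    A minimal sketch of one codec family mentioned above: DPCM with a previous-sample predictor and a uniform quantizer (the step size and test signal are illustrative; real speech DPCM typically uses adaptive prediction).

        import numpy as np

        def dpcm_encode(x, step):
            """Quantize the error between each sample and the predicted
            (previous reconstructed) sample; mirror the decoder to avoid drift."""
            codes, prev = [], 0.0
            for s in x:
                q = int(round((s - prev) / step))
                codes.append(q)
                prev += q * step        # decoder-side reconstruction
            return codes

        def dpcm_decode(codes, step):
            out, prev = [], 0.0
            for q in codes:
                prev += q * step
                out.append(prev)
            return np.array(out)

        t = np.linspace(0.0, 1.0, 50)
        x = np.sin(2 * np.pi * 3 * t)           # stand-in for a speech segment
        xhat = dpcm_decode(dpcm_encode(x, 0.05), 0.05)
        print("max reconstruction error:", float(np.max(np.abs(x - xhat))))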

  13. Two independent transcription initiation codes overlap on vertebrate core promoters

    NASA Astrophysics Data System (ADS)

    Haberle, Vanja; Li, Nan; Hadzhiev, Yavor; Plessy, Charles; Previti, Christopher; Nepal, Chirag; Gehrig, Jochen; Dong, Xianjun; Akalin, Altuna; Suzuki, Ana Maria; van Ijcken, Wilfred F. J.; Armant, Olivier; Ferg, Marco; Strähle, Uwe; Carninci, Piero; Müller, Ferenc; Lenhard, Boris

    2014-03-01

    A core promoter is a stretch of DNA surrounding the transcription start site (TSS) that integrates regulatory inputs and recruits general transcription factors to initiate transcription. The nature and causative relationship of the DNA sequence and chromatin signals that govern the selection of most TSSs by RNA polymerase II remain unresolved. Maternal to zygotic transition represents the most marked change of the transcriptome repertoire in the vertebrate life cycle. Early embryonic development in zebrafish is characterized by a series of transcriptionally silent cell cycles regulated by inherited maternal gene products: zygotic genome activation commences at the tenth cell cycle, marking the mid-blastula transition. This transition provides a unique opportunity to study the rules of TSS selection and the hierarchy of events linking transcription initiation with key chromatin modifications. We analysed TSS usage during zebrafish early embryonic development at high resolution using cap analysis of gene expression, and determined the positions of H3K4me3-marked promoter-associated nucleosomes. Here we show that the transition from the maternal to zygotic transcriptome is characterized by a switch between two fundamentally different modes of defining transcription initiation, which drive the dynamic change of TSS usage and promoter shape. A maternal-specific TSS selection, which requires an A/T-rich (W-box) motif, is replaced with a zygotic TSS selection grammar characterized by broader patterns of dinucleotide enrichments, precisely aligned with the first downstream (+1) nucleosome. The developmental dynamics of the H3K4me3-marked nucleosomes reveal their DNA-sequence-associated positioning at promoters before zygotic transcription and subsequent transcription-independent adjustment to the final position downstream of the zygotic TSS. The two TSS-defining grammars coexist, often physically overlapping, in core promoters of constitutively expressed genes to enable

  14. Pcigale: Porting Code Investigating Galaxy Emission to Python

    NASA Astrophysics Data System (ADS)

    Roehlly, Y.; Burgarella, D.; Buat, V.; Boquien, M.; Ciesla, L.; Heinis, S.

    2014-05-01

    We present pcigale, the port to Python of CIGALE (Code Investigating Galaxy Emission) a Fortran spectral energy distribution (SED) fitting code developed at the Laboratoire d'Astrophysique de Marseille. After recalling the specifics of the SED fitting method, we show the gains in modularity and versatility offered by Python, as well as the drawbacks compared to the compiled code.

  15. A coding-independent function of gene and pseudogene mRNAs regulates tumour biology

    PubMed Central

    Poliseno, Laura; Salmena, Leonardo; Zhang, Jiangwen; Carver, Brett; Haveman, William J.; Pandolfi, Pier Paolo

    2011-01-01

    The canonical role of messenger RNA (mRNA) is to deliver protein-coding information to sites of protein synthesis. However, given that microRNAs bind to RNAs, we hypothesized that RNAs possess a biological role in cancer cells that relies upon their ability to compete for microRNA binding and is independent of their protein-coding function. As a paradigm for the protein-coding-independent role of RNAs, we describe the functional relationship between the mRNAs produced by the PTEN tumour suppressor gene and its pseudogene (PTENP1) and the critical consequences of this interaction. We find that PTENP1 is biologically active as determined by its ability to regulate cellular levels of PTEN, and that it can exert a growth-suppressive role. We also show that the PTENP1 locus is selectively lost in human cancer. We extend our analysis to other cancer-related genes that possess pseudogenes, such as oncogenic KRAS. Further, we demonstrate that the transcripts of protein-coding genes such as PTEN are also biologically active. Together, these findings attribute a novel biological role to expressed pseudogenes, as they can regulate coding gene expression, and reveal a non-coding function for mRNAs. PMID:20577206

  16. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1993-01-01

    The first year's effort on NASA Grant NAG5-2006 was an investigation to characterize typical errors resulting from the EOS downlink. The analysis methods developed for this effort were used on test data from a March 1992 White Sands Terminal Test. The effectiveness of a concatenated coding scheme of a Reed-Solomon outer code and a convolutional inner code versus a Reed-Solomon only code scheme has been investigated, as well as the effectiveness of a Periodic Convolutional Interleaver in dispersing errors of certain types. The work effort consisted of development of software that allows simulation studies with the appropriate coding schemes plus either simulated data with errors or actual data with errors. The software program is entitled Communication Link Error Analysis (CLEAN) and models downlink errors, forward error correcting schemes, and interleavers.
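
    A sketch (ours, not CLEAN itself) of why an interleaver disperses errors "of certain types": a block interleaver writes bits row-wise and reads them column-wise, so a channel burst is spread into isolated single errors at the deinterleaver output.

        import numpy as np

        def interleave(bits, rows, cols):
            """Block interleaver: write row-wise, read column-wise."""
            return np.asarray(bits).reshape(rows, cols).T.ravel()

        def deinterleave(bits, rows, cols):
            return np.asarray(bits).reshape(cols, rows).T.ravel()

        rows, cols = 8, 16              # illustrative dimensions
        data = np.zeros(rows * cols, dtype=int)

        tx = interleave(data, rows, cols)
        tx[40:48] ^= 1                  # an 8-bit burst hits the channel
        rx = deinterleave(tx, rows, cols)

        err = np.flatnonzero(rx)        # the burst is now 8 isolated errors
        print("error positions:", err)
        print("minimum spacing:", int(np.diff(err).min()))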

  17. Independent verification and validation testing of the FLASH computer code, Version 3.0

    SciTech Connect

    Martian, P.; Chung, J.N.

    1992-06-01

    Independent testing of the FLASH computer code, Version 3.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at various Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Verification tests and validation tests were used to determine the operational status of the FLASH computer code. These tests were specifically designed to test: correctness of the FORTRAN coding, computational accuracy, and suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: blind testing, independent applications, and graduated difficulty of test cases. Both quantitative and qualitative testing was performed through evaluating relative root mean square values and graphical comparisons of the numerical, analytical, and experimental data. Four verification tests were used to check the computational accuracy and correctness of the FORTRAN coding, and three validation tests were used to check the suitability for simulating actual conditions. These test cases ranged in complexity from simple 1-D saturated flow to 2-D variably saturated problems. The verification tests showed excellent quantitative agreement between the FLASH results and analytical solutions. The validation tests showed good qualitative agreement with the experimental data. Based on the results of this testing, it was concluded that the FLASH code is a versatile and powerful two-dimensional analysis tool for fluid flow. In conclusion, all aspects of the code that were tested, except for the unit gradient bottom boundary condition, were found to be fully operational and ready for use in hydrological and environmental studies.
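
    A sketch of the quantitative metric described (definitions of "relative RMS" vary; this version normalizes the RMS error by the RMS of the analytical solution, and the profiles are invented, not FLASH output):

        import numpy as np

        def relative_rms(numerical, analytical):
            num, ana = np.asarray(numerical), np.asarray(analytical)
            return np.sqrt(np.mean((num - ana) ** 2)) / np.sqrt(np.mean(ana ** 2))

        # Illustrative 1-D profiles: analytical solution vs. code output.
        x = np.linspace(0.0, 1.0, 21)
        analytic = 1.0 - x                              # linear head drop
        numeric = analytic + 0.01 * np.sin(np.pi * x)   # small numerical error

        print(f"relative RMS = {relative_rms(numeric, analytic):.4f}")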

  18. Category-dependent and category-independent goal-value codes in human ventromedial prefrontal cortex.

    PubMed

    McNamee, Daniel; Rangel, Antonio; O'Doherty, John P

    2013-04-01

    To choose between manifestly distinct options, the brain is thought to assign values to goals using a common currency. Although previous studies have reported activity in ventromedial prefrontal cortex (vmPFC) correlating with the value of different goal stimuli, it remains unclear whether such goal-value representations are independent of the associated stimulus categorization, as required by a common currency. Using multivoxel pattern analyses on functional magnetic resonance imaging (fMRI) data, we found a region of medial prefrontal cortex to contain a distributed goal-value code that is independent of stimulus category. More ventrally in the vmPFC, we found spatially distinct areas of the medial orbitofrontal cortex to contain unique category-dependent distributed value codes for food and consumer items. These results implicate the medial prefrontal cortex in the implementation of a common currency and suggest a ventral versus dorsal topographical organization of value signals in the vmPFC. PMID:23416449

  1. Independent validation testing of the FLAME computer code, Version 1.0

    SciTech Connect

    Martian, P.; Chung, J.N.

    1992-07-01

    Independent testing of the FLAME computer code, Version 1.0, was conducted to determine if the code is ready for use in hydrological and environmental studies at Department of Energy sites. This report describes the technical basis, approach, and results of this testing. Validation tests (i.e., tests which compare field data to the computer-generated solutions) were used to determine the operational status of the FLAME computer code and were done on a qualitative basis through graphical comparisons of the experimental and numerical data. These tests were specifically designed to check: (1) correctness of the FORTRAN coding, (2) computational accuracy, and (3) suitability for simulating actual hydrologic conditions. This testing was performed using a structured evaluation protocol which consisted of: (1) independent applications, and (2) graduated difficulty of test cases. Three tests were used, ranging in complexity from simple one-dimensional steady-state flow field problems under near-saturated conditions to two-dimensional transient flow problems with very dry initial conditions.

  2. Error tolerance of topological codes with independent bit-flip and measurement errors

    NASA Astrophysics Data System (ADS)

    Andrist, Ruben S.; Katzgraber, Helmut G.; Bombin, H.; Martin-Delgado, M. A.

    2016-07-01

    Topological quantum error correction codes are currently among the most promising candidates for efficiently dealing with the decoherence effects inherently present in quantum devices. Numerically, their theoretical error threshold can be calculated by mapping the underlying quantum problem to a related classical statistical-mechanical spin system with quenched disorder. Here, we present results for the general fault-tolerant regime, where we consider both qubit and measurement errors. However, unlike in previous studies, here we vary the strength of the different error sources independently. Our results highlight peculiar differences between toric and color codes. This study complements previous results published in New J. Phys. 13, 083006 (2011), 10.1088/1367-2630/13/8/083006.

  3. Proof-of-principle experiment of reference-frame-independent quantum key distribution with phase coding

    PubMed Central

    Liang, Wen-Ye; Wang, Shuang; Li, Hong-Wei; Yin, Zhen-Qiang; Chen, Wei; Yao, Yao; Huang, Jing-Zheng; Guo, Guang-Can; Han, Zheng-Fu

    2014-01-01

    We have demonstrated a proof-of-principle experiment of reference-frame-independent phase coding quantum key distribution (RFI-QKD) over an 80-km optical fiber. After considering the finite-key bound, we still achieve a distance of 50 km. In this scenario, the phases of the basis states are related by a slowly time-varying transformation. Furthermore, we developed and realized a new decoy state method for RFI-QKD systems with weak coherent sources to counteract the photon-number-splitting attack. With the help of a reference-frame-independent protocol and a Michelson interferometer with Faraday rotator mirrors, our system is rendered immune to the slow phase changes of the interferometer and the polarization disturbances of the channel, making the procedure very robust. PMID:24402550

  4. Streptococcus salivarius ATCC 25975 possesses at least two genes coding for primer-independent glucosyltransferases.

    PubMed Central

    Simpson, C L; Giffard, P M; Jacques, N A

    1995-01-01

    Fractionation of the culture medium showed that Streptococcus salivarius ATCC 25975 secreted a glucosyltransferase (Gtf) that was primer independent. On the basis of this observation, a gene library of S. salivarius chromosomal DNA cloned into lambda L47.1 was screened for a gene(s) coding for such an activity. As a result of this screening process, two new gtf genes, gtfL and gtfM, both of which coded for primer-independent Gtf activities, were isolated. GtfL produced an insoluble glucan that was refractory to digestion by the endo-(1-->6)-alpha-D-glucanase of Chaetomium gracile, while GtfM produced a soluble glucan that was readily degraded by the glucanase. Comparison of the deduced amino acid sequences of gtfL and gtfM with 10 other available Gtf sequences allowed the relatedness of the conserved catalytic regions to be assessed. This analysis showed that the 12 enzymes did not form clusters based on their primer dependencies or on their product solubilities. Further analysis of the YG repeats in the C-terminal glucan-binding domains of GtfJ, GtfK, GtfL, and GtfM from S. salivarius showed that there was strong homology between a block of contiguous triplet YG repeats present in the four alleles. These blocks of YG repeats were coded for by a region of each gene that appeared to have arisen as a result of a recent duplication event(s). PMID:7822030

  5. Board Governance of Independent Schools: A Framework for Investigation

    ERIC Educational Resources Information Center

    McCormick, John; Barnett, Kerry; Alavi, Seyyed Babak; Newcombe, Geoffrey

    2006-01-01

    Purpose: This paper develops a theoretical framework to guide future inquiry into board governance of independent schools. Design/methodology/approach: The authors' approach is to integrate literatures related to corporate and educational boards, motivation, leadership and group processes that are appropriate for conceptualizing independent school…

  6. RELAP5/MOD3 code manual: Summaries and reviews of independent code assessment reports. Volume 7, Revision 1

    SciTech Connect

    Moore, R.L.; Sloan, S.M.; Schultz, R.R.; Wilson, G.E.

    1996-10-01

    Summaries of RELAP5/MOD3 code assessments, a listing of the assessment matrix, and a chronology of the various versions of the code are given. Results from these code assessments have been used to formulate a compilation of some of the strengths and weaknesses of the code. These results are documented in the report. Volume 7 was designed to be updated periodically and to include the results of the latest code assessments as they become available. Consequently, users of Volume 7 should ensure that they have the latest revision available.

  7. The investigation of bandwidth efficient coding and modulation techniques

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The New Mexico State University Center for Space Telemetering and Telecommunications systems has been, and is currently, engaged in the investigation of trellis-coded modulation (TCM) communication systems. In particular, TCM utilizing M-ary phase shift keying is being studied. The study of carrier synchronization in a TCM environment, or in MPSK systems in general, has been one of the two main thrusts of this grant. This study has involved both theoretical modelling and software simulation of the carrier synchronization problem.

  8. A 2.9 ps equivalent resolution interpolating time counter based on multiple independent coding lines

    NASA Astrophysics Data System (ADS)

    Szplet, R.; Jachna, Z.; Kwiatkowski, P.; Rozyc, K.

    2013-03-01

    We present the design, operation and test results of a time counter that has an equivalent resolution of 2.9 ps, a measurement uncertainty at the level of 6 ps, and a measurement range of 10 s. The time counter has been implemented in a general-purpose reprogrammable device Spartan-6 (Xilinx). To obtain both high precision and wide measurement range, the counting of periods of a reference clock is combined with a two-stage interpolation within a single period of the clock signal. The interpolation involves a four-phase clock in the first interpolation stage (FIS) and an equivalent coding line (ECL) in the second interpolation stage (SIS). The ECL is created as a compound of independent discrete time coding lines (TCL). The number of TCLs used to create the virtual ECL has an effect on its resolution. We tested ECLs made from up to 16 TCLs, but the idea may be extended to a larger number of lines. In the presented time counter the coarse resolution of the counting method, equal to 2 ns (the period of the 500 MHz reference clock), is first improved fourfold in the FIS and then by more than a factor of 400 in the SIS. The proposed solution allows us to overcome the technological limitation in achievable resolution and improve the precision of conversion of integrated interpolators based on tapped delay lines.
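
    The resolution chain can be summarized with the numbers quoted in the abstract (our back-of-envelope restatement, no new data):

        # Resolution budget of the two-stage interpolating counter.
        f_clk = 500e6                  # reference clock frequency
        coarse = 1.0 / f_clk           # 2 ns counting resolution
        fis = coarse / 4.0             # four-phase clock: 500 ps per FIS bin
        sis_equiv = 2.9e-12            # quoted equivalent resolution after SIS

        print(f"coarse bin       : {coarse * 1e12:7.1f} ps")
        print(f"after FIS        : {fis * 1e12:7.1f} ps")
        print(f"after SIS (equiv): {sis_equiv * 1e12:7.1f} ps "
              f"(~{coarse / sis_equiv:.0f} bins per clock period)")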

  9. Independent assessment of TRAC and RELAP5 codes through separate effects tests

    SciTech Connect

    Saha, P.; Rohatgi, U.S.; Jo, J.H.; Neymotin, L.; Slovik, G.; Yuelys-Miksis, C.; Pu, J.

    1983-01-01

    Independent assessment of TRAC-PF1 (Version 7.0), TRAC-BD1 (Version 12.0) and RELAP5/MOD1 (Cycle 14), initiated at BNL in FY 1982, was completed in FY 1983. As in the previous years, emphasis at Brookhaven has been on simulating various separate-effects tests with these advanced codes and identifying the areas where further thermal-hydraulic modeling improvements are needed. The following six categories of tests were simulated with the above codes: (1) critical flow tests (Moby-Dick nitrogen-water, BNL flashing flow, Marviken Test 24); (2) Counter-Current Flow Limiting (CCFL) tests (University of Houston, Dartmouth College single and parallel tube test); (3) level swell tests (G.E. large vessel test); (4) steam generator tests (B and W 19-tube model S.G. tests, FLECHT-SEASET U-tube S.G. tests); (5) natural circulation tests (FRIGG loop tests); and (6) post-CHF tests (Oak Ridge steady-state test).

  10. Independent code assessment at BNL in FY 1982 (TRAC-PF1, RELAP5/MOD1, TRAC-BD1)

    SciTech Connect

    Saha, P.; Rohatgi, U.S.; Jo, J.H.; Neymotin, L.; Slovik, G.; Yuelys-Miksis, C.

    1982-01-01

    Independent assessment of the advanced codes such as TRAC and RELAP5 has continued at BNL through the Fiscal Year 1982. The simulation tests can be grouped into the following five categories: critical flow, counter-current flow limiting (CCFL) or flooding, level swell, steam generator thermal performance, and natural circulation. TRAC-PF1 (Version 7.0) and RELAP5/MOD1 (Cycle 14) codes were assessed by simulating all of the above experiments, whereas the TRAC-BD1 (Version 12.0) code was applied only to the CCFL tests. Results and conclusions of the BNL code assessment activity of FY 1982 are summarized below.

  11. Investigating Lossy Image Coding Using the PLHaar Transform

    SciTech Connect

    Senecal, J G; Lindstrom, P; Duchaineau, M A; Joy, K I

    2004-11-16

    We developed the Piecewise-Linear Haar (PLHaar) transform, an integer wavelet-like transform. PLHaar does not have dynamic range expansion, i.e. it is an n-bit to n-bit transform. To our knowledge PLHaar is the only reversible n-bit to n-bit transform that is suitable for lossy and lossless coding. We are investigating PLHaar's use in lossy image coding. Preliminary results from thresholding transform coefficients show that PLHaar does not produce objectionable artifacts like prior n-bit to n-bit transforms, such as the transform of Chao et al. (CFH). Also, at lower bitrates PLHaar images have increased contrast. For a given set of CFH and PLHaar coefficients with equal entropy, the PLHaar reconstruction is more appealing, although the PSNR may be lower.
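
    For contrast, a sketch of the dynamic-range expansion that PLHaar avoids (PLHaar itself is defined in the cited paper; shown here is the conventional integer Haar "S-transform", whose difference channel needs one bit more than its n-bit inputs):

        # Conventional integer Haar (S-transform) pair for 8-bit samples.
        def s_transform(a, b):
            low = (a + b) // 2          # stays within 0..255
            high = a - b                # -255..255: no longer fits in 8 bits
            return low, high

        def s_inverse(low, high):
            a = low + (high + 1) // 2
            return a, a - high

        worst = max(abs(s_transform(a, b)[1]) for a in (0, 255) for b in (0, 255))
        print("difference range:", -worst, "..", worst)   # 511 values -> 9 bits

        # Perfect reconstruction still holds for the S-transform:
        for pair in ((0, 255), (200, 13), (77, 77)):
            assert s_inverse(*s_transform(*pair)) == pair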

  12. Enabling Handicapped Nonreaders to Independently Obtain Information: Initial Development of an Inexpensive Bar Code Reader System.

    ERIC Educational Resources Information Center

    VanBiervliet, Alan

    A project to develop and evaluate a bar code reader system as a self-directed information and instructional aid for handicapped nonreaders is described. The bar code technology involves passing a light sensitive pen or laser over a printed code with bars which correspond to coded numbers. A system would consist of a portable device which could…

  13. Norepinephrine Modulates Coding of Complex Vocalizations in the Songbird Auditory Cortex Independent of Local Neuroestrogen Synthesis.

    PubMed

    Ikeda, Maaya Z; Jeon, Sung David; Cowell, Rosemary A; Remage-Healey, Luke

    2015-06-24

    The catecholamine norepinephrine plays a significant role in auditory processing. Most studies to date have examined the effects of norepinephrine on the neuronal response to relatively simple stimuli, such as tones and calls. It is less clear how norepinephrine shapes the detection of complex syntactical sounds, as well as the coding properties of sensory neurons. Songbirds provide an opportunity to understand how auditory neurons encode complex, learned vocalizations, and the potential role of norepinephrine in modulating the neuronal computations for acoustic communication. Here, we infused norepinephrine into the zebra finch auditory cortex and performed extracellular recordings to study the modulation of song representations in single neurons. Consistent with its proposed role in enhancing signal detection, norepinephrine decreased spontaneous activity and firing during stimuli, yet it significantly enhanced the auditory signal-to-noise ratio. These effects were all mimicked by clonidine, an α-2 receptor agonist. Moreover, a pattern classifier analysis indicated that norepinephrine enhanced the ability of single neurons to accurately encode complex auditory stimuli. Because neuroestrogens are also known to enhance auditory processing in the songbird brain, we tested the hypothesis that norepinephrine actions depend on local estrogen synthesis. Neither norepinephrine nor adrenergic receptor antagonist infusion into the auditory cortex had detectable effects on local estradiol levels. Moreover, pretreatment with fadrozole, a specific aromatase inhibitor, did not block norepinephrine's neuromodulatory effects. Together, these findings indicate that norepinephrine enhances signal detection and information encoding for complex auditory stimuli by suppressing spontaneous "noise" activity and that these actions are independent of local neuroestrogen synthesis. PMID:26109659

  14. High performance computing aspects of a dimension independent semi-Lagrangian discontinuous Galerkin code

    NASA Astrophysics Data System (ADS)

    Einkemmer, Lukas

    2016-05-01

    The recently developed semi-Lagrangian discontinuous Galerkin approach is used to discretize hyperbolic partial differential equations (usually first order equations). Since these methods are conservative, local in space, and able to limit numerical diffusion, they are considered a promising alternative to more traditional semi-Lagrangian schemes (which are usually based on polynomial or spline interpolation). In this paper, we consider a parallel implementation of a semi-Lagrangian discontinuous Galerkin method for distributed memory systems (so-called clusters). Both strong and weak scaling studies are performed on the Vienna Scientific Cluster 2 (VSC-2). In the case of weak scaling we observe a parallel efficiency above 0.8 for both two and four dimensional problems and up to 8192 cores. Strong scaling results show good scalability to at least 512 cores (we consider problems that can be run on a single processor in reasonable time). In addition, we study the scaling of a two dimensional Vlasov-Poisson solver that is implemented using the framework provided. All of the simulations are conducted in the context of worst case communication overhead; i.e., in a setting where the CFL (Courant-Friedrichs-Lewy) number increases linearly with the problem size. The framework introduced in this paper facilitates a dimension independent implementation of scientific codes (based on C++ templates) using both an MPI and a hybrid approach to parallelization. We describe the essential ingredients of our implementation.
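
    A sketch of the scaling metrics reported (the formulas are the standard definitions; the timings below are invented placeholders, not VSC-2 measurements):

        # Strong scaling: fixed total problem, ideal time t1/p.
        # Weak scaling: problem grows with p, ideal time equal to t1.
        def strong_efficiency(t1, tp, p):
            return t1 / (p * tp)

        def weak_efficiency(t1, tp):
            return t1 / tp

        strong_runs = {1: 100.0, 64: 1.9, 512: 0.27}      # seconds (made up)
        for p, tp in strong_runs.items():
            print(f"strong p={p:4d}: eff = {strong_efficiency(strong_runs[1], tp, p):.2f}")

        weak_runs = {1: 100.0, 1024: 115.0, 8192: 122.0}  # seconds (made up)
        for p, tp in weak_runs.items():
            print(f"weak   p={p:4d}: eff = {weak_efficiency(weak_runs[1], tp):.2f}")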

  15. A coding-independent function of an alternative Ube3a transcript during neuronal development.

    PubMed

    Valluy, Jeremy; Bicker, Silvia; Aksoy-Aksel, Ayla; Lackinger, Martin; Sumer, Simon; Fiore, Roberto; Wüst, Tatjana; Seffer, Dominik; Metge, Franziska; Dieterich, Christoph; Wöhr, Markus; Schwarting, Rainer; Schratt, Gerhard

    2015-05-01

    The E3 ubiquitin ligase Ube3a is an important regulator of activity-dependent synapse development and plasticity. Ube3a mutations cause Angelman syndrome and have been associated with autism spectrum disorders (ASD). However, the biological significance of alternative Ube3a transcripts generated in mammalian neurons remains unknown. We report here that Ube3a1 RNA, a transcript that encodes a truncated Ube3a protein lacking catalytic activity, prevents exuberant dendrite growth and promotes spine maturation in rat hippocampal neurons. Surprisingly, Ube3a1 RNA function was independent of its coding sequence but instead required a unique 3' untranslated region and an intact microRNA pathway. Ube3a1 RNA knockdown increased activity of the plasticity-regulating miR-134, suggesting that Ube3a1 RNA acts as a dendritic competing endogenous RNA. Accordingly, the dendrite-growth-promoting effect of Ube3a1 RNA knockdown in vivo is abolished in mice lacking miR-134. Taken together, our results define a noncoding function of an alternative Ube3a transcript in dendritic protein synthesis, with potential implications for Angelman syndrome and ASD. PMID:25867122

  16. An investigation of error characteristics and coding performance

    NASA Technical Reports Server (NTRS)

    Ebel, William J.; Ingels, Frank M.

    1992-01-01

    The performance of forward error correcting coding schemes on errors anticipated for the Earth Observation System (EOS) Ku-band downlink is studied. The EOS transmits picture frame data to the ground via the Telemetry Data Relay Satellite System (TDRSS) to a ground-based receiver at White Sands. Due to unintentional RF interference from other systems operating in the Ku band, the noise at the receiver is non-Gaussian, which may result in non-random errors output by the demodulator. That is, the downlink channel cannot be modeled by a simple memoryless Gaussian-noise channel. From previous experience, it is believed that these errors are bursty. The research proceeded by developing a computer-based simulation, called Communication Link Error ANalysis (CLEAN), to model the downlink errors, forward error correcting schemes, and interleavers used with TDRSS. To date, the bulk of CLEAN has been written, documented, debugged, and verified. The procedures for utilizing CLEAN to investigate code performance were established and are discussed.

  17. Characterization and validation of a Monte Carlo code for independent dose calculation in proton therapy treatments with pencil beam scanning

    NASA Astrophysics Data System (ADS)

    Fracchiolla, F.; Lorentini, S.; Widesott, L.; Schwarz, M.

    2015-11-01

    We propose a method of creating and validating a Monte Carlo (MC) model of a proton Pencil Beam Scanning (PBS) machine using only commissioning measurements and avoiding the nozzle modeling. Measurements with a scintillating screen coupled with a CCD camera, ionization chamber and a Faraday Cup were used to model the beam in TOPAS without using any machine parameter information but the virtual source distance from the isocenter. Then the model was validated on simple Spread Out Bragg Peaks (SOBP) delivered in water phantom and with six realistic clinical plans (many involving 3 or more fields) on an anthropomorphic phantom. In particular the behavior of the moveable Range Shifter (RS) feature was investigated and its modeling has been proposed. The gamma analysis (3%, 3 mm) was used to compare MC, TPS (XiO-ELEKTA) and measured 2D dose distributions (using radiochromic film). The MC modeling proposed here shows good results in the validation phase, both for simple irradiation geometry (SOBP in water) and for modulated treatment fields (on anthropomorphic phantoms). In particular head lesions were investigated and both MC and TPS data were compared with measurements. Treatment plans with no RS always showed a very good agreement with both of them (γ-passing rate (PR) > 95%). Treatment plans in which the RS was needed were also tested and validated. For these treatment plans MC results showed better agreement with measurements (γ-PR > 93%) than the one coming from TPS (γ-PR < 88%). This work shows how to simplify the MC modeling of a PBS machine for proton therapy treatments without accounting for any hardware components and proposes a more reliable RS modeling than the one implemented in our TPS. The validation process has shown how this code is a valid candidate for a completely independent treatment plan dose calculation algorithm. This makes the code an important future tool for the patient specific QA verification process.
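
    A simplified one-dimensional sketch of the gamma test used above (clinical tools work on 2-D/3-D dose grids; this just illustrates the 3%/3 mm criterion with invented profiles):

        import numpy as np

        def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0):
            """Global 1-D gamma: per reference point, minimize over the evaluated
            curve sqrt((dose diff / (dd*Dmax))^2 + (distance / dta)^2)."""
            gammas = []
            for xr, dr in zip(x_ref, d_ref):
                dose_term = (d_eval - dr) / (dd * d_ref.max())
                dist_term = (x_eval - xr) / dta
                gammas.append(np.min(np.hypot(dose_term, dist_term)))
            return np.array(gammas)

        x = np.linspace(0.0, 100.0, 201)                     # position in mm
        measured = np.exp(-((x - 50.0) / 20.0) ** 2)         # toy film profile
        computed = 1.01 * np.exp(-((x - 51.0) / 20.0) ** 2)  # +1% dose, 1 mm shift

        g = gamma_1d(x, measured, x, computed)
        print(f"gamma passing rate: {100.0 * np.mean(g <= 1.0):.1f}%")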

  18. Investigation of Navier-Stokes Code Verification and Design Optimization

    NASA Technical Reports Server (NTRS)

    Vaidyanathan, Rajkumar

    2004-01-01

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between the concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite rate chemistry, and the k-epsilon turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi-objective optimization

  19. Investigation of Navier-Stokes code verification and design optimization

    NASA Astrophysics Data System (ADS)

    Vaidyanathan, Rajkumar

    With rapid progress made in employing computational techniques for various complex Navier-Stokes fluid flow problems, design optimization problems traditionally based on empirical formulations and experiments are now being addressed with the aid of computational fluid dynamics (CFD). To be able to carry out an effective CFD-based optimization study, it is essential that the uncertainty and appropriate confidence limits of the CFD solutions be quantified over the chosen design space. The present dissertation investigates the issues related to code verification, surrogate model-based optimization and sensitivity evaluation. For Navier-Stokes (NS) CFD code verification a least square extrapolation (LSE) method is assessed. This method projects numerically computed NS solutions from multiple, coarser base grids onto a finer grid and improves solution accuracy by minimizing the residual of the discretized NS equations over the projected grid. In this dissertation, the finite volume (FV) formulation is focused on. The interplay between the concepts and the outcome of LSE, and the effects of solution gradients and singularities, nonlinear physics, and coupling of flow variables on the effectiveness of LSE are investigated. A CFD-based design optimization of a single element liquid rocket injector is conducted with surrogate models developed using response surface methodology (RSM) based on CFD solutions. The computational model consists of the NS equations, finite rate chemistry, and the k-epsilon turbulence closure. With the aid of these surrogate models, sensitivity and trade-off analyses are carried out for the injector design whose geometry (hydrogen flow angle, hydrogen and oxygen flow areas and oxygen post tip thickness) is optimized to attain desirable goals in performance (combustion length) and life/survivability (the maximum temperatures on the oxidizer post tip and injector face and a combustion chamber wall temperature). A preliminary multi

  1. A Social Cognitive Investigation of Australian Independent School Boards as Teams

    ERIC Educational Resources Information Center

    Krishnan, Aparna; Barnett, Kerry; McCormick, John; Newcombe, Geoffrey

    2016-01-01

    Purpose: The purpose of this paper is to investigate independent school Boards as teams using a social cognitive perspective. Specifically, the study investigated Board processes and the nature of relationships between Board member self-efficacy, Board collective efficacy and performance of independent school Boards in New South Wales, Australia.…

  3. Coding-independent regulation of the tumor suppressor PTEN by competing endogenous mRNAs

    PubMed Central

    Tay, Yvonne; Kats, Lev; Salmena, Leonardo; Weiss, Dror; Tan, Shen Mynn; Ala, Ugo; Karreth, Florian; Poliseno, Laura; Provero, Paolo; Di Cunto, Ferdinando; Lieberman, Judy; Rigoutsos, Isidore; Pandolfi, Pier Paolo

    2011-01-01

    Here we demonstrate that protein-coding RNA transcripts can crosstalk by competing for common microRNAs, with microRNA response elements as the foundation of this interaction. We have termed such RNA transcripts competing endogenous RNAs (ceRNAs). We tested this hypothesis in the context of PTEN, a key tumor suppressor whose abundance determines critical outcomes in tumorigenesis. By a combined computational and experimental approach, we identified and validated endogenous protein-coding transcripts that regulate PTEN, antagonize PI3K/AKT signaling and possess growth- and tumor-suppressive properties. Notably, we also show that these genes display concordant expression patterns with PTEN and copy number loss in cancers. Our study presents a road map for the prediction and validation of ceRNA activity and networks, and thus imparts a trans-regulatory function to protein-coding mRNAs. PMID:22000013

  4. tRNA acceptor stem and anticodon bases form independent codes related to protein folding

    PubMed Central

    Carter, Charles W.; Wolfenden, Richard

    2015-01-01

    Aminoacyl-tRNA synthetases recognize tRNA anticodon and 3′ acceptor stem bases. Synthetase Urzymes acylate cognate tRNAs even without anticodon-binding domains, in keeping with the possibility that acceptor stem recognition preceded anticodon recognition. Representing tRNA identity elements with two bits per base, we show that the anticodon encodes the hydrophobicity of each amino acid side-chain as represented by its water-to-cyclohexane distribution coefficient, and this relationship holds true over the entire temperature range of liquid water. The acceptor stem codes preferentially for the surface area or size of each side-chain, as represented by its vapor-to-cyclohexane distribution coefficient. These orthogonal experimental properties are both necessary to account satisfactorily for the exposed surface area of amino acids in folded proteins. Moreover, the acceptor stem codes correctly for β-branched and carboxylic acid side-chains, whereas the anticodon codes for a wider range of such properties, but not for size or β-branching. These and other results suggest that genetic coding of 3D protein structures evolved in distinct stages, based initially on the size of the amino acid and later on its compatibility with globular folding in water. PMID:26034281
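
    The two-bit representation lends itself to a simple linear regression setup. The Python sketch below encodes each anticodon base with two hypothetical bits (a purine/pyrimidine axis and a strong/weak pairing axis, one possible choice) and fits a least-squares model; the response values are random placeholders, not the published distribution coefficients, so only the mechanics are shown.

      import numpy as np

      # hypothetical two-bit encoding per base: (purine?, strong pairing?)
      BITS = {"A": (1, 0), "G": (1, 1), "C": (0, 1), "U": (0, 0)}

      def encode(anticodon):
          """Flatten an anticodon into 2 bits per base (6 regressors)."""
          return [b for base in anticodon for b in BITS[base]]

      anticodons = ["GAA", "CAU", "GCC", "UUU", "GUG", "CGA", "AAC", "UGC"]
      X = np.array([encode(a) for a in anticodons], dtype=float)
      X = np.column_stack([X, np.ones(len(X))])   # intercept column

      rng = np.random.default_rng(4)
      y = rng.normal(size=len(anticodons))        # placeholder "hydrophobicity"
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("fitted coefficient per bit (plus intercept):", np.round(coef, 2))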

  5. Coding tools investigation for next generation video coding based on HEVC

    NASA Astrophysics Data System (ADS)

    Chen, Jianle; Chen, Ying; Karczewicz, Marta; Li, Xiang; Liu, Hongbin; Zhang, Li; Zhao, Xin

    2015-09-01

    The new state-of-the-art video coding standard, H.265/HEVC, was finalized in 2013 and achieves roughly 50% bit rate savings compared to its predecessor, H.264/MPEG-4 AVC. This paper provides evidence that there is still potential for further coding efficiency improvements. A brief overview of HEVC is given first. Then, our improvements on each main module of HEVC are presented. For instance, the recursive quadtree block structure is extended to support larger coding units and transform units. The motion information prediction scheme is improved by advanced temporal motion vector prediction, which inherits the motion information of each small block within a large block from a temporal reference picture. Cross-component prediction with a linear prediction model improves intra prediction, and overlapped block motion compensation improves the efficiency of inter prediction. Furthermore, coding of both intra and inter prediction residuals is improved by an adaptive multiple-transform technique. Finally, in addition to the deblocking filter and SAO, an adaptive loop filter is applied to further enhance the reconstructed picture quality. This paper describes the above-mentioned techniques in detail and evaluates their coding performance benefits based on the common test conditions used during HEVC development. The simulation results show that significant performance improvement over the HEVC standard can be achieved, especially for high-resolution video material.
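
    To illustrate the block-structure idea (only loosely, since real encoders make rate-distortion based mode decisions rather than the variance test used here), the following Python sketch splits a 128×128 block quadtree-style, assigning large blocks to smooth content and small blocks to textured content.

      import numpy as np

      def quadtree_split(block, x0=0, y0=0, min_size=8, thresh=100.0):
          """Recursively split a square block into quadrants while its
          pixel variance exceeds `thresh`, mimicking how a quadtree coding
          structure adapts block size to content. Returns (x, y, size)
          leaves. A real codec decides by rate-distortion cost instead."""
          n = block.shape[0]
          if n > min_size and block.var() > thresh:
              h = n // 2
              leaves = []
              for dy in (0, h):
                  for dx in (0, h):
                      leaves += quadtree_split(block[dy:dy + h, dx:dx + h],
                                               x0 + dx, y0 + dy,
                                               min_size, thresh)
              return leaves
          return [(x0, y0, n)]

      rng = np.random.default_rng(0)
      flat = np.full((64, 64), 128.0)             # smooth area -> large blocks
      textured = rng.normal(128, 30, (64, 64))    # busy area -> small blocks
      ctu = np.block([[flat, textured], [textured, flat]])  # 128x128 "CTU"
      leaves = quadtree_split(ctu)
      print(len(leaves), "leaf blocks; sizes:", sorted({s for _, _, s in leaves}))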

  6. Investigation of a panel code for airframe/propeller integration analyses

    NASA Technical Reports Server (NTRS)

    Miley, S. J.

    1982-01-01

    The Hess panel code was investigated as a procedure to predict the aerodynamic loading associated with propeller slipstream interference on the airframe. The slipstream was modeled as a variable onset flow to the lifting and nonlifting bodies treated by the code. Four sets of experimental data were used for comparisons with the code. The results indicate that the Hess code, in its present form, will give valid solutions for nonuniform onset flows which vary in direction only. The code presently gives incorrect solutions for flows with variations in velocity. Modifications to the code to correct this are discussed.

  7. Multigroup Time-Independent Neutron Transport Code System for Plane or Spherical Geometry.

    Energy Science and Technology Software Center (ESTSC)

    1986-12-01

    Version 00 PALLAS-PL/SP solves multigroup time-independent one-dimensional neutron transport problems in plane or spherical geometry. The problems solved are subject to a variety of boundary conditions or a distributed source. General anisotropic scattering problems are treated for solving deep-penetration problems in which angle-dependent neutron spectra are calculated in detail.

  8. Investigating the Language and Literacy Skills Required for Independent Online Learning

    ERIC Educational Resources Information Center

    Silver-Pacuilla, Heidi

    2008-01-01

    This investigation was undertaken to determine the threshold levels of literacy and language proficiency necessary for adult learners to use the Internet for independent learning. The report is triangulated around learning from large-scale surveys, learning from the literature, and learning from the field. Reported findings include: (1)…

  9. Investigation of Bandwidth-Efficient Coding and Modulation Techniques

    NASA Technical Reports Server (NTRS)

    Osborne, William P.

    1992-01-01

    The necessary technology was studied to improve the bandwidth efficiency of the space-to-ground communications network using the current capabilities of that network as a baseline. The study was aimed at making space payloads, for example the Hubble Space Telescope, more capable without the need to completely redesign the link. Particular emphasis was placed on the following concepts: (1) determining the requirements necessary to convert an existing standard 4-ary phase shift keying communications link to one that can support, as a minimum, 8-ary phase shift keying with error correction applied; and (2) determining the feasibility of using the existing equipment configurations with additional signal processing equipment to realize the higher order modulation and coding schemes.
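
    The following Python sketch illustrates the trade underlying point (1): moving from 4-ary to 8-ary PSK raises spectral efficiency from 2 to 3 bits per symbol while shrinking the minimum distance between constellation points, which is why error correction must be applied. The constellations are textbook unit-energy PSK sets, not the study's link hardware.

      import numpy as np

      def psk_constellation(m):
          """Unit-energy M-PSK symbol set on the complex unit circle."""
          return np.exp(2j * np.pi * np.arange(m) / m)

      for m in (4, 8):
          pts = psk_constellation(m)
          # smallest Euclidean distance between any two symbols
          dmin = min(abs(a - b) for i, a in enumerate(pts) for b in pts[i + 1:])
          print(f"{m}-PSK: {int(np.log2(m))} bits/symbol, "
                f"min symbol distance = {dmin:.3f}")
      # 4-PSK gives d_min ~ 1.414; 8-PSK gives ~ 0.765 for the same energy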

  10. ADF95: Tool for automatic differentiation of a FORTRAN code designed for large numbers of independent variables

    NASA Astrophysics Data System (ADS)

    Straka, Christian W.

    2005-06-01

    ADF95 is a tool to automatically calculate numerical first derivatives for any mathematical expression as a function of user-defined independent variables. Accuracy of derivatives is achieved within machine precision. ADF95 may be applied to any FORTRAN 77/90/95 conforming code and requires minimal changes by the user. It provides a new derived data type that holds the value and derivatives and applies forward differencing by overloading all FORTRAN operators and intrinsic functions. An efficient indexing technique leads to reduced memory usage and a substantially increased performance gain over other available tools with operator overloading. This gain is especially pronounced for sparse systems with a large number of independent variables. A wide class of numerical simulations, e.g., those employing implicit solvers, can profit from ADF95. Program summary: Title of program: ADF95. Catalogue identifier: ADVI. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVI. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computer for which the program is designed: all platforms with a FORTRAN 95 compiler. Programming language used: FORTRAN 95. No. of lines in distributed program, including test data, etc.: 3103. No. of bytes in distributed program, including test data, etc.: 9862. Distribution format: tar.gz. Nature of problem: In many areas in the computational sciences first order partial derivatives for large and complex sets of equations are needed with machine precision accuracy. For example, any implicit or semi-implicit solver requires the computation of the Jacobian matrix, which contains the first derivatives with respect to the independent variables. ADF95 is a software module to facilitate the automatic computation of the first partial derivatives of any arbitrarily complex mathematical FORTRAN expression. The program exploits the sparsity inherited by many sets of equations, thereby enabling faster computations compared to alternate
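
    ADF95 itself is FORTRAN 95, but the operator-overloading idea it relies on can be sketched in a few lines of Python: a value type carries a sparse dictionary of first derivatives keyed by independent-variable index, echoing (very loosely) the indexing technique that makes ADF95 efficient for sparse systems with many independent variables.

      import math

      class Dual:
          """Minimal forward-mode AD value: a number plus a sparse dict of
          first derivatives keyed by independent-variable index. A sketch
          of the idea only, not the FORTRAN 95 implementation."""
          def __init__(self, val, deriv=None):
              self.val = val
              self.d = deriv or {}

          @staticmethod
          def var(val, index):
              return Dual(val, {index: 1.0})

          def _combine(self, other, f_val, da, db):
              # chain rule: df = da*dself + db*dother, sparsely merged
              o = other if isinstance(other, Dual) else Dual(other)
              d = {}
              for k in self.d.keys() | o.d.keys():
                  d[k] = da * self.d.get(k, 0.0) + db * o.d.get(k, 0.0)
              return Dual(f_val, d)

          def __add__(self, o):
              ov = o.val if isinstance(o, Dual) else o
              return self._combine(o, self.val + ov, 1.0, 1.0)
          __radd__ = __add__

          def __mul__(self, o):
              ov = o.val if isinstance(o, Dual) else o
              return self._combine(o, self.val * ov, ov, self.val)
          __rmul__ = __mul__

      def sin(x):
          # overloaded intrinsic: d(sin x) = cos(x) dx
          return x._combine(0.0, math.sin(x.val), math.cos(x.val), 0.0)

      # f(x0, x1) = x0*x1 + sin(x0); df/dx0 = x1 + cos(x0), df/dx1 = x0
      x0, x1 = Dual.var(1.5, 0), Dual.var(2.0, 1)
      f = x0 * x1 + sin(x0)
      print(f.val, f.d)   # derivatives accurate to machine precision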

  11. Independent assessment of TRAC-PD2 and RELAP5/MOD1 codes at BNL in FY 1981. [PWR]

    SciTech Connect

    Saha, P; Jo, J H; Neymotin, L; Rohatgi, U S; Slovik, G

    1982-12-01

    This report documents the independent assessment calculations performed with the TRAC-PD2 and RELAP5/MOD1 codes at Brookhaven National Laboratory (BNL) during Fiscal Year 1981. A large variety of separate-effects experiments dealing with (1) steady-state and transient critical flow, (2) level swell, (3) flooding and entrainment, (4) steady-state flow boiling, (5) integral economizer once-through steam generator (IEOTSG) performance, (6) bottom reflood, and (7) two-dimensional phase separation of two-phase mixtures were simulated with TRAC-PD2. In addition, the early part of an overcooling transient which occurred at the Rancho Seco nuclear power plant on March 20, 1978 was also computed with an updated version of TRAC-PD2. Three separate-effects tests dealing with (1) transient critical flow, (2) steady-state flow boiling, and (3) IEOTSG performance were also simulated with the RELAP5/MOD1 code. Comparisons between the code predictions and the test data are presented.

  12. Investigations with methanobacteria and with evolution of the genetic code

    NASA Technical Reports Server (NTRS)

    Jukes, T. H.

    1986-01-01

    Mycoplasma capricolum was found by Osawa et al. to use UGA as the code for tryptophan and to contain 75% A + T in its DNA. This change could have been from evolutionary pressure to replace C + G by A + T. Numerous studies have been reported of evolution of proteins as measured by amino acid replacements that are observed when homologous proteins, such as hemoglobins from various vertebrates, are compared. These replacements result from nucleotide substitutions in amino acid codons in the corresponding genes. Simultaneously, silent nucleotide substitutions take place that can be studied when sequences of the genes are compared. These silent evolutionary changes take place mostly in third positions of codons. Two types of nucleotide substitutions are recognized: pyrimidine-pyrimidine and purine-purine interchanges (transitions) and pyrimidine-purine interchanges (transversions). Silent transitions are favored when a corresponding transversion would produce an amino acid replacement. Conversely, silent transversions are favored by probability when transitions and transversions will both be silent. Extensive examples of these situations have been found in protein genes, and it is evident that transversions in silent positions predominate in family boxes in most of the examples studied. In associated research a streptomycete from cow manure was found to produce an extracellular enzyme capable of lysing the pseudomurein-containing methanogen Methanobacterium formicicum.
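
    The transition/transversion distinction central to this analysis is mechanical enough to state as code; the following Python sketch classifies a single substitution (DNA alphabet assumed).

      # Transitions keep the chemical class (purine<->purine or
      # pyrimidine<->pyrimidine); transversions cross classes.
      PURINES, PYRIMIDINES = {"A", "G"}, {"C", "T"}

      def substitution_type(before, after):
          if before == after:
              return "identical"
          same_class = ({before, after} <= PURINES
                        or {before, after} <= PYRIMIDINES)
          return "transition" if same_class else "transversion"

      print(substitution_type("A", "G"))  # transition
      print(substitution_type("C", "G"))  # transversion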

  13. Approaches to Learning at Work: Investigating Work Motivation, Perceived Workload, and Choice Independence

    ERIC Educational Resources Information Center

    Kyndt, Eva; Raes, Elisabeth; Dochy, Filip; Janssens, Els

    2013-01-01

    Learning and development are taking up a central role in the human resource policies of organizations because of their crucial contribution to the competitiveness of those organizations. The present study investigates the relationship of work motivation, perceived workload, and choice independence with employees' approaches to learning at…

  14. Modality independence of order coding in working memory: Evidence from cross-modal order interference at recall.

    PubMed

    Vandierendonck, André

    2016-01-01

    Working memory researchers do not agree on whether order in serial recall is encoded by dedicated modality-specific systems or by a more general modality-independent system. Although previous research supports the existence of autonomous modality-specific systems, it has been shown that serial recognition memory is prone to cross-modal order interference by concurrent tasks. The present study used a serial recall task, which was performed in a single-task condition and in a dual-task condition with an embedded memory task in the retention interval. The modality of the serial task was either verbal or visuospatial, and the embedded tasks were in the other modality and required either serial or item recall. Care was taken to avoid modality overlaps during presentation and recall. In Experiment 1, visuospatial but not verbal serial recall was more impaired when the embedded task was an order than when it was an item task. Using a more difficult verbal serial recall task, verbal serial recall was also more impaired by another order recall task in Experiment 2. These findings are consistent with the hypothesis of modality-independent order coding. The implications for views on short-term recall and the multicomponent view of working memory are discussed. PMID:25801664

  15. Investigation of perception-oriented coding techniques for video compression based on large block structures

    NASA Astrophysics Data System (ADS)

    Kaprykowsky, Hagen; Doshkov, Dimitar; Hoffmann, Christoph; Ndjiki-Nya, Patrick; Wiegand, Thomas

    2011-09-01

    Recent investigations have shown that one of the most beneficial elements for higher compression performance in high-resolution video is the incorporation of larger block structures. In this work, we address the question of how to incorporate perceptual aspects into new video coding schemes based on large block structures. This is rooted in the fact that high-frequency regions such as textures in particular yield high coding costs when using classical prediction modes as well as encoder control based on the mean squared error. To overcome this problem, we investigate the incorporation of novel intra predictors based on image completion methods. Furthermore, the integration of a perceptual-based encoder control using the well-known structural similarity index will be analyzed. A major aspect of this article is the evaluation of the coding results in a quantitative (i.e., statistical analysis of changes in mode decisions) as well as qualitative (i.e., coding efficiency) manner.
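
    As a reference point for the perceptual encoder control discussed above, the following Python sketch computes the structural similarity index for a pair of blocks using the standard single-window formula with the usual constants; production encoders apply it over a sliding window, so this is illustrative only.

      import numpy as np

      def ssim_block(x, y, dynamic_range=255.0):
          """Single-window SSIM between two image blocks (global-statistics
          variant of the standard formula)."""
          c1 = (0.01 * dynamic_range) ** 2
          c2 = (0.03 * dynamic_range) ** 2
          mx, my = x.mean(), y.mean()
          vx, vy = x.var(), y.var()
          cov = ((x - mx) * (y - my)).mean()
          return ((2 * mx * my + c1) * (2 * cov + c2)) / \
                 ((mx**2 + my**2 + c1) * (vx + vy + c2))

      rng = np.random.default_rng(1)
      block = rng.uniform(0, 255, (16, 16))
      noisy = np.clip(block + rng.normal(0, 10, block.shape), 0, 255)
      print("SSIM original vs noisy:  %.3f" % ssim_block(block, noisy))
      print("SSIM original vs itself: %.3f" % ssim_block(block, block))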

  16. Investigation of the Use of Erasures in a Concatenated Coding Scheme

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.; Marriott, Philip J.

    1997-01-01

    A new method for declaring erasures in a concatenated coding scheme is investigated. This method is used with the rate-1/2, K = 7 convolutional code and the (255, 223) Reed-Solomon code. Errors-and-erasures Reed-Solomon decoding is used. The proposed erasure method uses a soft output Viterbi algorithm and information provided by decoded Reed-Solomon codewords in a deinterleaving frame. The results show that a gain of 0.3 dB is possible using a minimal number of decoding trials.
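
    The value of declaring erasures follows directly from the Reed-Solomon decoding condition: with minimum distance d, decoding succeeds whenever twice the number of errors plus the number of erasures is less than d, so an erased symbol costs only half an error. A minimal Python sketch for the (255, 223) code:

      # (255, 223) Reed-Solomon: d = n - k + 1 = 33
      N, K = 255, 223
      D = N - K + 1

      def rs_decodable(errors, erasures, d=D):
          """Errors-and-erasures decoding succeeds iff 2*errors + erasures < d."""
          return 2 * errors + erasures < d

      print(rs_decodable(16, 0))   # True: classic 16-error correction
      print(rs_decodable(16, 1))   # False: one erasure too many
      print(rs_decodable(10, 12))  # True: trading errors for erasures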

  17. Triplet code-independent programming of living systems organisation by DNA: the link with intelligence and memory.

    PubMed

    Adams, D H

    1995-05-01

    Previous suggestions from this laboratory (3), (a) that within its molecular electronic structure, DNA houses a computer-analog program of immense complexity, operating independently of, but complementary to, triplet coding and (b) that, inter alia, this program is the driving force for organising and executing the construction of species individuals in three dimensions, are extended in the present communication. It is now concluded that the DNA program also embodies an 'intelligence' component, which extends its organising ability both qualitatively and quantitatively beyond any of the heavily circumscribed 'self-organising' attributes claimed to be associated with naturally occurring inanimate systems. Further, that as part of the developmental process, a program component organises the fabrication of mammalian central nervous systems, including that of human beings with the associated attributes of intelligence, creativity and constructional skills. It is further suggested that the sophisticated random access memory system associated with human beings in particular may be explicable in terms of an extension of the DNA programming system: basically this involves the latter operating as computer-type 'hardware' for the storage of long-term memory and interacting with, primarily, glial cell RNA, acting as 'software' and storing short term traces. Finally, it is suggested that such an interrelationship between DNA/RNA molecular electronic structures can provide the necessary memory storage capacity and flexibility and also facilitates random access to the long-term DNA memory store. PMID:8583976

  18. Binary neutron-star mergers with Whisky and SACRA: First quantitative comparison of results from independent general-relativistic hydrodynamics codes

    NASA Astrophysics Data System (ADS)

    Baiotti, Luca; Shibata, Masaru; Yamamoto, Tetsuro

    2010-09-01

    We present the first quantitative comparison of two independent general-relativistic hydrodynamics codes, the whisky code and the sacra code. We compare the output of simulations starting from the same initial data and carried out with the configuration (numerical methods, grid setup, resolution, gauges) which for each code has been found to give consistent and sufficiently accurate results, in particular, in terms of cleanness of gravitational waveforms. We focus on the quantities that should be conserved during the evolution (rest mass, total mass energy, and total angular momentum) and on the gravitational-wave amplitude and frequency. We find that the results produced by the two codes agree at a reasonable level, with variations in the different quantities but always at better than about 10%.

  19. An Early Underwater Artificial Vision Model in Ocean Investigations via Independent Component Analysis

    PubMed Central

    Nian, Rui; Liu, Fang; He, Bo

    2013-01-01

    Underwater vision is one of the dominant senses and has shown great prospects in ocean investigations. In this paper, a hierarchical Independent Component Analysis (ICA) framework has been established to explore and understand the functional roles of the higher order statistical structures towards the visual stimulus in the underwater artificial vision system. The model is inspired by characteristics such as the modality, the redundancy reduction, the sparseness and the independence in the early human vision system, which seem to capture, respectively, the Gabor-like basis functions, the shape contours or the complicated textures in the multiple layer implementations. The simulation results have shown good performance in terms of the effectiveness and consistency of the proposed approach for the underwater images collected by autonomous underwater vehicles (AUVs). PMID:23863855

  20. An early underwater artificial vision model in ocean investigations via independent component analysis.

    PubMed

    Nian, Rui; Liu, Fang; He, Bo

    2013-01-01

    Underwater vision is one of the dominant senses and has shown great prospects in ocean investigations. In this paper, a hierarchical Independent Component Analysis (ICA) framework has been established to explore and understand the functional roles of the higher order statistical structures towards the visual stimulus in the underwater artificial vision system. The model is inspired by characteristics such as the modality, the redundancy reduction, the sparseness and the independence in the early human vision system, which seem to capture, respectively, the Gabor-like basis functions, the shape contours or the complicated textures in the multiple layer implementations. The simulation results have shown good performance in terms of the effectiveness and consistency of the proposed approach for the underwater images collected by autonomous underwater vehicles (AUVs). PMID:23863855
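
    The core statistical step in such a framework, recovering independent sources from linear mixtures, can be sketched with an off-the-shelf FastICA implementation (scikit-learn assumed available); the synthetic sources below stand in for the paper's image data, whose hierarchical model learns Gabor-like bases from patches.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(3)
      t = np.linspace(0, 8, 2000)
      s1 = np.sign(np.sin(3 * t))            # non-Gaussian source 1
      s2 = rng.laplace(size=t.size)          # non-Gaussian source 2
      S = np.column_stack([s1, s2])
      A = np.array([[1.0, 0.6], [0.4, 1.0]]) # "unknown" mixing matrix
      X = S @ A.T                            # observed mixtures

      ica = FastICA(n_components=2, random_state=0)
      S_est = ica.fit_transform(X)           # recovered independent components

      # sources should match estimates up to permutation, sign and scale
      corr = np.corrcoef(S.T, S_est.T)[:2, 2:]
      print("source/estimate correlations:\n", np.round(corr, 2))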

  1. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Astrophysics Data System (ADS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-07-01

    The Finite Difference Time Domain Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). This manual provides a description of the code and corresponding results for the default scattering problem. In addition to the description, the manual covers the operation, resource requirements, Version A code capabilities, a description of each subroutine, a brief discussion of the radar cross section computations, and a discussion of the scattering results.
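
    For orientation, the essence of the FDTD technique these manuals document is the leapfrog update of staggered electric and magnetic fields. The following Python sketch shows the scheme in one dimension with normalized units and a soft Gaussian source; Version A itself is a full 3D scattering solver with material models and far-field processing.

      import numpy as np

      nz, nt = 200, 450
      ez = np.zeros(nz)        # E field at integer grid points
      hy = np.zeros(nz - 1)    # H field staggered half a cell away
      courant = 1.0            # the "magic" 1D time step

      for t in range(nt):
          hy += courant * np.diff(ez)                # H update from curl of E
          ez[1:-1] += courant * np.diff(hy)          # E update from curl of H
          ez[100] += np.exp(-((t - 30) / 10.0)**2)   # soft Gaussian source

      print("peak |Ez| after propagation:", np.abs(ez).max())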

  2. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Finite Difference Time Domain Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). This manual provides a description of the code and corresponding results for the default scattering problem. In addition to the description, the manual covers the operation, resource requirements, Version A code capabilities, a description of each subroutine, a brief discussion of the radar cross section computations, and a discussion of the scattering results.

  3. Investigation of the image coding method for three-dimensional range-gated imaging

    NASA Astrophysics Data System (ADS)

    Laurenzis, Martin; Bacher, Emmanuel; Schertzer, Stéphane; Christnacher, Frank

    2011-11-01

    In this publication we investigate the image coding method for 3D range-gated imaging. This method is based on multiple exposure of range-gated images to enable a coding of ranges in a limited number of images. For instance, it is possible to enlarge the depth mapping range by a factor of 12 by using 3 images and specific 12T image coding sequences. Further, we present a node model to determine the coding sequences and to dramatically reduce the time needed to calculate the number of possible sequences. Finally, we demonstrate and discuss the application of 12T sequences with different clock periods T = 200 ns to 400 ns.

  4. Registered report: A coding-independent function of gene and pseudogene mRNAs regulates tumour biology

    PubMed Central

    Khan, Israr; Kerwin, John; Owen, Kate; Griner, Erin; Iorns, Elizabeth

    2015-01-01

    The Reproducibility Project: Cancer Biology seeks to address growing concerns about reproducibility in scientific research by conducting replications of selected experiments from a number of high-profile papers in the field of cancer biology. The papers, which were published between 2010 and 2012, were selected on the basis of citations and Altmetric scores (Errington et al., 2014). This Registered Report describes the proposed replication plan of key experiments from ‘A coding-independent function of gene and pseudogene mRNAs regulates tumour biology’ by Poliseno et al. (2010), published in Nature in 2010. The key experiments to be replicated are reported in Figures 1D, 2F-H, and 4A. In these experiments, Poliseno and colleagues report that microRNAs miR-19b and miR-20a transcriptionally suppress both PTEN and PTENP1 in prostate cancer cells (Figure 1D; Poliseno et al., 2010). Decreased expression of PTEN and/or PTENP1 resulted in downregulated PTEN protein levels (Figure 2H), downregulation of both mRNAs (Figure 2G), and increased tumor cell proliferation (Figure 2F; Poliseno et al., 2010). Furthermore, overexpression of the PTEN 3′ UTR enhanced PTENP1 mRNA abundance, limiting tumor cell proliferation and providing additional evidence for the co-regulation of PTEN and PTENP1 (Figure 4A; Poliseno et al., 2010). The Reproducibility Project: Cancer Biology is a collaboration between the Center for Open Science and Science Exchange, and the results of the replications will be published in eLife. DOI: http://dx.doi.org/10.7554/eLife.08245.001 PMID:26335297

  5. Registered report: Coding-independent regulation of the tumor suppressor PTEN by competing endogenous mRNAs

    PubMed Central

    Phelps, Mitch; Coss, Chris; Wang, Hongyan; Cook, Matthew

    2016-01-01

    The Reproducibility Project: Cancer Biology seeks to address growing concerns about reproducibility in scientific research by conducting replications of selected experiments from a number of high-profile papers in the field of cancer biology. The papers, which were published between 2010 and 2012, were selected on the basis of citations and Altmetric scores (Errington et al., 2014). This Registered Report describes the proposed replication plan of key experiments from “Coding-Independent Regulation of the Tumor Suppressor PTEN by Competing Endogenous mRNAs” by Tay and colleagues, published in Cell in 2011 (Tay et al., 2011). The experiments to be replicated are those reported in Figures 3C, 3D, 3G, 3H, 5A and 5B, and in Supplemental Figures 3A and B. Tay and colleagues proposed a new regulatory mechanism based on competing endogenous RNAs (ceRNAs), which regulate target genes by competitive binding of shared microRNAs. They test their model by identifying and confirming ceRNAs that target PTEN. In Figure 3A and B, they report that perturbing expression of putative PTEN ceRNAs affects expression of PTEN. This effect is dependent on functional microRNA machinery (Figure 3G and H), and affects the pathway downstream of PTEN itself (Figures 5A and B). The Reproducibility Project: Cancer Biology is a collaboration between the Center for Open Science and Science Exchange, and the results of the replications will be published by eLife. DOI: http://dx.doi.org/10.7554/eLife.12470.001 PMID:26943900

  6. A Coding System with Independent Annotations of Gesture Forms and Functions during Verbal Communication: Development of a Database of Speech and GEsture (DoSaGE)

    PubMed Central

    Kong, Anthony Pak-Hin; Law, Sam-Po; Kwan, Connie Ching-Yin; Lai, Christy; Lam, Vivian

    2014-01-01

    Gestures are commonly used together with spoken language in human communication. One major limitation of gesture investigations in the existing literature lies in the fact that the coding of forms and functions of gestures has not been clearly differentiated. This paper first described a recently developed Database of Speech and GEsture (DoSaGE) based on independent annotation of gesture forms and functions among 119 neurologically unimpaired right-handed native speakers of Cantonese (divided into three age and two education levels), and presented findings of an investigation examining how gesture use was related to age and linguistic performance. Consideration of these two factors, for which normative data are currently very limited or lacking in the literature, is relevant and necessary when one evaluates gesture employment among individuals with and without language impairment. Three speech tasks, including monologue of a personally important event, sequential description, and story-telling, were used for elicitation. The EUDICO Linguistic ANnotator (ELAN) software was used to independently annotate each participant’s linguistic information of the transcript, forms of gestures used, and the function for each gesture. About one-third of the subjects did not use any co-verbal gestures. While the majority of gestures were non-content-carrying, which functioned mainly for reinforcing speech intonation or controlling speech flow, the content-carrying ones were used to enhance speech content. Furthermore, individuals who are younger or linguistically more proficient tended to use fewer gestures, suggesting that normal speakers gesture differently as a function of age and linguistic performance. PMID:25667563

  7. Investigation of liquid crystal Fabry-Perot tunable filters: design, fabrication, and polarization independence.

    PubMed

    Isaacs, Sivan; Placido, Frank; Abdulhalim, Ibrahim

    2014-10-10

    Liquid crystal Fabry-Perot tunable filters are investigated in detail, with special attention to their manufacturability, design, tolerances, and polarization independence. The calculations were performed both numerically and analytically using the 4×4 propagation matrix method. A simplified analytic expression for the propagation matrix is derived for the case of nematic LC in the homogeneous geometry. At normal incidence, it is shown that one can use the 2×2 Abeles matrix method; however, at oblique incidence, the 4×4 matrix method is needed. The effects of dephasing originating from wedge or noncollimated light beams are investigated. Because the indium tin oxide layer used as an electrode is absorbing, its location within the multilayered mirror stack is very important. The optimum location is found to be within the stack and not on its top or bottom. Finally, we give more detailed experimental results of our polarization-independent configuration that uses polarization diversity with a Wollaston prism. PMID:25322437
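
    The 2×2 Abeles method mentioned for normal incidence is compact enough to sketch in Python: each layer contributes a characteristic matrix, and the stack reflectance follows from the matrix product. The quarter-wave mirror below (indices 2.3/1.46 on glass) is a hypothetical example, not the paper's LC filter design.

      import numpy as np

      def layer_matrix(n, d_nm, wl_nm):
          """Normal-incidence Abeles characteristic matrix of one layer."""
          delta = 2 * np.pi * n * d_nm / wl_nm
          return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                           [1j * n * np.sin(delta), np.cos(delta)]])

      def reflectance(layers, wl_nm, n_in=1.0, n_out=1.52):
          """layers: list of (refractive index, thickness in nm) pairs."""
          m = np.eye(2, dtype=complex)
          for n, d in layers:
              m = m @ layer_matrix(n, d, wl_nm)
          (m11, m12), (m21, m22) = m
          r = ((m11 + m12 * n_out) * n_in - (m21 + m22 * n_out)) / \
              ((m11 + m12 * n_out) * n_in + (m21 + m22 * n_out))
          return abs(r) ** 2

      # hypothetical quarter-wave Bragg mirror at 550 nm: 6 (H, L) pairs
      wl0 = 550.0
      stack = [(2.3, wl0 / (4 * 2.3)), (1.46, wl0 / (4 * 1.46))] * 6
      print("mirror reflectance at 550 nm: %.3f" % reflectance(stack, wl0))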

  8. Final report of the independent counsel for Iran/Contra matters. Volume 1: Investigations and prosecutions

    SciTech Connect

    Walsh, L.E.

    1993-08-04

    In October and November 1986, two secret U.S. Government operations were publicly exposed, potentially implicating Reagan Administration officials in illegal activities. These operations were the provision of assistance to the military activities of the Nicaraguan contra rebels during an October 1984 to October 1986 prohibition on such aid, and the sale of U.S. arms to Iran in contravention of stated U.S. policy and in possible violation of arms-export controls. In late November 1986, Reagan Administration officials announced that some of the proceeds from the sale of U.S. arms to Iran had been diverted to the contras. As a result of the exposure of these operations, Attorney General Edwin Meese III sought the appointment of an independent counsel to investigate and, if necessary, prosecute possible crimes arising from them. This is the final report of that investigation.

  9. Investigation of coding techniques for memory and delay efficient interleaving in slow Rayleigh fading

    NASA Astrophysics Data System (ADS)

    Strater, Jay W.

    1991-11-01

    High data rate communication links operating under slow fading channel conditions may have interleaving memory requirements which are too large for practical applications. These requirements can be reduced by employing spatial diversity; however, a less costly alternative is to select coding and interleaving techniques that support memory efficient interleaving. The objective of this investigation has been to find coding and interleaving techniques with relatively small interleaving memory requirements and to accurately quantify these requirements. Toward this objective, convolutional and Reed-Solomon coding with single-stage and concatenated code configurations were evaluated with convolutional interleaving and differential phase shift keying (DPSK) modulation to determine their interleaving memory requirements. Code performance for these link selections was computed by high-fidelity link simulations and approximations over a wide range of Eb/N0 and interleaver span to scintillation decorrelation time ratios (Til/τ0), and the results of these evaluations were converted to interleaving memory requirements. Interleaving delay requirements were also determined, and code selections with low interleaving memory and delay requirements were identified.
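
    The memory and delay figures such an evaluation converts to can be illustrated with the standard expressions for a Forney convolutional interleaver with N branches and per-branch delay increment J; the parameter values below are hypothetical, not the report's link selections.

      # Branch i (0..N-1) of a Forney convolutional interleaver holds i*J
      # symbols, so one side stores J*N*(N-1)/2 symbols and the matched
      # deinterleaver mirrors it; end-to-end delay is J*N*(N-1) symbols.
      # Memory grows roughly quadratically with the number of branches,
      # which is what makes slow fades (long spans) expensive.
      def convolutional_interleaver_cost(n_branches, j_step):
          memory = j_step * n_branches * (n_branches - 1) // 2   # one side
          delay = j_step * n_branches * (n_branches - 1)         # both sides
          return memory, delay

      for n, j in [(12, 17), (64, 17), (255, 17)]:   # hypothetical values
          mem, dly = convolutional_interleaver_cost(n, j)
          print(f"N={n:3d}: memory={mem:8d} symbols, delay={dly:8d} symbols")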

  10. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Astrophysics Data System (ADS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-07-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  11. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three dimensional numerical electromagnetic scattering code based upon the Finite Difference Time Domain Technique (FDTD). The supplied version of the code is one version of our current three dimensional FDTD code set. This manual provides a description of the code and corresponding results for several scattering problems. The manual is organized into fourteen sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing Radar Cross Section (RCS) computations, a section discussing some scattering results, a sample problem setup section, a new problem checklist, references and figure titles.

  12. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional electromagnetic scattering codes based on the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied versions of the codes are two versions of our current FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR and TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  13. User's manual for two dimensional FDTD version TEA and TMA codes for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1991-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Versions TEA and TMA are two dimensional numerical electromagnetic scattering codes based upon the Finite Difference Time Domain Technique (FDTD) first proposed by Yee in 1966. The supplied versions of the codes are two versions of our current two dimensional FDTD code set. This manual provides a description of the codes and corresponding results for the default scattering problem. The manual is organized into eleven sections: introduction, Version TEA and TMA code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include files (TEACOM.FOR and TMACOM.FOR), a section briefly discussing scattering width computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  14. User's manual for three dimensional FDTD version A code for scattering from frequency-independent dielectric materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain (FDTD) Electromagnetic Scattering Code Version A is a three dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain technique. The supplied version of the code is one version of our current three dimensional FDTD code set. The manual provides a description of the code and the corresponding results for the default scattering problem. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version A code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONA.FOR), a section briefly discussing radar cross section (RCS) computations, a section discussing the scattering results, a sample problem setup section, a new problem checklist, references, and figure titles.

  15. User's manual for three dimensional FDTD version C code for scattering from frequency-independent dielectric and magnetic materials

    NASA Technical Reports Server (NTRS)

    Beggs, John H.; Luebbers, Raymond J.; Kunz, Karl S.

    1992-01-01

    The Penn State Finite Difference Time Domain Electromagnetic Scattering Code Version C is a three-dimensional numerical electromagnetic scattering code based on the Finite Difference Time Domain (FDTD) technique. The supplied version of the code is one version of our current three-dimensional FDTD code set. The manual given here provides a description of the code and corresponding results for several scattering problems. The manual is organized into 14 sections: introduction, description of the FDTD method, operation, resource requirements, Version C code capabilities, a brief description of the default scattering geometry, a brief description of each subroutine, a description of the include file (COMMONC.FOR), a section briefly discussing radar cross section computations, a section discussing some scattering results, a new problem checklist, references, and figure titles.

  16. 78 FR 37571 - Certain Opaque Polymers; Institution of Investigation Pursuant to United States Code

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-21

    ... COMMISSION Certain Opaque Polymers; Institution of Investigation Pursuant to United States Code AGENCY: U.S... importation, and the sale within the United States after importation of certain opaque polymers by reason of... importation, or the sale within the United States after importation of certain opaque polymers that...

  17. A systematic investigation of large-scale diffractive coded aperture designs

    NASA Astrophysics Data System (ADS)

    Gottesman, Stephen R.; Shrekenhamer, Abraham; Isser, Abraham; Gigioli, George

    2012-10-01

    One obstacle to optimizing performance of large-scale coded aperture systems operating in the diffractive regime has been the lack of a robust, rapid, and efficient method for generating diffraction patterns that are projected by the system onto the focal plane. We report on the use of the 'Shrekenhamer Transform' for a systematic investigation of various types of coded aperture designs operating in the diffractive mode. Each design is evaluated in terms of its autocorrelation function for potential use in future imaging applications. The motivation of our study is to gain insight into more efficient optimization methods of image reconstruction algorithms.

  18. Your ticket to independence: a guide to getting your first principal investigator position.

    PubMed

    Káradóttir, Ragnhildur Thóra; Letzkus, Johannes J; Mameli, Manuel; Ribeiro, Carlos

    2015-10-01

    The transition to scientific independence as a principal investigator (PI) can seem like a daunting and mysterious process to postdocs and students - something that many aspire to while at the same time wondering how to achieve this goal and what being a PI really entails. The FENS Kavli Network of Excellence (FKNE) is a group of young faculty who have recently completed this step in various fields of neuroscience across Europe. In a series of opinion pieces from FKNE scholars, we aim to demystify this process and to offer the next generation of up-and-coming PIs some advice and personal perspectives on the transition to independence, starting here with guidance on how to get hired to your first PI position. Rather than providing an exhaustive overview of all facets of the hiring process, we focus on a few key aspects that we have learned to appreciate in the quest for our own labs: What makes a research programme exciting and successful? How can you identify great places to apply to and make sure your application stands out? What are the key objectives for the job talk and the interview? How do you negotiate your position? And finally, how do you decide on a host institute that lets you develop both scientifically and personally in your new role as head of a lab? PMID:26286226

  19. Investigating the Magnetorotational Instability with Dedalus, an Open-Source Hydrodynamics Code

    SciTech Connect

    Burns, Keaton J. (UC Berkeley / SLAC)

    2012-08-31

    The magnetorotational instability is a fluid instability that causes the onset of turbulence in discs with poloidal magnetic fields. It is believed to be an important mechanism in the physics of accretion discs, namely in its ability to transport angular momentum outward. A similar instability arising in systems with a helical magnetic field may be easier to produce in laboratory experiments using liquid sodium, but the applicability of this phenomenon to astrophysical discs is unclear. To explore and compare the properties of these standard and helical magnetorotational instabilities (MRI and HMRI, respectively), magnetohydrodynamic (MHD) capabilities were added to Dedalus, an open-source hydrodynamics simulator. Dedalus is a Python-based pseudospectral code that uses external libraries and parallelization with the goal of achieving speeds competitive with codes implemented in lower-level languages. This paper will outline the MHD equations as implemented in Dedalus, the steps taken to improve the performance of the code, and the status of MRI investigations using Dedalus.

  20. A model to investigate the mechanisms underlying the emergence and development of independent sitting.

    PubMed

    O'Brien, Kathleen M; Zhang, Jing; Walley, Philip R; Rhoads, Jeffrey F; Haddad, Jeffrey M; Claxton, Laura J

    2015-07-01

    When infants first begin to sit independently, they are highly unstable and unable to maintain upright sitting posture for more than a few seconds. Over the course of 3 months, the sitting ability of infants drastically improves. To investigate the mechanisms controlling the development of sitting posture, a single-degree-of-freedom inverted pendulum model was developed. Passive muscle properties were modeled with a stiffness and damping term, while active neurological control was modeled with a time-delayed proportional-integral-derivative (PID) controller. The findings of the simulations suggest that infants primarily utilize passive muscle stiffness to remain upright when they first begin to sit. This passive control mechanism allows the infant to remain upright so that active feedback control mechanisms can develop. The emergence of active control mechanisms allows infants to integrate sensory information into their movements so that they can exhibit more adaptive sitting. PMID:25442426
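
    The following Python sketch shows the structure of such a model (with illustrative parameters, not the paper's fitted values): a linearized single-degree-of-freedom inverted pendulum destabilized by gravity, stabilized by passive stiffness and damping plus a PID term acting on time-delayed state.

      import numpy as np

      g_over_l = 20.0                        # gravity/length term (1/s^2)
      k_passive, b_passive = 15.0, 1.0       # passive stiffness and damping
      kp, ki, kd = 10.0, 2.0, 1.5            # active PID gains (illustrative)
      delay_s, dt, t_end = 0.1, 0.001, 5.0   # neural delay and step size

      n = int(t_end / dt)
      lag = int(delay_s / dt)
      theta = np.zeros(n)                    # lean angle (rad)
      omega = np.zeros(n)                    # angular velocity (rad/s)
      theta[0] = 0.05                        # initial lean
      integral = 0.0

      for t in range(n - 1):
          td = max(t - lag, 0)               # delayed sensory sample
          integral += theta[td] * dt
          u_active = kp * theta[td] + ki * integral + kd * omega[td]
          u_passive = k_passive * theta[t] + b_passive * omega[t]
          # linearized dynamics: theta'' = (g/l)*theta - restoring torques
          alpha = g_over_l * theta[t] - u_passive - u_active
          omega[t + 1] = omega[t] + alpha * dt
          theta[t + 1] = theta[t] + omega[t + 1] * dt

      # passive terms alone (k = 15 < 20) cannot hold the pendulum upright;
      # with the delayed active controller the lean angle decays toward zero
      print("final lean angle (rad): %.4f" % theta[-1])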

  1. Investigation of Coded Source Neutron Imaging at the North Carolina State University PULSTAR Reactor

    SciTech Connect

    Xiao, Ziyu; Mishra, Kaushal; Hawari, Ayman; Bingham, Philip R; Bilheux, Hassina Z; Tobin Jr, Kenneth William

    2010-10-01

    A neutron imaging facility is located on beam-tube #5 of the 1-MWth PULSTAR reactor at the North Carolina State University. An investigation has been initiated to explore the application of coded imaging techniques at the facility. Coded imaging uses a mosaic of pinholes to encode an aperture, thus generating an encoded image of the object at the detector. To reconstruct the image recorded by the detector, corresponding decoding patterns are used. The optimized design of coded masks is critical for the performance of this technique and will depend on the characteristics of the imaging beam. In this work, Monte Carlo (MCNP) simulations were utilized to explore the needed modifications to the PULSTAR thermal neutron beam to support coded imaging techniques. In addition, an assessment of coded mask design has been performed. The simulations indicated that a 12 inch single crystal sapphire filter is suited for such an application at the PULSTAR beam in terms of maximizing flux with a good neutron-to-gamma ratio. Computational simulations demonstrate the feasibility of correlation reconstruction methods for neutron transmission imaging. A gadolinium aperture with a thickness of 500 μm was used to construct the mask using a 38 × 34 URA pattern. A test experiment using such a URA design has been conducted and the point spread function of the system has been measured.
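
    The correlation reconstruction principle is easy to demonstrate in Python: the detector records the object convolved with the mask pattern, and correlating with a matched decoding array recovers the object. The sketch below uses a random binary mask rather than the 38 × 34 URA, so its autocorrelation is only approximately delta-like and some reconstruction noise remains.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 64
      mask = (rng.random((n, n)) < 0.5).astype(float)  # open/closed elements
      decoder = 2 * mask - 1                           # balanced decoding array

      obj = np.zeros((n, n))
      obj[20, 30] = 1.0                                # two point sources
      obj[40, 12] = 0.6

      F = np.fft.fft2
      detector = np.real(np.fft.ifft2(F(obj) * F(mask)))            # encode
      recon = np.real(np.fft.ifft2(F(detector) * np.conj(F(decoder))))  # decode
      recon /= mask.sum()                              # normalize the peak

      peak = np.unravel_index(np.argmax(recon), recon.shape)
      print("brightest reconstructed pixel:", peak)    # expect near (20, 30)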

  2. RACE, CODE OF THE STREET, AND VIOLENT DELINQUENCY: A MULTILEVEL INVESTIGATION OF NEIGHBORHOOD STREET CULTURE AND INDIVIDUAL NORMS OF VIOLENCE*

    PubMed Central

    Stewart, Eric A.; Simons, Ronald L.

    2011-01-01

    The study outlined in this article drew on Elijah Anderson’s (1999) code of the street perspective to examine the impact of neighborhood street culture on violent delinquency. Using data from more than 700 African American adolescents, we examined 1) whether neighborhood street culture predicts adolescent violence above and beyond an adolescent’s own street code values and 2) whether neighborhood street culture moderates individual-level street code values on adolescent violence. Consistent with Anderson’s hypotheses, neighborhood street culture significantly predicts violent delinquency independent of individual-level street code effects. Additionally, neighborhood street culture moderates individual-level street code values on violence in neighborhoods where the street culture is widespread. In particular, the effect of street code values on violence is enhanced in neighborhoods where the street culture is endorsed widely. PMID:21666759

  3. ALS beamlines for independent investigators: A summary of the capabilities and characteristics of beamlines at the ALS

    SciTech Connect

    Not Available

    1992-08-01

    There are two modes of conducting research at the ALS: to work as a member of a participating research team (PRT), or to work as an independent investigator. PRTs are responsible for building beamlines, end stations, and, in some cases, insertion devices. Thus, PRT members have privileged access to the ALS. Independent investigators will use beamline facilities made available by PRTs. The purpose of this handbook is to describe these facilities.

  4. Investigate Methods to Decrease Compilation Time - AX Program Code Group Computer Science R&D Project

    SciTech Connect

    Cottom, T

    2003-06-11

    Large simulation codes can take on the order of hours to compile from scratch. In Kull, which uses generic programming techniques, a significant portion of the time is spent generating and compiling template instantiations. I would like to investigate methods that would decrease the overall compilation time for large codes. These would be methods which could then be applied, hopefully, as standard practice to any large code. Success is measured by the overall decrease in wall clock time a developer spends waiting for an executable. Analyzing the make system of a slow-to-build project can benefit all developers on the project. Taking the time to analyze the number of processors used over the life of the build and restructuring the system to maximize the parallelization can significantly reduce build times. Distributing the build across multiple machines with the same configuration can increase the number of available processors for building and can help evenly balance the load. Becoming familiar with compiler options can have its benefits as well. The cumulative time savings can be significant. Initial compilation time for Kull on OSF1 was approximately 3 hours. Final time on OSF1 after completion is 16 minutes. Initial compilation time for Kull on AIX was approximately 2 hours. Final time on AIX after completion is 25 minutes. Developers now spend 3 hours less waiting for a Kull executable on OSF1, and 2 hours less on AIX platforms. In the eyes of many Kull code developers, the project was a huge success.

  5. Two mitochondrial genomes from the families Bethylidae and Mutillidae: independent rearrangement of protein-coding genes and higher-level phylogeny of the Hymenoptera.

    PubMed

    Wei, Shu-Jun; Li, Qian; van Achterberg, Kees; Chen, Xue-Xin

    2014-08-01

    In animal mitochondrial genomes, gene arrangements are usually conserved across major lineages but might be rearranged within derived groups, and might provide valuable phylogenetic characters. Here, we sequenced the mitochondrial genomes of Cephalonomia gallicola (Chrysidoidea: Bethylidae) and Wallacidia oculata (Vespoidea: Mutillidae). In Cephalonomia at least 11 tRNA and 2 protein-coding genes were rearranged, which is the first report of protein-coding gene rearrangements in the Aculeata. In the Hymenoptera, three types of protein-coding gene rearrangement events occur, i.e. reversal, transposition and reverse transposition. Venturia (Ichneumonidae) had the greatest number of common intervals with the ancestral gene arrangement pattern, whereas Philotrypesis (Agaonidae) had the fewest. The most similar rearrangement patterns are shared between Nasonia (Pteromalidae) and Philotrypesis, whereas the most differentiated rearrangements occur between Cotesia (Braconidae) and Philotrypesis. It is clear that protein-coding gene rearrangements in the Hymenoptera are evolutionarily independent across the major lineages but are conserved within groups such as the Chalcidoidea. Phylogenetic analyses supported the sister-group relationship of Orrussoidea and Apocrita, Ichneumonoidea and Aculeata, Vespidae and Apoidea, and the paraphyly of Vespoidea. The Evaniomorpha and phylogenetic relationships within Aculeata remain controversial, with discrepancy between analyses using protein-coding and RNA genes. PMID:24704304

  6. Flight Investigation of Prescribed Simultaneous Independent Surface Excitations for Real-Time Parameter Identification

    NASA Technical Reports Server (NTRS)

    Moes, Timothy R.; Smith, Mark S.; Morelli, Eugene A.

    2003-01-01

    Near real-time stability and control derivative extraction is required to support flight demonstration of Intelligent Flight Control System (IFCS) concepts being developed by NASA, academia, and industry. Traditionally, flight maneuvers would be designed and flown to obtain stability and control derivative estimates using a postflight analysis technique. The goal of the IFCS concept is to be able to modify the control laws in real time for an aircraft that has been damaged in flight. In some IFCS implementations, real-time parameter identification (PID) of the stability and control derivatives of the damaged aircraft is necessary for successfully reconfiguring the control system. This report investigates the usefulness of Prescribed Simultaneous Independent Surface Excitations (PreSISE) to provide data for rapidly obtaining estimates of the stability and control derivatives. Flight test data were analyzed using both equation-error and output-error PID techniques. The equation-error PID technique is known as Fourier Transform Regression (FTR) and is a frequency-domain real-time implementation. Selected results were compared with a time-domain output-error technique. The real-time equation-error technique combined with the PreSISE maneuvers provided excellent derivative estimation in the longitudinal axis. However, the PreSISE maneuvers as presently defined were not adequate for accurate estimation of the lateral-directional derivatives.

  7. An investigation on the capabilities of the PENELOPE MC code in nanodosimetry

    SciTech Connect

    Bernal, M. A.; Liendo, J. A.

    2009-02-15

    The Monte Carlo (MC) method has been widely implemented in studies of radiation effects on human genetic material. Most of these works have used specific-purpose MC codes to simulate radiation transport in condensed media. PENELOPE is one of the general-purpose MC codes that has been used in many applications related to radiation dosimetry. Based on the fact that PENELOPE can carry out event-by-event coupled electron-photon transport simulations following these particles down to energies of the order of a few tens of eV, we have decided to investigate the capabilities of this code in the field of nanodosimetry. Single and double strand break probabilities due to the direct impact of γ rays originating from Co-60 and Cs-137 isotopes and characteristic x-rays, from Al and C K-shells, have been determined by use of PENELOPE. Indirect damage has not been accounted for in this study. A human genetic material geometrical model has been developed, taking into account five organizational levels. In an article by Friedland et al. [Radiat. Environ. Biophys. 38, 39-47 (1999)], a specific-purpose MC code and a very sophisticated DNA geometrical model were used. We have chosen that work as a reference to compare our results. Single and double strand-break probabilities obtained here underestimate those reported by Friedland and co-workers by 20%-76% and 50%-60%, respectively. However, we obtain RBE values for Cs-137, Al-K and C-K radiations in agreement with those reported in previous works [Radiat. Environ. Biophys. 38, 39-47 (1999)] and [Phys. Med. Biol. 53, 233-244 (2008)]. Some enhancements can be incorporated into the PENELOPE code to improve its results in the nanodosimetry field.

  8. An investigation on the capabilities of the PENELOPE MC code in nanodosimetry.

    PubMed

    Bernal, M A; Liendo, J A

    2009-02-01

    The Monte Carlo (MC) method has been widely implemented in studies of radiation effects on human genetic material. Most of these works have used specific-purpose MC codes to simulate radiation transport in condensed media. PENELOPE is one of the general-purpose MC codes that has been used in many applications related to radiation dosimetry. Based on the fact that PENELOPE can carry out event-by-event coupled electron-photon transport simulations following these particles down to energies of the order of a few tens of eV, we have decided to investigate the capabilities of this code in the field of nanodosimetry. Single and double strand break probabilities due to the direct impact of gamma rays originated from Co60 and Cs137 isotopes and characteristic x-rays, from Al and C K-shells, have been determined by use of PENELOPE. Indirect damage has not been accounted for in this study. A human genetic material geometrical model has been developed, taking into account five organizational levels. In an article by Friedland et al. [Radiat. Environ. Biophys. 38, 39-47 (1999)], a specific-purpose MC code and a very sophisticated DNA geometrical model were used. We have chosen that work as a reference to compare our results. Single and double strand-break probabilities obtained here underestimate those reported by Friedland and co-workers by 20%-76% and 50%-60%, respectively. However, we obtain RBE values for Cs137, AlK and CK radiations in agreement with those reported in previous works [Radiat. Environ. Biophys. 38, 39-47 (1999)] and [Phys. Med. Biol. 53, 233-244 (2008)]. Some enhancements can be incorporated into the PENELOPE code to improve its results in the nanodosimetry field. PMID:19292002

  9. Detailed investigation of Long-Period activity at Campi Flegrei by Convolutive Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Capuano, P.; De Lauro, E.; De Martino, S.; Falanga, M.

    2016-04-01

    This work is devoted to the analysis of seismic signals continuously recorded at Campi Flegrei Caldera (Italy) during the entire year 2006. The radiation pattern associated with the Long-Period energy release is investigated. We adopt an innovative Independent Component Analysis algorithm for convolutive seismic series, adapted and improved to give automatic procedures for detecting seismic events often buried in the high-level ambient noise. The extracted waveforms, characterized by an improved signal-to-noise ratio, allow the recognition of Long-Period precursors, evidencing that the seismic activity accompanying the mini-uplift crisis (in 2006), which climaxed in the three days from 26-28 October, had already started at the beginning of October and lasted until mid-November. Hence, a more complete seismic catalog is provided, which can be used to properly quantify the seismic energy release. To better ground our results, we first check the robustness of the method by comparing it with other blind source separation methods based on higher-order statistics; secondly, we reconstruct the radiation patterns of the extracted Long-Period events in order to link the individuated signals directly to the sources. We take advantage of Convolutive Independent Component Analysis, which provides basic signals along the three directions of motion, so that a direct polarization analysis can be performed with no other filtering procedures. We show that the extracted signals are mainly composed of P waves with radial polarization pointing to the seismic source of the main LP swarm, i.e. a small area in the Solfatara, also in the case of the small events that both precede and follow the main activity. From a dynamical point of view, they can be described by two degrees of freedom, indicating a low level of complexity associated with the vibrations from a superficial hydrothermal system. Our results allow us to move towards a full description of the complexity of
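
    The separation step can be pictured with a much simpler stand-in. The paper's algorithm is a convolutive ICA; the sketch below instead applies instantaneous FastICA (scikit-learn assumed) to synthetic three-component mixtures, which conveys the blind-source-separation idea without the convolutive machinery.

        # Toy blind source separation: two synthetic sources mixed onto three
        # channels, then demixed with FastICA. Frequencies and mixing are invented.
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        t = np.linspace(0, 10, 4000)
        s1 = np.sin(2 * np.pi * 0.7 * t)              # slow "LP-like" oscillation
        s2 = np.sign(np.sin(2 * np.pi * 3.0 * t))     # higher-frequency source
        sources = np.c_[s1, s2] + 0.05 * rng.standard_normal((t.size, 2))

        A = np.array([[1.0, 0.5], [0.4, 1.0], [0.3, 0.8]])  # mixing onto 3 channels
        x = sources @ A.T                                   # observed 3-comp record

        ica = FastICA(n_components=2, random_state=0)
        recovered = ica.fit_transform(x)   # estimated independent components
        print(recovered.shape)             # (4000, 2)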

  10. Investigation of in-band transmission of both spectral amplitude coding/optical code division multiple-access and wavelength division multiplexing signals

    NASA Astrophysics Data System (ADS)

    Ashour, Isaac A. M.; Shaari, Sahbudin; Shalaby, Hossam M. H.; Menon, P. Susthitha

    2011-06-01

    The transmission of both optical code division multiple-access (OCDMA) and wavelength division multiplexing (WDM) users on the same band is investigated. Code pulses of spectral amplitude coding (SAC)/optical code division multiple-access (CDMA) are overlaid onto a multichannel WDM system. Notch filters are utilized in order to suppress the WDM interference signals for detection of optical broadband CDMA signals. Modified quadratic congruence (MQC) codes are used as the signature codes for the SAC/OCDMA system. The proposed system is simulated and its performance in terms of both the bit-error rate and Q-factor is determined. In addition, the probability of an eavesdropper achieving error-free code detection is evaluated. Our results are compared to those of traditional nonhybrid systems. It is concluded that the proposed hybrid scheme still achieves acceptable performance. In addition, it provides enhanced data confidentiality as compared to the scheme with SAC/OCDMA only. It is also shown that the performance of the proposed system is limited by the interference of the WDM signals. Furthermore, the simulation illustrates the tradeoff between performance and confidentiality for authorized users.
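
    The two figures of merit quoted here are directly linked: under the usual Gaussian-noise assumption, BER = (1/2) erfc(Q/sqrt(2)), so a reported Q-factor implies a bit-error rate. A minimal check:

        from math import erfc, sqrt

        def ber_from_q(q):
            """Gaussian-statistics relation between Q-factor and bit-error rate."""
            return 0.5 * erfc(q / sqrt(2))

        for q in (3, 6, 7):
            print(f"Q = {q}: BER ~ {ber_from_q(q):.2e}")
        # Q = 6 corresponds to BER ~ 1e-9, a common acceptability threshold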

  11. Analytical investigation of the effects of lateral connections on the accuracy of population coding

    NASA Astrophysics Data System (ADS)

    Oizumi, Masafumi; Miura, Keiji; Okada, Masato

    2010-05-01

    We studied how lateral connections affect the accuracy of a population code by using a model of orientation selectivity in the primary visual cortex. Investigating the effects of lateral connections on population coding is a complex problem because these connections simultaneously change the shape of tuning curves and the correlations between neurons. Both of these changes have to be taken into consideration to correctly evaluate the effects of lateral connections. We propose a theoretical framework for analytically computing the Fisher information, which measures the accuracy of a population code, in stochastic spiking neuron models with refractory periods. Within our framework, we accurately evaluated both the changes in tuning curves and correlations caused by lateral connections and their effects on the Fisher information. We found that these effects conflicted with each other, and that whether the lateral connections increased the Fisher information depended strongly on the intrinsic properties of the model neuron. By systematically changing the coupling strengths of excitations and inhibitions, we found parameter regions of lateral connectivity where sharpening of tuning curves through Mexican-hat connectivity led to an increase in information, which is in contrast to some previous findings.
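
    For intuition about the quantity being computed, the sketch below evaluates the textbook Fisher information for a population of independent Poisson neurons, I(θ) = Σ_i f_i'(θ)² / f_i(θ); the paper's framework generalizes this to spiking models with refractory periods and connection-induced correlations. Tuning-curve parameters are illustrative.

        import numpy as np

        def fisher_information(theta, pref, gain=10.0, kappa=2.0):
            """I(theta) for independent Poisson neurons with von Mises tuning."""
            f = gain * np.exp(kappa * (np.cos(theta - pref) - 1))      # firing rates
            df = -gain * kappa * np.sin(theta - pref) * np.exp(
                kappa * (np.cos(theta - pref) - 1))                    # df/dtheta
            return np.sum(df ** 2 / f)

        pref = np.linspace(0, 2 * np.pi, 32, endpoint=False)  # preferred angles
        print(fisher_information(np.pi / 3, pref))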

  12. Investigation of Cool and Hot Executive Function in ODD/CD Independently of ADHD

    ERIC Educational Resources Information Center

    Hobson, Christopher W.; Scott, Stephen; Rubia, Katya

    2011-01-01

    Background: Children with oppositional defiant disorder/conduct disorder (ODD/CD) have shown deficits in "cool" abstract-cognitive, and "hot" reward-related executive function (EF) tasks. However, it is currently unclear to what extent ODD/CD is associated with neuropsychological deficits, independently of attention deficit hyperactivity disorder…

  13. A Longitudinal Investigation of Field Dependence-Independence and the Development of Formal Operational Thought.

    ERIC Educational Resources Information Center

    Flexer, B.K.; Roberge, J.J.

    1983-01-01

    A longitudinal study among American adolescents revealed (1) an insignificant impact of field dependence-independence on the development of formal operational thought; (2) continuous development of combinatorial reasoning and propositional logic abilities, but little increase in comprehension of proportionality; and (3) sex differences in formal…

  14. An Investigation of Independent Child Behavior in the Open Classroom: The Classroom Attitude Observation Schedule (CAOS).

    ERIC Educational Resources Information Center

    Goldupp, Ocea

    The Classroom Attitude Observation Schedule was developed and field tested for study of independent child behavior in the open classroom. Eight Head Start classrooms were used for field testing, six of which used the Tucson Early Education Model curriculum and two of which, for comparison, used local curricula. Procedures involved observing and…

  15. After a Long-Term Placement: Investigating Educational Achievement, Behaviour, and Transition to Independent Living

    ERIC Educational Resources Information Center

    Dumaret, Annick-Camille; Donati, Pascale; Crost, Monique

    2011-01-01

    This study describes the transition towards independent living of 123 former fostered young people reared for long periods in a private French organisation, SOS Children's Villages. Three generations of care leavers were analysed through a postal survey and interviews. Their narratives show typical pathways after leaving care. Two-thirds became…

  16. Investigation of Inconsistent ENDF/B-VII.1 Independent and Cumulative Fission Product Yields with Proposed Revisions

    NASA Astrophysics Data System (ADS)

    Pigni, M. T.; Francis, M. W.; Gauld, I. C.

    2015-01-01

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that do not agree with the cumulative fission yields in the library or with experimental measurements. To address these issues, a comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for 235,238U and 239,241Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. Another important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated by the need to improve the performance of the ENDF/B-VII.1 library for stable and long-lived fission products. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.
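
    The consistency requirement at issue can be made concrete: a cumulative yield must equal the independent yield plus the branching-weighted cumulative yields of its precursors, which is why a revised delayed-neutron branching fraction propagates directly into the cumulative yields. The chain and numbers below are purely illustrative, not evaluated data.

        # Toy decay chain Te-137 -> I-137 -> Xe-137, with a delayed-neutron branch
        # removing part of the I-137 feed: CY(i) = IY(i) + sum_j b(j->i) * CY(j).
        from functools import lru_cache

        independent = {"Xe137": 0.030, "I137": 0.025, "Te137": 0.010}  # invented IYs
        parents = {"Xe137": [("I137", 0.93)],  # ~7% assumed lost to delayed neutrons
                   "I137": [("Te137", 1.00)],
                   "Te137": []}

        @lru_cache(maxsize=None)
        def cumulative(n):
            return independent[n] + sum(b * cumulative(p) for p, b in parents[n])

        for n in ("Te137", "I137", "Xe137"):
            print(n, round(cumulative(n), 5))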

  17. Investigation of inconsistent ENDF/B-VII.1 independent and cumulative fission product yields with proposed revisions

    SciTech Connect

    Pigni, Marco T; Francis, Matthew W; Gauld, Ian C

    2015-01-01

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that are incompatible with the cumulative fission yields in the library, and also with experimental measurements. A comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for 235,238U and 239,241Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. An important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated by the need to improve the performance of the ENDF/B-VII.1 library in the case of stable and long-lived cumulative yields, given the inconsistency of the ENDF/B-VII.1 fission product yield and decay data sub-libraries. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.

  18. Investigation of Inconsistent ENDF/B-VII.1 Independent and Cumulative Fission Product Yields with Proposed Revisions

    SciTech Connect

    Pigni, M.T.; Francis, M.W.; Gauld, I.C.

    2015-01-15

    A recent implementation of ENDF/B-VII.1 independent fission product yields and nuclear decay data identified inconsistencies in the data caused by the use of updated nuclear schemes in the decay sub-library that are not reflected in legacy fission product yield data. Recent changes in the decay data sub-library, particularly the delayed neutron branching fractions, result in calculated fission product concentrations that do not agree with the cumulative fission yields in the library or with experimental measurements. To address these issues, a comprehensive set of independent fission product yields was generated for thermal and fission spectrum neutron-induced fission for {sup 235,238}U and {sup 239,241}Pu in order to provide a preliminary assessment of the updated fission product yield data consistency. These updated independent fission product yields were utilized in the ORIGEN code to compare the calculated fission product inventories with experimentally measured inventories, with particular attention given to the noble gases. Another important outcome of this work is the development of fission product yield covariance data necessary for fission product uncertainty quantification. The evaluation methodology combines a sequential Bayesian method to guarantee consistency between independent and cumulative yields along with the physical constraints on the independent yields. This work was motivated by the need to improve the performance of the ENDF/B-VII.1 library for stable and long-lived fission products. The revised fission product yields and the new covariance data are proposed as a revision to the fission yield data currently in ENDF/B-VII.1.

  19. AN INVESTIGATION OF NON-INDEPENDENCE OF COMPONENTS OF SCORES ON MULTIPLE-CHOICE TESTS. FINAL REPORT.

    ERIC Educational Resources Information Center

    ZIMMERMAN, DONALD W.; BURKHEIMER, GRAHAM J., JR.

    INVESTIGATION IS CONTINUED INTO VARIOUS EFFECTS OF NON-INDEPENDENT ERROR INTRODUCED INTO MULTIPLE-CHOICE TEST SCORES AS A RESULT OF CHANCE GUESSING SUCCESS. A MODEL IS DEVELOPED IN WHICH THE CONCEPT OF THEORETICAL COMPONENTS OF SCORES IS NOT INTRODUCED AND IN WHICH, THEREFORE, NO ASSUMPTIONS REGARDING ANY RELATIONSHIP BETWEEN SUCH COMPONENTS NEED…

  20. Experimental investigation of mass-dependent and mass-independent fractionation of mercury isotopes

    NASA Astrophysics Data System (ADS)

    Bergquist, B. A.; Blum, J. D.; Marcus, J. W.; Biswas, A.

    2006-12-01

    Mercury is a globally distributed and highly toxic pollutant, the mobility and bioaccumulation of which are highly dependent on its redox cycling. With seven isotopes (including two odd-mass isotopes) and a relative mass difference of 4%, stable isotope fractionation of Hg could be a powerful tool to track and understand Hg cycling in the environment. Ongoing studies of natural mercury isotope variations in ore deposits, hydrothermal fluids, sediments, soils, fish tissues and bacterial cultures have documented a measurable range in Hg isotopes of up to ~5‰ in the ^{202}Hg/^{198}Hg ratio, with most samples displaying mass-dependent fractionation. A small, but growing, body of data also suggests that natural samples display mass-independent fractionation of Hg isotopes. In this study, we explore mechanisms that lead to both mass-dependent and mass-independent fractionation of Hg isotopes. Isotope ratios were measured by continuous-flow cold-vapor generation coupled to MC-ICPMS with an external precision of ±0.1‰ (2SD). We observe three distinct types of isotope fractionation for Hg: (1) mass-dependent fractionation, (2) mass-independent fractionation of odd isotopes concurrent with mass-dependent fractionation of even isotopes, and (3) mass-independent fractionation of all Hg isotopes. Reduction of Hg species to Hg(0) vapor is an important mechanism for removal of Hg from aqueous systems into the atmosphere. Reduction of Hg occurs through numerous pathways including photoreduction, abiotic organic reduction, and biological reduction. We find that photoreduction of Hg(II) by natural sunlight leads to mass-independent fractionation of the odd isotopes (^{201}Hg, ^{199}Hg), with several permil deviations from predicted mass dependence, and mass-dependent fractionation of the even isotopes. In contrast, both biological reduction (Kritee et al., 2006) and dark abiotic organically mediated reduction follow mass-dependent fractionation of even and odd isotopes
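
    As a worked example of how the odd-isotope anomalies are quantified: mass-independent fractionation is reported as the deviation of a measured delta value from the one predicted by the mass-dependent law, e.g. Δ^{199}Hg = δ^{199}Hg - 0.2520 δ^{202}Hg. The scaling factors (0.2520 and 0.7520) follow the kinetic mass-dependent law as used in the later Hg-isotope literature and, like the input deltas, are assumptions of this sketch rather than values from the abstract.

        def capital_delta(d199, d201, d202):
            """Deviation of odd-isotope deltas (permil) from kinetic mass dependence."""
            return {"D199Hg": d199 - 0.2520 * d202,
                    "D201Hg": d201 - 0.7520 * d202}

        # Invented photoreduction-style deltas: odd isotopes carry an excess
        print(capital_delta(d199=1.20, d201=1.10, d202=0.40))
        # -> D199Hg ~ +1.10, D201Hg ~ +0.80 permil of mass-independent anomaly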

  1. Establishing evidence of contact transfer in criminal investigation by a novel 'peptide coding' reagent.

    PubMed

    Gooch, James; Koh, Clarissa; Daniel, Barbara; Abbate, Vincenzo; Frascione, Nunzianda

    2015-11-01

    Forensic investigators are often faced with the challenge of forming a logical association between a suspect, object or location and a particular crime. This article documents the development of a novel reagent that may be used to establish evidence of physical contact between items and individuals as a result of criminal activity. Consisting of a fluorescent compound suspended within an oil-based medium, this reagent utilises the addition of short customisable peptide molecules of a specific known sequence as unique owner-registered 'codes'. This product may be applied onto goods or premises of criminal interest and subsequently transferred onto objects that contact target surfaces. Visualisation of the reagent is then achieved via fluorophore excitation, subsequently allowing rapid peptide recovery and analysis. Simple liquid-liquid extraction methods were devised to rapidly isolate the peptide from other reagent components prior to analysis by ESI-MS. PMID:26452928

  2. Further Investigation of Acoustic Propagation Codes for Three-Dimensional Geometries

    NASA Technical Reports Server (NTRS)

    Nark, Douglas M.; Watson, Willie R.; Jones, Michael G.

    2006-01-01

    The ability to predict fan noise within complex three-dimensional aircraft engine nacelle geometries is a valuable tool in designing and assessing low-noise concepts. This work begins a systematic study to identify the areas of the design space in which propagation codes of varying fidelity may be used effectively to provide efficient design and assessment. An efficient lower-fidelity code is used in conjunction with two higher-fidelity, more computationally intensive methods to solve benchmark problems of increasing complexity. The codes represent a small sampling of the propagation codes currently available or under development. Results of this initial study indicate that the lower-fidelity code provides satisfactory results for cases involving low to moderate attenuation rates, whereas the two higher-fidelity codes perform well across the range of problems.

  3. Culture-dependent and -independent methods to investigate the microbial ecology of Italian fermented sausages.

    PubMed

    Rantsiou, Kalliopi; Urso, Rosalinda; Iacumin, Lucilla; Cantoni, Carlo; Cattaneo, Patrizia; Comi, Giuseppe; Cocolin, Luca

    2005-04-01

    In this study, the microbial ecology of three naturally fermented sausages produced in northeast Italy was studied by culture-dependent and -independent methods. Plating analysis pointed out the predominance of lactic acid bacteria populations, as well as the importance of coagulase-negative cocci. In one fermentation, fecal enterococci also reached significant counts, highlighting their contribution to that particular transformation process. Yeast counts were higher than the detection limit (> 100 CFU/g) in only one fermented sausage. Analysis of the denaturing gradient gel electrophoresis (DGGE) patterns and sequencing of the bands allowed profiling of the microbial populations present in the sausages during fermentation. The bacterial ecology was mainly characterized by the stable presence of Lactobacillus curvatus and Lactobacillus sakei, but Lactobacillus paracasei was also repeatedly detected. An important piece of evidence was the presence of Lactococcus garvieae, which clearly contributed to two fermentations. Several species of Staphylococcus were also detected. Regarding other bacterial groups, Bacillus sp., Ruminococcus sp., and Macrococcus caseolyticus were also identified at the beginning of the transformations. In addition, yeast species belonging to Debaryomyces hansenii, several Candida species, and Willopsis saturnus were observed in the DGGE gels. Finally, cluster analysis of the bacterial and yeast DGGE profiles highlighted the uniqueness of the fermentation processes studied. PMID:15812029

  4. Culture-Dependent and -Independent Methods To Investigate the Microbial Ecology of Italian Fermented Sausages

    PubMed Central

    Rantsiou, Kalliopi; Urso, Rosalinda; Iacumin, Lucilla; Cantoni, Carlo; Cattaneo, Patrizia; Comi, Giuseppe; Cocolin, Luca

    2005-01-01

    In this study, the microbial ecology of three naturally fermented sausages produced in northeast Italy was studied by culture-dependent and -independent methods. Plating analysis pointed out the predominance of lactic acid bacteria populations, as well as the importance of coagulase-negative cocci. In one fermentation, fecal enterococci also reached significant counts, highlighting their contribution to that particular transformation process. Yeast counts were higher than the detection limit (>100 CFU/g) in only one fermented sausage. Analysis of the denaturing gradient gel electrophoresis (DGGE) patterns and sequencing of the bands allowed profiling of the microbial populations present in the sausages during fermentation. The bacterial ecology was mainly characterized by the stable presence of Lactobacillus curvatus and Lactobacillus sakei, but Lactobacillus paracasei was also repeatedly detected. An important piece of evidence was the presence of Lactococcus garvieae, which clearly contributed to two fermentations. Several species of Staphylococcus were also detected. Regarding other bacterial groups, Bacillus sp., Ruminococcus sp., and Macrococcus caseolyticus were also identified at the beginning of the transformations. In addition, yeast species belonging to Debaryomyces hansenii, several Candida species, and Willopsis saturnus were observed in the DGGE gels. Finally, cluster analysis of the bacterial and yeast DGGE profiles highlighted the uniqueness of the fermentation processes studied. PMID:15812029

  5. Nye County Nuclear Waste Repository Project Office independent scientific investigations program annual report, May 1997--April 1998

    SciTech Connect

    1998-07-01

    This annual summary report, prepared by the Nye County Nuclear Waste Repository Project Office (NWRPO), summarizes the activities that were performed during the period from May 1, 1997 to April 30, 1998. These activities were conducted in support of the Independent Scientific Investigation Program (ISIP) of Nye County at the Yucca Mountain Site (YMS). The Nye County NWRPO is responsible for protecting the health and safety of the Nye County residents, and its on-site representative is responsible for designing and implementing the ISIP. Major objectives of the ISIP include: investigating key issues related to conceptual design and performance of the repository that can have major impact on human health, safety, and the environment; and identifying areas not being addressed adequately by the Department of Energy (DOE). Nye County has identified several key scientific issues of concern that may affect repository design and performance which were not being adequately addressed by DOE, and has been conducting its own independent study to evaluate the significance of these issues. This report summarizes the results of monitoring from two boreholes and the Exploratory Studies Facility (ESF) tunnel that have been instrumented by Nye County since March and April of 1995. The preliminary data and interpretations presented in this report do not constitute and should not be considered as the official position of Nye County. The ISIP presently includes borehole and tunnel instrumentation, monitoring, data analysis, and numerical modeling activities to address the concerns of Nye County.

  6. A Monte Carlo Investigation of the Analysis of Variance Applied to Non-Independent Bernoulli Variates.

    ERIC Educational Resources Information Center

    Draper, John F., Jr.

    The applicability of the Analysis of Variance, ANOVA, procedures to the analysis of dichotomous repeated measure data is described. The design models for which data were simulated in this investigation were chosen to represent simple cases of two experimental situations: situation one, in which subjects' responses to a single randomly selected set…

  7. A model-independent investigation on quasi-degenerate neutrino mass models and their significance

    NASA Astrophysics Data System (ADS)

    Roy, Subhankar; Singh, N. Nimai

    2013-12-01

    The prediction of a possible hierarchy of neutrino masses depends mostly on the model chosen. Dissociating the μ-τ interchange symmetry from discrete flavor symmetry based models makes the neutrino mass matrix less predictive and motivates one to seek the answer from different phenomenological frameworks. This calls for proper parametrization of the neutrino mass matrices concerning individual hierarchies. In this work, an attempt has been made to study the six different cases of quasi-degenerate (QDN) neutrino models with mass matrices m^{ν}_{LL} parametrized with two free parameters (α, η), the standard Wolfenstein parameter (λ), and an input mass scale m0 ~ 0.08 eV. We start with a μ-τ symmetric neutrino mass matrix followed by a correction from the charged lepton sector. The parametrization emphasizes the existence of four independent texture-zero building blocks common to all the QDN models under the μ-τ symmetric framework, and is found to be invariant under any choice of solar angle. In our parametrization, the solar angle is controlled from the neutrino sector, whereas the charged lepton sector drives the reactor and atmospheric mixing angles. The individual models are tested against oscillation experiments, cosmological observations, and future β-decay and 0νββ experiments, and we find no grounds to discard the QDN mass models with relatively lower mass. Although the QDNH-Type IA model shows a strong preference for sin^{2}θ_{12} = 0.32, this is not sufficient to rule out the other models. The present work leaves scope to extend the search for the most favorable QDN mass model using the observed baryon asymmetry of the Universe.
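
    For reference, a μ-τ symmetric Majorana mass matrix of the kind taken as the starting point here has the generic texture below (the paper's specific (α, η, λ) parametrization is not reproduced; this is only the standard form):

        m^{\nu}_{LL} =
        \begin{pmatrix}
          A & B & B \\
          B & C & D \\
          B & D & C
        \end{pmatrix}

    This texture is invariant under the μ ↔ τ (2 ↔ 3) interchange and, before charged-lepton corrections, predicts θ_{23} = 45° and θ_{13} = 0, leaving the solar angle to be fixed by A, B, C, D.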

  8. Culture-Independent Investigation of the Microbiome Associated with the Nematode Acrobeloides maximus

    PubMed Central

    Baquiran, Jean-Paul; Thater, Brian; Sedky, Sammy; De Ley, Paul; Crowley, David; Orwin, Paul M.

    2013-01-01

    Background Symbioses between metazoans and microbes are widespread and vital to many ecosystems. Recent work with several nematode species has suggested that strong associations with microbial symbionts may also be common among members of this phylum. In this work we explore a possible symbiosis between bacteria and the free-living soil bacteriovorous nematode Acrobeloides maximus. Methodology We used a soil microcosm approach to expose A. maximus populations grown monoxenically on RFP-labeled Escherichia coli in a soil slurry. Worms were recovered by density gradient separation and examined using both culture-independent and isolation methods. A 16S rRNA gene survey of the worm-associated bacteria was compared to the soil and to a similar analysis using Caenorhabditis elegans N2. Recovered A. maximus populations were maintained on cholesterol agar and sampled to examine the population dynamics of the microbiome. Results A consistent core microbiome was extracted from A. maximus that differed from those in the bulk soil or the C. elegans associated set. Three genera, Ochrobactrum, Pedobacter, and Chitinophaga, were identified at high levels only in the A. maximus populations, which were less diverse than the assemblage associated with C. elegans. Putative symbiont populations were maintained for at least 4 months post inoculation, although the levels decreased as the culture aged. Fluorescence in situ hybridization (FISH) using probes specific for Ochrobactrum and Pedobacter stained bacterial cells in formaldehyde-fixed nematode guts. Conclusions Three microorganisms were repeatedly observed in association with Acrobeloides maximus when recovered from soil microcosms. We isolated several Ochrobactrum sp. and Pedobacter sp., and demonstrated that they inhabit the nematode gut by FISH. Although their role in A. maximus is not resolved, we propose possible mutualistic roles for these bacteria in protection of the host against pathogens and facilitating enzymatic

  9. Investigating the Use of Quick Response Codes in the Gross Anatomy Laboratory

    ERIC Educational Resources Information Center

    Traser, Courtney J.; Hoffman, Leslie A.; Seifert, Mark F.; Wilson, Adam B.

    2015-01-01

    The use of quick response (QR) codes within undergraduate university courses is on the rise, yet literature concerning their use in medical education is scant. This study examined student perceptions on the usefulness of QR codes as learning aids in a medical gross anatomy course, statistically analyzed whether this learning aid impacted student…

  10. Investigating the use of quick response codes in the gross anatomy laboratory.

    PubMed

    Traser, Courtney J; Hoffman, Leslie A; Seifert, Mark F; Wilson, Adam B

    2015-01-01

    The use of quick response (QR) codes within undergraduate university courses is on the rise, yet literature concerning their use in medical education is scant. This study examined student perceptions on the usefulness of QR codes as learning aids in a medical gross anatomy course, statistically analyzed whether this learning aid impacted student performance, and evaluated whether performance could be explained by the frequency of QR code usage. Question prompts and QR codes tagged on cadaveric specimens and models were available for four weeks as learning aids to medical (n = 155) and doctor of physical therapy (n = 39) students. Each QR code provided answers to posed questions in the form of embedded text or hyperlinked web pages. Students' perceptions were gathered using a formative questionnaire and practical examination scores were used to assess potential gains in student achievement. Overall, students responded positively to the use of QR codes in the gross anatomy laboratory as 89% (57/64) agreed the codes augmented their learning of anatomy. The users' most noticeable objection to using QR codes was the reluctance to bring their smartphones into the gross anatomy laboratory. A comparison between the performance of QR code users and non-users was found to be nonsignificant (P = 0.113), and no significant gains in performance (P = 0.302) were observed after the intervention. Learners welcomed the implementation of QR code technology in the gross anatomy laboratory, yet this intervention had no apparent effect on practical examination performance. PMID:25288343
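
    Producing such a tag is straightforward; the sketch below encodes an answer string into a QR image. The "qrcode" Python package and the station text are assumptions for illustration; the study does not name its generation tooling.

        import qrcode

        # Payload is the answer text itself (the study also used hyperlinked pages)
        answer = "Hypothetical station 12: structure = left vagus nerve"
        img = qrcode.make(answer)           # build the QR code image
        img.save("station_12_answer.png")   # print and tag the specimen or model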

  11. Towards investigation of evolution of dynamical systems with independence of time accuracy: more classes of systems

    NASA Astrophysics Data System (ADS)

    Gurzadyan, V. G.; Kocharyan, A. A.

    2015-07-01

    The recently developed method (Paper 1), enabling one to investigate the evolution of dynamical systems with an accuracy not dependent on time, is developed further. The classes of dynamical systems that can be studied by this method are much extended, now including systems that are: (1) non-Hamiltonian, conservative; (2) Hamiltonian with time-dependent perturbation; (3) non-conservative (with dissipation). These systems cover various types of N-body gravitating systems of astrophysical and cosmological interest, such as the orbital evolution of planets, minor planets and artificial satellites due to tidal and non-tidal perturbations and thermal thrust, evolving close binary stellar systems, and the dynamics of accretion disks.

  12. Investigation of low temperature solid oxide fuel cells for air-independent UUV applications

    NASA Astrophysics Data System (ADS)

    Moton, Jennie Mariko

    Unmanned underwater vehicles (UUVs) will benefit greatly from high energy density (> 500 Wh/L) power systems utilizing high-energy-density fuels and air-independent oxidizers. Current battery-based systems have limited energy densities (< 400 Wh/L), which motivates development of alternative power systems such as solid oxide fuel cells (SOFCs). SOFC-based power systems have the potential to achieve the required UUV energy densities, and the current study explores how SOFCs based on gadolinia-doped ceria (GDC) electrolytes with operating temperatures of 650 °C and lower may operate in the unique environments of a promising UUV power plant. The plant would contain a H2O2 decomposition reactor to supply humidified O2 to the SOFC cathode and an exothermic aluminum/H2O combustor to provide heated humidified H2 fuel to the anode. To characterize low-temperature SOFC performance with these unique O2 and H2 sources, SOFC button cells based on nickel/GDC (Gd0.1Ce0.9O1.95) anodes, GDC electrolytes, and lanthanum strontium cobalt ferrite (La0.6Sr0.4Co0.2Fe0.8O3-δ or LSCF)/GDC cathodes were fabricated and tested for performance and stability with humidity on both the anode and the cathode. Cells were also tested with various reactant concentrations of H2 and O2 to simulate gas depletion down the channel of an SOFC stack. Results showed that anode performance depended primarily on fuel concentration and less on the concentration of the associated increase in product H2O. O2 depletion with humidified cathode flows also caused significant loss in cell current density at a given voltage. With humidified flows on either the anode or the cathode, stability tests of the button cells at 650 °C showed that stable voltage is maintained at low operating current (0.17 A/cm2) at up to 50% H2O by mole, but at higher current densities (0.34 A/cm2) irreversible voltage degradation occurred at rates of 0.8-3.7 mV/hour depending on exposure time. From these button cell results, estimated average

  13. Investigation of ASME code: Section 3, Subsection NB, Suggested revisions: Final report

    SciTech Connect

    Love, J.E.

    1986-11-01

    The nuclear industry has become increasingly aware that engineering standards documents include requirements which are technically inconsistent and in many cases excessively costly to apply. As a result of these concerns, a study of Subsection NB of Section III of the ASME Code was undertaken. Subsection NB addresses Code requirements for Class 1 Components for nuclear power plants. The study was restricted to one subsection with the intent of discovering the extent of the difficulties in the Code. The areas of primary concern were those requirements which have a direct impact on design, fabrication, and examination. Subsection NB was carefully reviewed for inconsistencies and unworkable criteria. Seventy-seven such deficiencies were identified. A preliminary recommendation was prepared for each inconsistency or unworkable criterion. An overall critique of the Subsection was written, and suggestions were made for addressing the problems as part of a major Code revision.

  14. Investigating the structure preserving encryption of high efficiency video coding (HEVC)

    NASA Astrophysics Data System (ADS)

    Shahid, Zafar; Puech, William

    2013-02-01

    This paper presents a novel method for the real-time protection of the emerging High Efficiency Video Coding (HEVC) standard. Structure-preserving selective encryption is performed in the CABAC entropy coding module of HEVC, which differs significantly from the CABAC entropy coding of H.264/AVC. In the CABAC of HEVC, exponential Golomb coding is replaced by truncated Rice (TR) coding up to a specific value for binarization of transform coefficients. Selective encryption is performed using the AES cipher in cipher feedback mode on a plaintext of binstrings in a context-aware manner. The encrypted bitstream has exactly the same bit-rate and is format compliant. Experimental evaluation and security analysis of the proposed algorithm are performed on several benchmark video sequences containing different combinations of motion, texture and objects.
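
    The length-preserving property that keeps the bitstream format-compliant comes from using AES in a stream-like mode: cipher feedback (CFB) produces ciphertext of exactly the plaintext length, so encrypted binstrings can be substituted in place. The sketch below shows only that property on stand-in bytes (the "cryptography" package is an assumption); real CABAC binstring selection is far more involved.

        import os
        from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

        key, iv = os.urandom(16), os.urandom(16)
        enc = Cipher(algorithms.AES(key), modes.CFB(iv)).encryptor()

        def encrypt_binstring(bits: bytes) -> bytes:
            """CFB is a stream mode: output length equals input length, which is
            what preserves the bit-rate and the format compliance."""
            return enc.update(bits)

        coeff_bins = b"\x5a\x3c\x81"               # stand-in for encryptable bins
        print(len(encrypt_binstring(coeff_bins)))  # 3 bytes in, 3 bytes out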

  15. Computer models to support investigations of surface subsidence and associated ground motion induced by underground coal gasification. [STEALTH Codes

    SciTech Connect

    Langland, R.T.; Trent, B.C.

    1981-01-01

    Two computer codes compare surface subsidence induced by underground coal gasification at Hoe Creek, Wyoming, and Centralia, Washington. Calculations with the STEALTH explicit finite-difference code are shown to match equivalent, implicit finite-element method solutions for the removal of underground material. Effects of removing roof material, varying elastic constants, investigating thermal shrinkage, and burning multiple coal seams are studied. A coupled, finite-difference continuum rigid-block caving code is used to model underground opening behavior. Numerical techniques agree qualitatively with empirical studies but, so far, underpredict ground surface displacement. The two methods, numerical and empirical, are most effective when used together. It is recommended that the thermal characteristics of coal measure rock be investigated and that additional calculations be carried out to longer times so that cooling influences can be modeled.

  16. Training camp: The quest to become a new National Institutes of Health (NIH)-funded independent investigator

    NASA Astrophysics Data System (ADS)

    Sklare, Daniel A.

    2003-04-01

    This presentation will provide information on the research training and career development programs of the National Institute on Deafness and Other Communication Disorders (NIDCD). The predoctoral and postdoctoral fellowship (F30, F31, F32) programs and the research career development awards for clinically trained individuals (K08/K23) and for individuals trained in the quantitative sciences and in engineering (K25) will be highlighted. In addition, the role of the NIDCD Small Grant (R03) in transitioning postdoctoral-level investigators to research independence will be underscored.

  17. Investigation of Different Constituent Encoders in a Turbo-code Scheme for Reduced Decoder Complexity

    NASA Technical Reports Server (NTRS)

    Kwatra, S. C.

    1998-01-01

    A large number of papers have been published attempting to give some analytical basis for the performance of Turbo-codes. It has been shown that performance improves with increased interleaver length. Procedures have also been given to pick the best constituent recursive systematic convolutional codes (RSCCs). However, testing by computer simulation is still required to verify these results. This thesis begins by describing the encoding and decoding schemes used. Next, simulation results on several memory-4 RSCCs are shown. It is found that the best BER performance at low E_b/N_0 is not given by the RSCCs found using the analytic techniques published so far. Next, results are given from simulations using a smaller-memory RSCC for one of the constituent encoders. A significant reduction in decoding complexity is obtained with minimal loss in performance. Simulation results are then given for a rate 1/3 Turbo-code, with the result that this code performed as well as a rate 1/2 Turbo-code as measured by the distance from their respective Shannon limits. Finally, the results of simulations where an inaccurate noise variance measurement was used are given. From this it was observed that Turbo-decoding is fairly stable with regard to noise variance measurement.
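
    For readers unfamiliar with the building block being compared, the sketch below implements one memory-4, rate-1/2 RSC encoder using the classic (37, 21) octal generator pair from the turbo-code literature; the thesis's own candidate codes are not reproduced here.

        # Rate-1/2 recursive systematic convolutional encoder, memory 4.
        # Feedback polynomial g0 = 1 + D + D^2 + D^3 + D^4 (octal 37),
        # feedforward polynomial g1 = 1 + D^4 (octal 21).
        def parity(values):
            return sum(values) % 2

        def rsc_encode(u_seq):
            fb_taps = [1, 1, 1, 1]   # taps on registers D..D^4 for the feedback
            ff_taps = [0, 0, 0, 1]   # taps on registers D..D^4 for the parity bit
            reg = [0, 0, 0, 0]
            sys_out, par_out = [], []
            for u in u_seq:
                a = (u + parity([t * r for t, r in zip(fb_taps, reg)])) % 2
                p = (a + parity([t * r for t, r in zip(ff_taps, reg)])) % 2
                sys_out.append(u)        # systematic stream = the input itself
                par_out.append(p)        # parity stream
                reg = [a] + reg[:-1]     # shift the recursive bit into the register
            return sys_out, par_out

        print(rsc_encode([1, 0, 1, 1, 0, 0]))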

  18. Luminal long non-coding RNAs regulated by estrogen receptor alpha in a ligand-independent manner show functional roles in breast cancer

    PubMed Central

    Miano, Valentina; Ferrero, Giulio; Reineri, Stefania; Caizzi, Livia; Annaratone, Laura; Ricci, Laura; Cutrupi, Santina; Castellano, Isabella; Cordero, Francesca; De Bortoli, Michele

    2016-01-01

    Estrogen Receptor alpha (ERα) activation by estrogenic hormones induces luminal breast cancer cell proliferation. However, ERα also performs important hormone-independent functions to maintain the epithelial phenotype of breast tumor cells. We previously reported by RNA-Seq that, in MCF-7 cells in the absence of hormones, ERα down-regulation changes the expression of several genes linked to cellular development, representing a specific subset of estrogen-induced genes. Here, we report the regulation of long non-coding RNAs from the same experimental settings. A list of 133 Apo-ERα-Regulated lncRNAs (AER-lncRNAs) was identified and extensively characterized using published data from cancer cell lines and tumor tissues, or experiments on MCF-7 cells. For several features, we ran validation using cell cultures or fresh tumor biopsies. AER-lncRNAs represent a specific subset, only marginally overlapping estrogen-induced transcripts, whose expression is largely restricted to luminal cells and which is able to perfectly classify breast tumor subtypes. The most abundant AER-lncRNA, DSCAM-AS1, is expressed in ERα+ breast carcinoma, but not in pre-neoplastic lesions, and correlates inversely with EMT markers. Down-regulation of DSCAM-AS1 recapitulated, in part, the effect of silencing ERα, i.e. growth arrest and induction of EMT markers. In conclusion, we report an ERα-dependent lncRNA set representing a novel luminal signature in breast cancer cells. PMID:26621851

  19. Dimensionality of ICA in resting-state fMRI investigated by feature optimized classification of independent components with SVM

    PubMed Central

    Wang, Yanlu; Li, Tie-Qiang

    2015-01-01

    Different machine learning algorithms have recently been used for assisting automated classification of independent component analysis (ICA) results from resting-state fMRI data. The success of this approach relies on identification of artifact components and meaningful functional networks. A limiting factor of ICA is the uncertainty of the number of independent components (NIC). We aim to develop a framework based on support vector machines (SVM) and optimized feature selection for automated classification of independent components (ICs), and use the framework to investigate the effects of the input NIC on the ICA results. Seven different resting-state fMRI datasets were studied. 18 features were devised by mimicking the empirical criteria for manual evaluation. The five most significant (p < 0.01) features were identified by general linear modeling and used to generate a classification model for the framework. This feature-optimized classification of ICs with SVM (FOCIS) framework was used to classify both group and single-subject ICA results. The classification results obtained using FOCIS and the previously published FSL-FIX were compared against manually evaluated results. On average, the false negative rate in identifying artifact-contaminated ICs for FOCIS and FSL-FIX was 98.27% and 92.34%, respectively. The number of artifact and functional network components increased almost linearly with the input NIC. Through tracking, we demonstrate that incrementing NIC affects most ICs when NIC < 33, whereas only a few ICs are affected by direct splitting when NIC is incremented beyond 40. For a given IC, its changes with increasing NIC are individually specific, irrespective of whether the component is a potential resting-state functional network or an artifact component. Using FOCIS, we investigated experimentally the ICA dimensionality of resting-state fMRI datasets and found that the input NIC can critically affect the ICA results of resting-state fMRI data.
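
    Schematically, the FOCIS pipeline reduces to: a feature vector per independent component, selection of the most significant features, and an SVM decision. The sketch below mirrors that shape on random stand-in data (scikit-learn assumed); the paper's 18 engineered features and GLM-based selection are not reproduced.

        import numpy as np
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        X = rng.standard_normal((200, 18))   # 200 ICs x 18 stand-in features
        y = rng.integers(0, 2, 200)          # 1 = functional network, 0 = artifact

        clf = make_pipeline(StandardScaler(),
                            SelectKBest(f_classif, k=5),   # keep 5 best features
                            SVC(kernel="rbf"))
        clf.fit(X, y)
        print(clf.predict(X[:5]))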

  20. Simulative Investigation on Spectral Efficiency of Unipolar Codes based OCDMA System using Importance Sampling Technique

    NASA Astrophysics Data System (ADS)

    Farhat, A.; Menif, M.; Rezig, H.

    2013-09-01

    This paper analyses the spectral efficiency of an Optical Code Division Multiple Access (OCDMA) system using the Importance Sampling (IS) technique. We consider three configurations of the OCDMA system, namely Direct Sequence (DS), Spectral Amplitude Coding (SAC) and Fast Frequency Hopping (FFH), that exploit Fiber Bragg Grating (FBG) based encoders/decoders. We evaluate the spectral efficiency of the considered system by taking into consideration the effect of different families of unipolar codes for both coherent and incoherent sources. The results show that the spectral efficiency of the OCDMA system with a coherent source is higher than in the incoherent case. We also demonstrate that DS-OCDMA outperforms both others in terms of spectral efficiency in all conditions.

  1. THE CODE OF THE STREET AND INMATE VIOLENCE: INVESTIGATING THE SALIENCE OF IMPORTED BELIEF SYSTEMS*

    PubMed Central

    MEARS, DANIEL P.; STEWART, ERIC A.; SIENNICK, SONJA E.; SIMONS, RONALD L.

    2013-01-01

    Scholars have long argued that inmate behaviors stem in part from cultural belief systems that they “import” with them into incarcerative settings. Even so, few empirical assessments have tested this argument directly. Drawing on theoretical accounts of one such set of beliefs—the code of the street—and on importation theory, we hypothesize that individuals who adhere more strongly to the street code will be more likely, once incarcerated, to engage in violent behavior and that this effect will be amplified by such incarceration experiences as disciplinary sanctions and gang involvement, as well as the lack of educational programming, religious programming, and family support. We test these hypotheses using unique data that include measures of the street code belief system and incarceration experiences. The results support the argument that the code of the street belief system affects inmate violence and that the effect is more pronounced among inmates who lack family support, experience disciplinary sanctions, and are gang involved. Implications of these findings are discussed. PMID:24068837

  2. Investigating the Semantic Interoperability of Laboratory Data Exchanged Using LOINC Codes in Three Large Institutions

    PubMed Central

    Lin, Ming-Chin; Vreeman, Daniel J.; Huff, Stanley M.

    2011-01-01

    LOINC codes are seeing increased use in many organizations. In this study, we examined the barriers to semantic interoperability that still exist in electronic data exchange of laboratory results even when LOINC codes are being used as the observation identifiers. We analyzed semantic interoperability of laboratory data exchanged using LOINC codes in three large institutions. To simplify the analytic process, we divided the laboratory data into quantitative and non-quantitative tests. The analysis revealed many inconsistencies even when LOINC codes are used to exchange laboratory data. For quantitative tests, the most frequent problems were inconsistencies in the use of units of measure: variations in the strings used to represent units (unrecognized synonyms), use of units that result in different magnitudes of the numeric quantity, and missing units of measure. For non-quantitative tests, the most frequent problems were acronyms/synonyms, different classes of elements in enumerated lists, and the use of free text. Our findings highlight the limitations of interoperability in current laboratory reporting. PMID:22195138
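
    The quantitative-test failures described here are easy to reproduce in miniature: the same LOINC-coded analyte can arrive with an unrecognized synonym of the expected unit string or with a unit of a different magnitude. The toy normalization table below illustrates the problem; it is not a proposed solution.

        # Toy unit-normalization map: canonical unit plus multiplicative factor.
        UNIT_NORMALIZATION = {
            "mg/dL": ("mg/dL", 1.0),
            "mg/dl": ("mg/dL", 1.0),     # case variant (unrecognized synonym)
            "g/L":   ("mg/dL", 100.0),   # same analyte, different magnitude
        }

        def normalize(value, unit):
            if unit not in UNIT_NORMALIZATION:
                # missing/unmapped units of measure: another reported failure mode
                raise ValueError(f"unmapped unit: {unit!r}")
            canonical, factor = UNIT_NORMALIZATION[unit]
            return value * factor, canonical

        print(normalize(1.1, "g/L"))   # -> (110.0, 'mg/dL')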

  3. Write to Read: Investigating the Reading-Writing Relationship of Code-Level Early Literacy Skills

    ERIC Educational Resources Information Center

    Jones, Cindy D.; Reutzel, D. Ray

    2015-01-01

    The purpose of this study was to examine whether the code-related features used in current methods of writing instruction in kindergarten classrooms transfer reading outcomes for kindergarten students. We randomly assigned kindergarten students to 3 instructional groups: a writing workshop group, an interactive writing group, and a control group.…

  4. Investigation of independence in inter-animal tumor-type occurrences within the NTP rodent-bioassay database

    SciTech Connect

    Bogen, K.T.; Seilkop, S.

    1993-05-01

    Statistically significant elevation in tumor incidence at multiple histologically distinct sites is occasionally observed among rodent bioassays of chemically induced carcinogenesis. If such data are to be relied on (as they have been, e.g., by the US EPA) for quantitative cancer potency assessment, their proper analysis requires a knowledge of the extent to which multiple tumor-type occurrences are independent or uncorrelated within individual bioassay animals. Although difficult to assess in a statistically rigorous fashion, a few significant associations among tumor-type occurrences in rodent bioassays have been reported. However, no comprehensive studies of animal-specific tumor-type occurrences at death or sacrifice have been conducted using the extensive set of available NTP rodent-bioassay data, on which most cancer-potency assessment for environmental chemicals is currently based. This report presents the results of such an analysis conducted on behalf of the National Research Council's Committee on Risk Assessment for Hazardous Air Pollutants. Tumor-type associations among individual animals were examined for {approximately}2500 to 3000 control and {approximately}200 to 600 treated animals using pathology data from 62 B6C3F1 mouse studies and 61 F/344N rat studies obtained from a readily available subset of the NTP carcinogenesis bioassay database. No evidence was found for any large correlation in either the onset probability or the prevalence-at-death or sacrifice of any tumor-type pair investigated in control and treated rats and mice, although a few of the small correlations present were statistically significant. Tumor-type occurrences were in most cases nearly independent, and departures from independence, where they did occur, were small. This finding is qualified in that tumor-type onset correlations were measured only indirectly, given the limited nature of the data analyzed.
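
    One simple way to probe pairwise independence of this kind is to build a 2x2 contingency table per tumor-type pair across animals and apply Fisher's exact test. The counts below are invented, and the report's actual onset/prevalence analysis was more involved.

        from scipy.stats import fisher_exact

        # rows: tumor A present/absent; columns: tumor B present/absent
        table = [[12, 88],
                 [30, 870]]
        odds_ratio, p_value = fisher_exact(table)
        print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3g}")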

  5. Preliminary investigation of acoustic bar codes for short-range underwater communications

    NASA Astrophysics Data System (ADS)

    Jones, Dennis F.

    2005-09-01

    In March 2005, underwater acoustic communications experiments were carried out from the DRDC Atlantic research vessel CFAV QUEST. A battery-operated BATS20 transmitter and a broadband barrel-stave flextensional transducer were used to broadcast noise containing acoustic bar code (ABC) information. The ABCs are silent frequency bands of fixed duration that resemble retail bar codes when viewed in a spectrogram. Two sites were selected for the experiments. The first was a shallow-water area west of the Berry Islands in the Bahamas, and the second was a deep-water site south of the Western Bank on the Scotian Shelf. Two receiver systems were deployed: autonomous, variable-buoyancy Stealth Buoys resting on the bottom at the shallow site, and drifting AN/SSQ-53F sonobuoys fitted with GPS at the deep site. Results from these experiments will be presented and future work will be discussed.
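
    The signal structure is simple to emulate: broadband noise with silent bands notched out at code-defined frequencies, which show up as dark "bars" in a spectrogram. The band plan and sample rate below are illustrative, not the DRDC design.

        import numpy as np

        fs = 48_000
        noise = np.random.default_rng(2).standard_normal(fs)   # 1 s of white noise
        spec = np.fft.rfft(noise)
        freqs = np.fft.rfftfreq(noise.size, 1 / fs)

        silent_bands = [(4000, 4500), (6000, 6500), (9000, 9500)]  # the "bars"
        for lo, hi in silent_bands:
            spec[(freqs >= lo) & (freqs <= hi)] = 0.0              # notch = silence

        signal = np.fft.irfft(spec, n=noise.size)  # coded noise ready to broadcast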

  6. Investigating the accuracy of the FLUKA code for transport of therapeutic ion beams in matter.

    PubMed

    Sommerer, Florian; Parodi, Katia; Ferrari, Alfredo; Poljanc, Karin; Enghardt, Wolfgang; Aiginger, Hannes

    2006-09-01

    In-beam positron emission tomography (PET) is currently used for monitoring the dose delivery at the heavy ion therapy facility at GSI Darmstadt. The method is based on the fact that carbon ions produce positron emitting isotopes in fragmentation reactions with the atomic nuclei of the tissue. The relation between dose and beta(+)-activity is not straightforward. Hence it is not possible to infer the delivered dose directly from the PET distribution. To overcome this problem and enable therapy monitoring, beta(+)-distributions are simulated on the basis of the treatment plan and compared with the measured ones. Following the positive clinical impact, it is planned to apply the method at future ion therapy facilities, where beams from protons up to oxygen nuclei will be available. A simulation code capable of handling all these ions and predicting the irradiation-induced beta(+)-activity distributions is desirable. An established and general purpose radiation transport code is preferred. FLUKA is a candidate for such a code. For application to in-beam PET therapy monitoring, the code has to model with high accuracy both the electromagnetic and nuclear interactions responsible for dose deposition and beta(+)-activity production, respectively. In this work, the electromagnetic interaction in FLUKA was adjusted to reproduce the same particle range as from the experimentally validated treatment planning software TRiP, used at GSI. Furthermore, projectile fragmentation spectra in water targets have been studied in comparison to available experimental data. Finally, cross sections for the production of the most abundant fragments have been calculated and compared to values found in the literature. PMID:16912388

  7. PNAS Plus: Independent evaluation of conflicting microspherule results from different investigations of the Younger Dryas impact hypothesis

    NASA Astrophysics Data System (ADS)

    LeCompte, Malcolm A.; Goodyear, Albert C.; Demitroff, Mark N.; Batchelor, Dale; Vogel, Edward K.; Mooney, Charles; Rock, Barrett N.; Seidel, Alfred W.

    2012-10-01

    Firestone et al. sampled sedimentary sequences at many sites across North America, Europe, and Asia [Firestone RB, et al. (2007) Proc Natl Acad Sci USA 104:16016-16021]. In sediments dated to the Younger Dryas onset or Boundary (YDB) approximately 12,900 calendar years ago, Firestone et al. reported discovery of markers, including nanodiamonds, aciniform soot, high-temperature melt-glass, and magnetic microspherules, attributed to cosmic impacts/airbursts. The microspherules were explained as either cosmic material ablation or terrestrial ejecta from a hypothesized North American impact that initiated the abrupt Younger Dryas cooling, contributed to megafaunal extinctions, and triggered human cultural shifts and population declines. A number of independent groups have confirmed the presence of YDB spherules, but two have not. One of them [Surovell TA, et al. (2009) Proc Natl Acad Sci USA 106:18155-18158] collected and analyzed samples from seven YDB sites, purportedly using the same protocol as Firestone et al., but did not find a single spherule in YDB sediments at two previously reported sites. To examine this discrepancy, we conducted an independent blind investigation of two sites common to both studies, and a third site investigated only by Surovell et al. We found abundant YDB microspherules at all three widely separated sites, consistent with the results of Firestone et al., and conclude that the analytical protocol employed by Surovell et al. deviated significantly from that of Firestone et al. Morphological and geochemical analyses of YDB spherules suggest they are not cosmic, volcanic, authigenic, or anthropogenic in origin. Instead, they appear to have formed from abrupt melting and quenching of terrestrial materials.

  8. Safety Related Investigations of the VVER-1000 Reactor Type by the Coupled Code System TRACE/PARCS

    NASA Astrophysics Data System (ADS)

    Jaeger, Wadim; Espinoza, Victor Hugo Sánchez; Lischke, Wolfgang

    This study was performed at the Institute of Reactor Safety at the Forschungszentrum Karlsruhe. It is embedded in the ongoing investigations of the international Code Assessment and Maintenance Program (CAMP) for qualification and validation of system codes like TRACE(1) and PARCS(2). The reactor type chosen to validate these two codes was the Russian-designed VVER-1000, because the OECD/NEA VVER-1000 Coolant Transient Benchmark Phase 2(3) includes detailed information on the Bulgarian nuclear power plant (NPP) Kozloduy unit 6. The post-test investigations of a coolant mixing experiment have shown that the predicted parameters (coolant temperature, pressure drop, etc.) are in good agreement with the measured data. The coolant mixing pattern, especially in the downcomer, has also been reproduced quite well by TRACE. The coupled code system TRACE/PARCS, which was applied to a postulated main steam line break (MSLB), provided good results compared to reference values and to those of other participants in the benchmark. The results show that the developed three-dimensional nodalization of the reactor pressure vessel (RPV) is appropriate for describing the coolant mixing phenomena in the downcomer and the lower plenum of a VVER-1000 reactor. This phenomenon is a key issue for investigations of MSLB transients, where the thermal hydraulics and the core neutronics are strongly linked. The simulation of the RPV and core behavior for postulated transients using the validated 3D TRACE RPV model, taking into account boundary conditions at the vessel in- and outlet, indicates that the results are physically sound and in good agreement with other participants' results.

  9. Flight investigation of cockpit-displayed traffic information utilizing coded symbology in an advanced operational environment

    NASA Technical Reports Server (NTRS)

    Abbott, T. S.; Moen, G. C.; Person, L. H., Jr.; Keyser, G. L., Jr.; Yenni, K. R.; Garren, J. F., Jr.

    1980-01-01

    Traffic symbology was encoded to provide additional information concerning the traffic, which was displayed on the pilot's electronic horizontal situation indicators (EHSI). A research airplane representing an advanced operational environment was used to assess the benefit of coded traffic symbology in a realistic workload environment. Traffic scenarios, involving both conflict-free and conflict situations, were employed. Subjective pilot commentary was obtained through the use of a questionnaire and extensive pilot debriefings. These results grouped conveniently into two categories: display factors and task performance. A major item under the display-factor category was the problem of display clutter. The primary contributors to clutter were the use of large map-scale factors, the use of traffic data blocks, and the presentation of more than a few airplanes. In terms of task performance, the cockpit-displayed traffic information was found to provide excellent overall situation awareness. Additionally, the pilots were able to maintain the separation prescribed during these tests.

  10. Investigation on Coding Method of Dental X-ray Image for Integrated Hospital Information System

    NASA Astrophysics Data System (ADS)

    Seki, Takashi; Hamamoto, Kazuhiko

    Recently, medical information systems in the dental field have been moving to digital systems. In such a system, X-ray images can be acquired with a digital modality and input to the system directly. Consequently, it is easy to combine the image data with the alpha-numerical data stored in the conventional medical information system, and it is useful to manipulate alpha-numerical data and image data simultaneously. The purpose of this research is to develop a new coding method for dental X-ray images. The method reduces the disk space needed to store the images and allows the images to be transmitted efficiently over the Internet or a LAN. We apply multi-resolution analysis (the wavelet transform) to accomplish this purpose. The proposed method achieves a lower bit rate than the conventional method.
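
    As a rough illustration of the multi-resolution idea above (not the authors' actual coder), the sketch below applies a single-level 2-D Haar transform to an image array and counts how many detail coefficients survive a threshold; discarding the rest is what yields the bit-rate reduction. The image, threshold, and transform depth are all hypothetical choices.

      import numpy as np

      def haar2d(img):
          # Single-level 2-D Haar transform: approximation + 3 detail subbands.
          a = img[0::2, 0::2].astype(float)
          b = img[0::2, 1::2].astype(float)
          c = img[1::2, 0::2].astype(float)
          d = img[1::2, 1::2].astype(float)
          ll = (a + b + c + d) / 2.0  # approximation (kept at full precision)
          lh = (a - b + c - d) / 2.0  # horizontal detail
          hl = (a + b - c - d) / 2.0  # vertical detail
          hh = (a - b - c + d) / 2.0  # diagonal detail
          return ll, lh, hl, hh

      img = np.random.default_rng(0).integers(0, 256, (256, 256))  # stand-in X-ray
      ll, lh, hl, hh = haar2d(img)
      thr = 8.0  # hypothetical threshold; larger -> lower bit rate, more loss
      kept = sum(int(np.count_nonzero(np.abs(s) > thr)) for s in (lh, hl, hh))
      print(f"detail coefficients kept: {kept} of {3 * ll.size}")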

  11. MDMC2: A molecular dynamics code for investigating the fragmentation dynamics of multiply charged clusters

    NASA Astrophysics Data System (ADS)

    Bonhommeau, David A.; Gaigeot, Marie-Pierre

    2014-02-01

    MDMC2 is a parallel code for performing molecular dynamics simulations on multiply charged clusters. It is a valuable complement to MCMC2, a program devoted to Monte Carlo simulations of multiply charged clusters in the NVT ensemble (Bonhommeau and Gaigeot, 2013). Both the MCMC2 and MDMC2 codes employ a mesoscopic coarse-grained simplified representation of the clusters (or droplets): these clusters are composed of neutral and charged spherical particles/grains that may be polarisable. One grain can be either neutral or charged. The interaction potential is a sum of 2-body Lennard-Jones potentials (the main cohesive contribution) and electrostatic terms (the repulsive contribution), possibly supplemented by N-body polarisation interactions. There is no restriction imposed on the values of the particle charges and/or polarisabilities. An external field can also be applied to the whole system. The derivatives of the potential-energy surface are determined analytically, which ensures an accurate integration of the classical equations of motion by a velocity Verlet algorithm. Conservation rules, such as energy conservation or centre-of-mass linear momentum conservation, can be checked continually during the simulation. The program also provides statistical information on the run, as well as configuration files that can be used for data post-processing. MDMC2 is provided with a serial conjugate gradient program, called CGMC2, that uses the same analytical derivatives as MDMC2 and was found useful for probing the minima of the energy landscape explored during Monte Carlo or molecular dynamics simulations performed on multiply charged clusters.
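
    To make the integration scheme concrete, here is a minimal velocity Verlet step under a pairwise Lennard-Jones plus Coulomb potential, in reduced units with unit masses. This is an illustrative sketch, not the MDMC2 implementation; polarisation terms and the external field are omitted.

      import numpy as np

      def forces(pos, q, eps=1.0, sig=1.0):
          # Pairwise Lennard-Jones (cohesive) + Coulomb (repulsive) forces.
          f = np.zeros_like(pos)
          n = len(pos)
          for i in range(n):
              for j in range(i + 1, n):
                  r = pos[i] - pos[j]
                  d = np.sqrt(r @ r)
                  flj = 24 * eps * (2 * sig**12 / d**13 - sig**6 / d**7)
                  fc = q[i] * q[j] / d**2          # like charges repel
                  fij = (flj + fc) * r / d         # force on particle i
                  f[i] += fij
                  f[j] -= fij
          return f

      def verlet_step(pos, vel, q, dt=1e-3):
          # One velocity Verlet step with unit masses (reduced units).
          f = forces(pos, q)
          vel_half = vel + 0.5 * dt * f
          pos_new = pos + dt * vel_half
          vel_new = vel_half + 0.5 * dt * forces(pos_new, q)
          return pos_new, vel_new

      rng = np.random.default_rng(1)
      pos = rng.normal(scale=2.0, size=(8, 3))     # 8 grains in 3-D
      vel = np.zeros((8, 3))
      q = np.array([1.0, 1.0, 0, 0, 0, 0, 0, 0])   # two charged grains
      pos, vel = verlet_step(pos, vel, q)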

  12. Investigations of high-speed optical transmission systems employing Absolute Added Correlative Coding (AACC)

    NASA Astrophysics Data System (ADS)

    Dong-Nhat, Nguyen; Elsherif, Mohamed A.; Malekmohammadi, Amin

    2016-07-01

    A novel multilevel modulation format based on partial-response signaling, called Absolute Added Correlative Coding (AACC), is proposed and numerically demonstrated for high-speed fiber-optic communication systems. A bit error rate (BER) estimation model for the proposed multilevel format has also been developed. The performance of AACC is examined and compared against other prevailing on-off-keying and multilevel modulation formats, e.g., non-return-to-zero (NRZ), 50% return-to-zero (RZ), 67% carrier-suppressed return-to-zero (CS-RZ), duobinary, and four-level pulse-amplitude modulation (4-PAM), in terms of receiver sensitivity, spectral efficiency, and dispersion tolerance. The calculated receiver sensitivity at a BER of 10^-9 and the chromatic dispersion tolerance of the proposed system are ∼-28.3 dBm and ∼336 ps/nm, respectively. The receiver sensitivity of AACC is shown to be 7.8 dB better than that of 4-PAM in the back-to-back scenario. The comparison results also show a clear advantage of AACC in achieving longer fiber transmission distances due to its higher dispersion tolerance in optical access networks.
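
    For readers unfamiliar with partial-response signaling, the sketch below implements classic duobinary encoding, one of the correlative-coding baselines compared in the abstract: each transmitted level is the sum of the current and previous precoded bits, giving a three-level signal that is decodable symbol by symbol. AACC's exact level mapping differs and is not reproduced here.

      import numpy as np

      def duobinary_encode(bits):
          # Differential precoding avoids error propagation at the receiver.
          pre = np.zeros(len(bits), dtype=int)
          prev = 0
          for n, x in enumerate(bits):
              prev = int(x) ^ prev
              pre[n] = prev
          # Correlative filter y[n] = b[n] + b[n-1] gives levels 0, 1, 2.
          return pre + np.concatenate(([0], pre[:-1]))

      def duobinary_decode(levels):
          # Modulo-2 rule: levels 0 and 2 -> bit 0, level 1 -> bit 1.
          return np.asarray(levels) % 2

      bits = np.random.default_rng(7).integers(0, 2, 12)
      assert np.array_equal(duobinary_decode(duobinary_encode(bits)), bits)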

  13. Theoretical models and simulation codes to investigate bystander effects and cellular communication at low doses

    NASA Astrophysics Data System (ADS)

    Ballarini, F.; Alloni, D.; Facoetti, A.; Mairani, A.; Nano, R.; Ottolenghi, A.

    Astronauts in space are continuously exposed to low doses of ionizing radiation from Galactic Cosmic Rays. During the last ten years, the effects of low radiation doses have been widely re-discussed following a large number of observations on so-called non-targeted effects, in particular bystander effects. The latter consist of the induction of cytogenetic damage in cells not directly traversed by radiation, most likely as a response to molecular messengers released by directly irradiated cells. Bystander effects, which are observed both for lethal endpoints (e.g., clonogenic inactivation and apoptosis) and for non-lethal ones (e.g., mutations and neoplastic transformation), tend to show non-linear dose responses. This might have significant consequences in terms of low-dose risk, which is generally calculated on the basis of the Linear No-Threshold hypothesis. Although the mechanisms underlying bystander effects are still largely unknown, it is now clear that two types of cellular communication (i.e., via gap junctions and/or release of molecular messengers into the extracellular environment) play a fundamental role. Theoretical models and simulation codes can be of help in elucidating such mechanisms. In the present paper, we review the different available modelling approaches, including one that is being developed at the University of Pavia. The focus will be on the different assumptions adopted by the various authors and on the implications of such assumptions in terms of non-targeted radiobiological damage and, more generally, low-dose risk.
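
    As a generic illustration of why a bystander component can make low-dose responses non-linear (this is a toy model, not the Pavia code or any published parameterization), consider a response that adds a linear direct-damage term to a bystander term that saturates at very low doses:

      import numpy as np

      def toy_response(dose, alpha=0.05, b_max=0.02, d0=0.1):
          # Linear direct term plus a bystander term saturating near dose d0.
          return alpha * dose + b_max * (1.0 - np.exp(-dose / d0))

      for d in (0.01, 0.1, 1.0):
          lnt = toy_response(1.0) * d   # Linear No-Threshold extrapolation
          print(f"dose {d:5.2f} Gy: model {toy_response(d):.4f}  LNT {lnt:.4f}")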

  14. Diagnostic investigations of DKK-1 and PDCD5 expression levels as independent prognostic markers of human chondrosarcoma.

    PubMed

    Zarea, Mojtaba; Mohammadian Bajgiran, Amirhossein; Sedaghati, Farnoush; Hatami, Negin; Taheriazam, Afshin; Yahaghi, Emad; Shakeri, Mohammadreza

    2016-07-01

    In this study, we investigated the expression levels of Dickkopf-1 (DKK-1) and programmed cell death 5 (PDCD5) by using quantitative real-time PCR and immunohistochemistry in patients with chondrosarcoma. The DKK-1 mRNA levels were significantly higher in chondrosarcoma than in the corresponding nontumor tissues (mean ± SD: 4.23 ± 1.54 vs. 1.54 ± 0.87; P = 0.001). PDCD5 mRNA levels were markedly decreased in tumor tissues compared with corresponding nontumor tissues (mean ± SD: 1.94 ± 0.73 vs. 5.42 ± 1.73; P = 0.001). High and moderate DKK-1 expression was observed in 60% of chondrosarcoma samples, in comparison with 27.5% of corresponding nontumor tissues (P = 0.001). Moreover, low expression of PDCD5 was found in 67.5% of the tumor tissues, compared with 32.5% of the nontumor tissues (P = 0.002). The results of this study showed that high DKK-1 expression levels were strongly related to MSTS stage (P = 0.011) and the advancement of histological grade (P < 0.001). Furthermore, the PDCD5 expression levels were correlated with histological grade (P < 0.001), MSTS stage (P = 0.016), and distant metastasis (P = 0.001). Kaplan-Meier and log-rank survival analyses showed that high DKK-1 levels and low PDCD5 levels were associated with shorter overall survival (log-rank test P < 0.001). PDCD5 levels, histological grade, and tumor stage were independent predictors of overall survival. In conclusion, DKK-1 and PDCD5 can be independent predictors of overall survival in patients suffering from chondrosarcoma. © 2016 IUBMB Life, 68(7):597-601, 2016. PMID:27255549
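
    The survival comparison reported above is a Kaplan-Meier estimate with a log-rank test; the sketch below shows that analysis with the third-party lifelines package, using synthetic stand-in data rather than the study's data.

      import numpy as np
      from lifelines import KaplanMeierFitter
      from lifelines.statistics import logrank_test

      rng = np.random.default_rng(0)
      t_high = rng.exponential(24, 40)   # follow-up months, "high DKK-1" group
      t_low = rng.exponential(48, 40)    # follow-up months, "low DKK-1" group
      e_high = rng.integers(0, 2, 40)    # 1 = death observed, 0 = censored
      e_low = rng.integers(0, 2, 40)

      kmf = KaplanMeierFitter()
      kmf.fit(t_high, event_observed=e_high, label="DKK-1 high")
      print(kmf.median_survival_time_)

      res = logrank_test(t_high, t_low, event_observed_A=e_high, event_observed_B=e_low)
      print(f"log-rank P = {res.p_value:.4f}")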

  15. National evaluation of the benefits and risks of greater structuring and coding of the electronic health record: exploratory qualitative investigation

    PubMed Central

    Morrison, Zoe; Fernando, Bernard; Kalra, Dipak; Cresswell, Kathrin; Sheikh, Aziz

    2014-01-01

    Objective We aimed to explore stakeholder views, attitudes, needs, and expectations regarding likely benefits and risks resulting from increased structuring and coding of clinical information within electronic health records (EHRs). Materials and methods Qualitative investigation in primary and secondary care and research settings throughout the UK. Data were derived from interviews, expert discussion groups, observations, and relevant documents. Participants (n=70) included patients, healthcare professionals, health service commissioners, policy makers, managers, administrators, systems developers, researchers, and academics. Results Four main themes arose from our data: variations in documentation practice; patient care benefits; secondary uses of information; and informing and involving patients. We observed a lack of guidelines, co-ordination, and dissemination of best practice relating to the design and use of information structures. While we identified immediate benefits for direct care and secondary analysis, many healthcare professionals did not see the relevance of structured and/or coded data to clinical practice. The potential for structured information to increase patient understanding of their diagnosis and treatment contrasted with concerns regarding the appropriateness of coded information for patients. Conclusions The design and development of EHRs requires the capture of narrative information to reflect patient/clinician communication and computable data for administration and research purposes. Increased structuring and/or coding of EHRs therefore offers both benefits and risks. Documentation standards within clinical guidelines are likely to encourage comprehensive, accurate processing of data. As data structures may impact upon clinician/patient interactions, new models of documentation may be necessary if EHRs are to be read and authored by patients. PMID:24186957

  16. Condition Self-Management in Pediatric Spina Bifida: A Longitudinal Investigation of Medical Adherence, Responsibility-Sharing, and Independence Skills

    PubMed Central

    Psihogios, Alexandra M.; Kolbuck, Victoria

    2015-01-01

    Objective This study aimed to evaluate rates of medical adherence, responsibility, and independence skills across late childhood and adolescence in youth with spina bifida (SB) and to explore associations among these disease self-management variables. Method 111 youth with SB, their parents, and a health professional participated at two time points. Informants completed questionnaires regarding medical adherence, responsibility-sharing, and child independence skills. Results Youth gained more responsibility and independence skills across time, although adherence rates did not follow a similar trajectory. Increased child medical responsibility was related to poorer adherence, and father-reported independence skills were associated with increased child responsibility. Conclusions This study highlights medical domains that are the most difficult for families to manage (e.g., skin checks). Although youth appear to gain more autonomy across time, ongoing parental involvement in medical care may be necessary to achieve optimal adherence across adolescence. PMID:26002195

  17. Investigation of wellbore cooling by circulation and fluid penetration into the formation using a wellbore thermal simulator computer code

    SciTech Connect

    Duda, L.E.

    1987-01-01

    The high temperatures of geothermal wells present severe problems for drilling, logging, and developing these reservoirs. Cooling the wellbore is perhaps the most common method to solve these problems. However, it is usually not clear what may be the most effective wellbore cooling mechanism for a given well. In this paper, wellbore cooling by the use of circulation or by fluid injection into the surrounding rock is investigated using a wellbore thermal simulator computer code. Short circulation times offer no prolonged cooling of the wellbore, but long circulation times (greater than ten or twenty days) greatly reduce the warming rate after shut-in. The dependence of the warming rate on the penetration distance of cooler temperatures into the rock formation (as by fluid injection) is investigated. Penetration distances of greater than 0.6 m appear to offer a substantial reduction in the warming rate. Several plots are shown which demonstrate these effects.

  18. Error-correction coding

    NASA Technical Reports Server (NTRS)

    Hinds, Erold W. (Principal Investigator)

    1996-01-01

    This report describes the progress made towards the completion of a specific task on error-correcting coding. The proposed research consisted of investigating the use of modulation block codes as the inner code of a concatenated coding system in order to improve the overall space link communications performance. The study proposed to identify and analyze candidate codes that will complement the performance of the overall coding system which uses the interleaved RS (255,223) code as the outer code.
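
    For concreteness, the RS(255,223) outer code mentioned here adds 32 parity symbols per 223-byte block and corrects up to 16 symbol errors. The sketch below exercises it with the third-party reedsolo package (assumed available; the decode return format varies slightly across versions):

      from reedsolo import RSCodec

      rsc = RSCodec(32)              # 32 parity symbols -> RS(255,223) over GF(256)
      msg = bytes(range(223))
      codeword = rsc.encode(msg)     # 255-byte codeword

      corrupted = bytearray(codeword)
      for i in (3, 50, 100, 200):    # inject 4 symbol errors (up to 16 correctable)
          corrupted[i] ^= 0xFF

      decoded = rsc.decode(bytes(corrupted))[0]  # first element: corrected message
      assert bytes(decoded) == msg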

  19. Investigation of Nuclear Data Libraries with TRIPOLI-4 Monte Carlo Code for Sodium-cooled Fast Reactors

    NASA Astrophysics Data System (ADS)

    Lee, Y.-K.; Brun, E.

    2014-04-01

    The sodium-cooled fast neutron reactor ASTRID is currently under design and development in France. The traditional ECCO/ERANOS fast reactor code system used for ASTRID core design calculations relies on the multi-group JEFF-3.1.1 data library. To gauge the use of the ENDF/B-VII.0 and JEFF-3.1.1 nuclear data libraries in fast reactor applications, two recent OECD/NEA computational benchmarks specified by Argonne National Laboratory were calculated. Using the continuous-energy TRIPOLI-4 Monte Carlo transport code, both the ABR-1000 MWth MOX core and the metallic (U-Pu) core were investigated. Under the two different fast neutron spectra and the two data libraries, ENDF/B-VII.0 and JEFF-3.1.1, reactivity impact studies were performed. Using the JEFF-3.1.1 library under the BOEC (beginning of equilibrium cycle) condition, high reactivity effects of 808 ± 17 pcm and 1208 ± 17 pcm were observed for the ABR-1000 MOX core and the metallic core, respectively. To analyze the causes of these differences in reactivity, several TRIPOLI-4 runs using the mixed data libraries feature allowed us to identify the nuclides and the nuclear data accounting for the major part of the observed reactivity discrepancies.
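
    The reactivity effects quoted in pcm follow from pairs of multiplication factors computed with the two libraries. A minimal sketch of that conversion, with made-up k-eff values:

      def reactivity_pcm(k1, k2):
          # Reactivity difference between two k-eff values, in pcm (1e-5).
          return (k2 - k1) / (k1 * k2) * 1e5

      k_a, k_b = 1.00150, 1.00965    # hypothetical k-eff from two libraries
      print(f"library reactivity effect: {reactivity_pcm(k_a, k_b):.0f} pcm")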

  1. Performance investigation of the pulse and Campbelling modes of a fission chamber using a Poisson pulse train simulation code

    NASA Astrophysics Data System (ADS)

    Elter, Zs.; Jammes, C.; Pázsit, I.; Pál, L.; Filliatre, P.

    2015-02-01

    The detectors of the neutron flux monitoring system of the foreseen French GEN-IV sodium-cooled fast reactor (SFR) will be high-temperature fission chambers placed in the reactor vessel in the vicinity of the core. The operation of a fission chamber over a wide range of neutron flux will be feasible provided that the domains of applicability of its pulse and Campbelling operational modes overlap. This paper addresses the question of the linearity of these two modes and also presents our recent efforts to develop a specific code for the simulation of fission chamber pulse trains. The developed simulation code is described and its overall verification is shown. An extensive quantitative investigation was performed to explore the applicability limits of these two standard modes. It was found that for short pulses the overlap between the pulse and Campbelling modes can be guaranteed if the standard deviation of the background noise is not higher than 5% of the pulse amplitude. It was also shown that the Campbelling mode is sensitive to parasitic noise, while the performance of the pulse mode is affected by stochastic amplitude distributions.
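
    The Campbelling (variance) mode rests on Campbell's theorem: for a Poisson pulse train with pulse shape h(t), the signal variance equals the count rate times the integral of h(t) squared, so the rate can be recovered from the fluctuations alone. A self-contained sketch of that estimate on a simulated pulse train (all parameters hypothetical):

      import numpy as np

      rng = np.random.default_rng(42)
      rate, T, dt = 2.0e6, 2.0e-3, 1.0e-9     # counts/s, record length s, step s
      tau = 20e-9
      shape = np.exp(-np.arange(0.0, 10 * tau, dt) / tau)  # pulse shape h(t)

      signal = np.zeros(int(round(T / dt)))
      for i in rng.integers(0, len(signal), rng.poisson(rate * T)):
          seg = signal[i:i + len(shape)]      # superpose a pulse at a random time
          seg += shape[:len(seg)]

      # Campbell's theorem: Var[s] = rate * integral of h(t)^2 dt
      est = signal.var() / (np.sum(shape**2) * dt)
      print(f"true rate {rate:.3e}, Campbelling estimate {est:.3e}")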

  2. Independent assessment of TRAC-PF1 (Version 7.0), RELAP5/MOD1 (Cycle 14), and TRAC-BD1 (Version 12.0) codes using separate-effects experiments

    SciTech Connect

    Saha, P; Jo, J H; Neymotin, L; Rohatgi, U S; Slovik, G C; Yuelys-Miksis, C

    1985-08-01

    This report presents the results of the independent code assessment conducted at BNL. The TRAC-PF1 (Version 7.0) and RELAP5/MOD1 (Cycle 14) codes were assessed using critical flow tests, a level swell test, countercurrent flow limitation (CCFL) tests, a post-CHF test, steam generator thermal performance tests, and natural circulation tests. TRAC-BD1 (Version 12.0) was applied only to the CCFL and post-CHF tests. The TRAC-PWR series of codes, i.e., TRAC-P1A, TRAC-PD2, and TRAC-PF1, has been gradually improved. However, TRAC-PF1 appears to need improvement in almost all categories of tests/phenomena attempted at BNL. Of the two codes, TRAC-PF1 and RELAP5/MOD1, the latter needs more improvement, particularly in the areas of CCFL, level swell, CHF correlation and post-CHF heat transfer, and numerical stability. For the CCFL and post-CHF tests, TRAC-BD1 provides the best overall results. However, the TRAC-BD1 interfacial shear package for the countercurrent annular flow regime needs further improvement for better prediction of the CCFL phenomenon. 47 refs., 87 figs., 15 tabs.

  3. Why comply with a code of ethics?

    PubMed

    Spielthenner, Georg

    2015-05-01

    A growing number of professional associations and occupational groups are creating codes of ethics with the goal of guiding their members, protecting service users, and safeguarding the reputation of the profession. There is a great deal of literature dealing with the extent to which ethical codes can achieve their desired objectives. The present paper does not contribute to this debate. Its aim is rather to investigate how rational it is to comply with codes of conduct. It is natural and virtually inevitable for a reflective person to ask why one should pay any attention to ethical codes, in particular if following a code is not in one's own interest. To achieve the aim of this paper, I shall (in the "Quasi-reasons for complying with an ethical code" section) discuss reasons that only appear to be reasons for complying with a code. In the "Code-independent reasons" section, I present genuine practical reasons that, however, turn out to be reasons of the wrong kind. Finally, the "Code-dependent reasons" section presents the most important reasons for complying with ethical codes. The paper argues that while ethical codes do not necessarily yield reasons for action, professionals can have genuine reasons for complying with a code, which may, however, be rather weak and easily overridden by reasons for deviating from the code. PMID:25185873

  4. Industry and Occupation in the Electronic Health Record: An Investigation of the National Institute for Occupational Safety and Health Industry and Occupation Computerized Coding System

    PubMed Central

    2016-01-01

    Background Inclusion of information about a patient's work, industry, and occupation in the electronic health record (EHR) could facilitate occupational health surveillance, better health outcomes, prevention activities, and identification of workers' compensation cases. The US National Institute for Occupational Safety and Health (NIOSH) has developed an autocoding system for "industry" and "occupation" based on 1990 Bureau of the Census codes; its effectiveness requires evaluation in conjunction with promoting the mandatory addition of these variables to the EHR. Objective The objective of the study was to evaluate the intercoder reliability of NIOSH's Industry and Occupation Computerized Coding System (NIOCCS) when applied to data collected in a community survey conducted under the Affordable Care Act, and to determine the proportion of records that are autocoded using NIOCCS. Methods Standard Occupational Classification (SOC) codes are used by several federal agencies in databases that capture demographic, employment, and health information to harmonize variables related to work activities among these data sources. A total of 359 industry and occupation responses were hand coded by 2 investigators, who came to a consensus on every code. The same variables were autocoded using NIOCCS at the high and moderate confidence levels. Results Kappa was .84 for agreement between hand coders, and for the hand-coder consensus code versus NIOCCS high-confidence codes, for the first 2 digits of the SOC code. For 4 digits, NIOCCS coding versus investigator coding ranged from kappa=.56 to .70. In this study, NIOCCS was able to autocode 31%-36% of entered variables at the "high confidence" level and 49%-58% at the "medium confidence" level. Autocoding (production) rates are somewhat lower than those reported by NIOSH. Agreement between manually coded and autocoded data is "substantial" at the 2-digit level, but only
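
    The agreement statistic used here, Cohen's kappa, corrects raw agreement for chance. A small sketch with scikit-learn, using invented 2-digit SOC prefixes:

      from sklearn.metrics import cohen_kappa_score

      # invented 2-digit SOC prefixes: hand-coder consensus vs. autocoder
      hand = ["29", "47", "47", "11", "53", "29", "35", "47", "11", "53"]
      auto = ["29", "47", "49", "11", "53", "29", "35", "47", "13", "53"]
      print(f"kappa = {cohen_kappa_score(hand, auto):.2f}")  # 1.0 = perfect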

  5. Experimental investigation of a 10-percent-thick helicopter rotor airfoil section designed with a viscous transonic analysis code

    NASA Technical Reports Server (NTRS)

    Noonan, K. W.

    1981-01-01

    An investigation was conducted in the Langley 6- by 28-Inch Transonic Tunnel to determine the two-dimensional aerodynamic characteristics of a 10-percent-thick helicopter rotor airfoil at Mach numbers from 0.33 to 0.87 and respective Reynolds numbers from 4.9 x 10^6 to 9.8 x 10^6. This airfoil, designated the RC-10(N)-1, was also investigated at Reynolds numbers from 3.0 x 10^6 to 7.3 x 10^6 at respective Mach numbers of 0.33 to 0.83 for comparison with the SC 1095 (with tab) airfoil. The RC-10(N)-1 airfoil was designed by the use of a viscous transonic analysis code. The results of the investigation indicate that the RC-10(N)-1 airfoil met all the design goals. At a Reynolds number of about 9.4 x 10^6, the drag divergence Mach number at zero normal-force coefficient was 0.815, with a corresponding pitching-moment coefficient of zero. The drag divergence Mach number at a normal-force coefficient of 0.9 and a Reynolds number of about 8.0 x 10^6 was 0.61. The drag divergence Mach number of this new airfoil was higher than that of the SC 1095 airfoil at normal-force coefficients above 0.3. Measurements in the same wind tunnel at comparable Reynolds numbers indicated that the maximum normal-force coefficient of the RC-10(N)-1 airfoil was higher than that of the NACA 0012 airfoil for Mach numbers above about 0.35 and was about the same as that of the SC 1095 airfoil for Mach numbers up to 0.5.

  6. "Sample-Independent" Item Parameters? An Investigation of the Stability of IRT Item Parameters Estimated from Small Data Sets.

    ERIC Educational Resources Information Center

    Sireci, Stephen G.

    Whether item response theory (IRT) is useful to the small-scale testing practitioner is examined. The stability of IRT item parameters is evaluated with respect to the classical item parameters (i.e., p-values, biserials) obtained from the same data set. Previous research investigating the effect of sample size on IRT parameter estimation has…
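
    For context, the item parameters at issue are those of IRT models such as the two-parameter logistic (2PL), in which a discrimination parameter a and a difficulty parameter b determine the probability of a correct response at ability theta; unstable small-sample estimates of a or b visibly shift these probabilities. A minimal sketch:

      import math

      def p_correct_2pl(theta, a, b):
          # 2PL item response function: P(correct | ability theta).
          return 1.0 / (1.0 + math.exp(-a * (theta - b)))

      for theta in (-1.0, 0.0, 1.0):
          stable = p_correct_2pl(theta, a=1.2, b=0.3)
          shifted = p_correct_2pl(theta, a=1.2, b=0.5)  # small-sample drift in b
          print(f"theta {theta:+.1f}: {stable:.3f} vs {shifted:.3f}")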

  7. Independent Technical Investigation of the Puna Geothermal Venture Unplanned Steam Release, June 12 and 13, 1991, Puna, Hawaii

    SciTech Connect

    Thomas, Richard; Whiting, Dick; Moore, James; Milner, Duey

    1991-07-01

    On June 24, 1991, a third-party investigation team consisting of Richard P. Thomas, Duey E. Milner, James L. Moore, and Dick Whiting began an investigation into the blowout of well KS-8, which occurred at the Puna Geothermal Venture (PGV) site on June 12, 1991, and caused the unabated release of steam for a period of 31 hours before PGV succeeded in closing in the well. The scope of the investigation was to: (a) determine the cause(s) of the incident; (b) evaluate the adequacy of PGV's drilling and blowout prevention equipment and procedures; and (c) make recommendations for any appropriate changes in equipment and/or procedures. This report finds that the blowout occurred because of inadequacies in PGV's drilling plan and procedures and not as a result of unusual or unmanageable subsurface geologic or hydrologic conditions. While the geothermal resource in the area being drilled is relatively hot, the temperatures are not beyond the ability of modern technology and methods to control. The fluid pressures encountered are also manageable if proper procedures are followed and the appropriate equipment is utilized. A previous blowout of short duration occurred on February 21, 1991, at the KS-7 injection well being drilled by PGV at a depth of approximately 1600'. This unexpected incident alerted PGV to the possibility of encountering a high-temperature, fractured zone at a relatively shallow depth. The experience at KS-7 prompted PGV to refine its hydrological model; however, the drilling plan utilized for KS-8 was not changed. Not only did PGV fail to modify its drilling program following the KS-7 blowout, but they also failed to heed numerous "red flags" (warning signals) in the five days preceding the KS-8 blowout, which included a continuous 1-inch flow of drilling mud out of the wellbore, gains in mud volume while pulling stands, and gas entries while circulating mud bottoms up, in addition to lost circulation that had occurred earlier below the shoe of the 13-3/8-inch casing.

  8. DNA codes

    SciTech Connect

    Torney, D. C.

    2001-01-01

    across the two blocks. For the foregoing reasons, these two blocks of codewords suffice as the hooks and loops of a digital Velcro. We began our investigations of such codes by constructing quaternary BCH reverse-complement codes, using cyclic codes and conventional Hamming distance [4]. We also obtained upper and lower bounds on the rate of reverse-complement codes with a metric function based on the foregoing similarities [3]. For most applications involving DNA, however, the reverse-complementary analogue of codes based on the insertion-deletion distance is more advantageous. This distance equals the codeword length minus the longest length of a common (not necessarily contiguous) subsequence. (The "aligned" codes described above may be used under special experimental conditions.) The advantage arises because, under the assumption that DNA is very flexible, the sharing of sufficiently long subsequences between codewords would be tantamount to the ability of one of their reverse complements to form a double strand with the other codeword. Thus far, using the random coding method, we have derived an asymptotic lower bound on the rate of reverse-complement insertion-deletion codes, as a function of the insertion-deletion distance fraction and of the alphabet size [1]. For the quaternary DNA alphabet of primary importance, this lower bound yields an asymptotically positive rate if the insertion-deletion-distance fraction does not exceed the threshold of approximately 0.19. Extensions of the Varshamov-Tenengol'ts construction of insertion-deletion codes [5] for reverse-complement insertion-deletion codes will be described. Experiments have been performed involving some of our DNA codes.
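
    The insertion-deletion distance defined above (codeword length minus the length of the longest common subsequence) and the reverse-complement operation are straightforward to state in code; a self-contained sketch with made-up codewords:

      def lcs_len(a, b):
          # Longest (not necessarily contiguous) common subsequence length.
          dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
          for i, x in enumerate(a):
              for j, y in enumerate(b):
                  dp[i + 1][j + 1] = dp[i][j] + 1 if x == y else max(dp[i][j + 1], dp[i + 1][j])
          return dp[-1][-1]

      def indel_distance(a, b):
          # Codeword length minus LCS length (equal-length codewords).
          return len(a) - lcs_len(a, b)

      COMPLEMENT = str.maketrans("ACGT", "TGCA")

      def revcomp(s):
          return s.translate(COMPLEMENT)[::-1]

      w1, w2 = "ACGTACGT", "ACGTTGCA"   # made-up codewords
      print(indel_distance(w1, w2), indel_distance(w1, revcomp(w2)))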

  9. Speech coding

    SciTech Connect

    Ravishankar, C., Hughes Network Systems, Germantown, MD

    1998-05-08

    Speech is the predominant means of communication between human beings and, since the invention of the telephone by Alexander Graham Bell in 1876, speech services have remained the core service in almost all telecommunication systems. Original analog methods of telephony had the disadvantage that the speech signal was corrupted by noise, cross-talk, and distortion. Long-haul transmissions, which use repeaters to compensate for the loss in signal strength on transmission links, also increase the associated noise and distortion. Digital transmission, on the other hand, is relatively immune to noise, cross-talk, and distortion, primarily because of the capability to faithfully regenerate the digital signal at each repeater purely on the basis of a binary decision. Hence, the end-to-end performance of the digital link becomes essentially independent of the length and operating frequency bands of the link, and from a transmission point of view digital transmission has been the preferred approach due to its higher immunity to noise. The need to carry digital speech became extremely important from a service-provision point of view as well. Modern requirements have introduced the need for robust, flexible, and secure services that can carry a multitude of signal types (such as voice, data, and video) without a fundamental change in infrastructure. Such a requirement could not have been easily met without the advent of digital transmission systems, thereby requiring speech to be coded digitally. The term speech coding often refers to techniques that represent or code speech signals either directly as a waveform or as a set of parameters obtained by analyzing the speech signal. In either case, the codes are transmitted to the distant end, where speech is reconstructed or synthesized using the received set of codes. A more generic term that is often used interchangeably with speech coding is voice coding. This term is more generic in the sense that the
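
    As a concrete instance of waveform coding in the sense described above, standard mu-law companding (used in 8-bit telephone PCM) compresses amplitudes before quantization so that quantization noise is smaller for the quiet samples that dominate speech. A minimal sketch:

      import numpy as np

      MU = 255.0  # standard mu value for 8-bit telephone PCM

      def mulaw_encode(x):
          # Compress samples in [-1, 1] with the mu-law characteristic.
          return np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)

      def mulaw_decode(y):
          # Inverse mu-law expansion.
          return np.sign(y) * ((1.0 + MU) ** np.abs(y) - 1.0) / MU

      t = np.linspace(0.0, 0.02, 160)
      frame = 0.3 * np.sin(2 * np.pi * 440 * t)               # stand-in speech frame
      quantized = np.round(mulaw_encode(frame) * 127) / 127   # 8-bit-style levels
      print(np.max(np.abs(mulaw_decode(quantized) - frame)))  # small residual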

  10. Binary primitive alternant codes

    NASA Technical Reports Server (NTRS)

    Helgert, H. J.

    1975-01-01

    In this note we investigate the properties of two classes of binary primitive alternant codes that are generalizations of the primitive BCH codes. For these codes we establish certain equivalence and invariance relations and obtain values of d and d*, the minimum distances of the prime and dual codes.

  11. Inter-Sentential Patterns of Code-Switching: A Gender-Based Investigation of Male and Female EFL Teachers

    ERIC Educational Resources Information Center

    Gulzar, Malik Ajmal; Farooq, Muhammad Umar; Umer, Muhammad

    2013-01-01

    This article has sought to contribute to discussions concerning the value of inter-sentential patterns of code-switching (henceforth ISPCS) particularly in the context of EFL classrooms. Through a detailed analysis of recorded data produced in that context, distinctive features in the discourse were discerned which were associated with males' and…

  12. Are Independent Probes Truly Independent?

    ERIC Educational Resources Information Center

    Camp, Gino; Pecher, Diane; Schmidt, Henk G.; Zeelenberg, Rene

    2009-01-01

    The independent cue technique has been developed to test traditional interference theories against inhibition theories of forgetting. In the present study, the authors tested the critical criterion for the independence of independent cues: Studied cues not presented during test (and unrelated to test cues) should not contribute to the retrieval…

  13. Supporting the Cybercrime Investigation Process: Effective Discrimination of Source Code Authors Based on Byte-Level Information

    NASA Astrophysics Data System (ADS)

    Frantzeskou, Georgia; Stamatatos, Efstathios; Gritzalis, Stefanos

    Source code authorship analysis is the particular field that attempts to identify the author of a computer program by treating each program as a linguistically analyzable entity. This is usually based on other undisputed program samples from the same author. There are several cases where the application of such a method could be of major benefit, such as tracing the source of code left in the system after a cyber attack, authorship disputes, proof of authorship in court, etc. In this paper, we present our approach, which is based on byte-level n-gram profiles and is an extension of a method that has been successfully applied to natural language text authorship attribution. We propose a simplified profile and a new similarity measure which is less complicated than the algorithm followed in text authorship attribution and seems more suitable for source code identification, since it is better able to deal with very small training sets. Experiments were performed on two different data sets, one with programs written in C++ and the second with programs written in Java. Unlike the traditional language-dependent metrics used by previous studies, our approach can be applied to any programming language with no additional cost. The presented accuracy rates are much better than the best reported results for the same data sets.
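
    A minimal sketch of the byte-level approach described above: build a simplified profile (the most frequent byte n-grams) for each code sample and compare profiles with a set-intersection similarity, in the spirit of the authors' simplified profile intersection. The n-gram length, profile size, and code snippets are placeholders.

      from collections import Counter

      def profile(source, n=3, top=1500):
          # Simplified profile: the `top` most frequent byte n-grams.
          grams = Counter(source[i:i + n] for i in range(len(source) - n + 1))
          return {g for g, _ in grams.most_common(top)}

      def similarity(p_known, p_unknown):
          # Size of the profile intersection, normalized for comparability.
          return len(p_known & p_unknown) / max(len(p_unknown), 1)

      known = b"for (int i = 0; i < n; ++i) { sum += v[i]; }"      # stand-in samples
      disputed = b"for (int j = 0; j < m; ++j) { total += w[j]; }"
      print(f"profile similarity: {similarity(profile(known), profile(disputed)):.3f}")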

  14. Parallelization of the SIR code for the investigation of small-scale features in the solar photosphere

    NASA Astrophysics Data System (ADS)

    Thonhofer, Stefan; Rubio, Luis R. Bellot; Utz, Dominik; Hanslmeier, Arnold; Jurčák, Jan

    2015-10-01

    Magnetic fields are one of the most important drivers of the highly dynamic processes that occur in the lower solar atmosphere. They span a broad range of sizes, from large- and intermediate-scale structures such as sunspots, pores and magnetic knots, down to the smallest magnetic elements observable with current telescopes. On small scales, magnetic flux tubes are often visible as Magnetic Bright Points (MBPs). Apart from simple V/I magnetograms, the most common method to deduce their magnetic properties is the inversion of spectropolarimetric data. Here we employ the SIR code for that purpose. SIR is a well-established tool that can derive not only the magnetic field vector and other atmospheric parameters (e.g., temperature, line-of-sight velocity), but also their stratifications with height, effectively producing 3-dimensional models of the lower solar atmosphere. In order to enhance the runtime performance and the usability of SIR we parallelized the existing code and standardized the input and output formats. This and other improvements make it feasible to invert extensive high-resolution data sets within a reasonable amount of computing time. An evaluation of the speedup of the parallel SIR code shows a substantial improvement in runtime.
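
    The parallelization described here distributes many independent per-pixel inversions over workers. A generic sketch of that pattern with Python's multiprocessing; invert_pixel and the profile array are hypothetical placeholders, not the actual SIR interface:

      import numpy as np
      from multiprocessing import Pool

      def invert_pixel(stokes_profiles):
          # Placeholder for one SIR-style inversion: fit atmospheric parameters
          # (e.g., field strength, LOS velocity) to one pixel's Stokes profiles.
          return float(np.sum(stokes_profiles)), float(np.mean(stokes_profiles))

      if __name__ == "__main__":
          ny, nx, nwav = 64, 64, 100
          cube = np.random.default_rng(0).normal(size=(ny * nx, nwav))  # fake map
          with Pool() as pool:                       # one worker per CPU core
              results = pool.map(invert_pixel, cube, chunksize=64)
          params = np.array(results).reshape(ny, nx, 2)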

  15. Multigenerational Independent Colony for Extraterrestrial Habitation, Autonomy, and Behavior Health (MICEHAB): An Investigation of a Long Duration, Partial Gravity, Autonomous Rodent Colony

    NASA Technical Reports Server (NTRS)

    Rodgers, Erica M.; Simon, Matthew A.; Antol, Jeffrey; Chai, Patrick R.; Jones, Christopher A.; Klovstad, Jordan J.; Neilan, James H.; Stillwagen, Frederic H.; Williams, Phillip A.; Bednara, Michael; Guendel, Alex; Hernandez, Joel; Lewis, Weston; Lim, Jeremy; Wilson, Logan; Wusk, Grace

    2015-01-01

    The path from Earth to Mars requires exploration missions to be increasingly Earth-independent as the foundation is laid for a sustained human presence in the following decades. NASA pioneering of Mars will expand the boundaries of human exploration, as a sustainable presence on the surface requires humans to successfully reproduce in a partial gravity environment, independent of Earth intervention. Before significant investment is made in capabilities leading to such pioneering efforts, the challenges of multigenerational mammalian reproduction in a partial gravity environment need to be investigated. The Multigenerational Independent Colony for Extraterrestrial Habitation, Autonomy, and Behavior Health (MICEHAB) is designed to study these challenges. The proposed concept is a long-duration, autonomous habitat designed to house rodents in a partial gravity environment, with the goal of understanding the effects of partial gravity on mammalian reproduction over multiple generations and how to effectively design such a facility to operate autonomously while keeping the rodents healthy enough to achieve multiple generations. All systems are designed to feed forward directly to full-scale human missions to Mars. This paper presents the baseline design concept formulated after considering challenges in the mission and vehicle architectures, such as vehicle automation, automated crew health management/medical care, unique automated waste disposal and hygiene, handling of deceased crew members, reliable long-duration crew support systems, and radiation protection. This concept was selected from an architectural trade space considering the balance between mission science return and robotic and autonomy capabilities. The baseline design is described in detail, including transportation and facility operation constraints, artificial gravity system design, habitat design, and a full-scale mock-up demonstration of autonomous rodent care facilities. The proposed concept has

  16. Investigating mitochondrial metabolism in contracting HL-1 cardiomyocytes following hypoxia and pharmacological HIF activation identifies HIF-dependent and independent mechanisms of regulation.

    PubMed

    Ambrose, Lucy J A; Abd-Jamil, Amira H; Gomes, Renata S M; Carter, Emma E; Carr, Carolyn A; Clarke, Kieran; Heather, Lisa C

    2014-11-01

    Hypoxia is a consequence of cardiac disease and downregulates mitochondrial metabolism, yet the molecular mechanisms through which this occurs in the heart are incompletely characterized. Therefore, we aimed to use a contracting HL-1 cardiomyocyte model to investigate the effects of hypoxia on mitochondrial metabolism. Cells were exposed to hypoxia (2% O2) for 6, 12, 24, and 48 hours to characterize the metabolic response. Cells were subsequently treated with the hypoxia inducible factor (HIF)-activating compound, dimethyloxalylglycine (DMOG), to determine whether hypoxia-induced mitochondrial changes were HIF dependent or independent, and to assess the suitability of this cultured cardiac cell line for cardiovascular pharmacological studies. Hypoxic cells had increased glycolysis after 24 hours, with glucose transporter 1 and lactate levels increased 5-fold and 15-fold, respectively. After 24 hours of hypoxia, mitochondrial networks were more fragmented but there was no change in citrate synthase activity, indicating that mitochondrial content was unchanged. Cellular oxygen consumption was 30% lower, accompanied by decreases in the enzymatic activities of electron transport chain (ETC) complexes I and IV, and aconitase by 81%, 96%, and 72%, relative to controls. Pharmacological HIF activation with DMOG decreased cellular oxygen consumption by 43%, coincident with decreases in the activities of aconitase and complex I by 26% and 30%, indicating that these adaptations were HIF mediated. In contrast, the hypoxia-mediated decrease in complex IV activity was not replicated by DMOG treatment, suggesting HIF-independent regulation of this complex. In conclusion, 24 hours of hypoxia increased anaerobic glycolysis and decreased mitochondrial respiration, which was associated with changes in ETC and tricarboxylic acid cycle enzyme activities in contracting HL-1 cells. Pharmacological HIF activation in this cardiac cell line allowed both HIF-dependent and independent

  17. An investigation for population maintenance mechanism in a miniature garden: genetic connectivity or independence of small islet populations of the Ryukyu five-lined skink.

    PubMed

    Kurita, Kazuki; Hikida, Tsutomu; Toda, Mamoru

    2014-01-01

    The Ryukyu five-lined skink (Plestiodon marginatus) is an island lizard that is found even in tiny islets with less than half a hectare of habitat area. We hypothesized that the island populations are maintained either by frequent gene flow among the islands or independently of each other. To test these hypotheses, we investigated the genetic structure of 21 populations from 11 land-bridge islands that were connected during the latest glacial age, and from 4 isolated islands. Analyses using mitochondrial cytochrome b gene sequences (n = 67) and 10 microsatellite loci (n = 235) revealed moderate to high levels of genetic differentiation, the existence of many private alleles/haplotypes on most islands, little contemporary migration, a positive correlation between genetic variability and island area, and a negative correlation between relatedness and island area. This evidence suggests a strong effect of independent genetic drift as opposed to gene flow, favoring the isolation hypothesis even for tiny islet populations. An isolation-by-distance effect was demonstrated, and it became more prominent when the 4 isolated islands were excluded, suggesting that the pattern is a remnant of the land-bridge age. In a few island populations, however, the possibility of occasional overwater dispersals was partially supported and therefore could not be ruled out. PMID:25189776

  18. Microbial diversity and dynamics throughout manufacturing and ripening of surface ripened semi-hard Danish Danbo cheeses investigated by culture-independent techniques.

    PubMed

    Ryssel, Mia; Johansen, Pernille; Al-Soud, Waleed Abu; Sørensen, Søren; Arneborg, Nils; Jespersen, Lene

    2015-12-23

    Microbial successions on the surface and in the interior of surface ripened semi-hard Danish Danbo cheeses were investigated by culture-dependent and -independent techniques. Culture-independent detection of microorganisms was obtained by denaturing gradient gel electrophoresis (DGGE) and pyrosequencing, using amplicons of 16S and 26S rRNA genes for prokaryotes and eukaryotes, respectively. With minor exceptions, the results from the culture-independent analyses correlated to the culture-dependent plating results. Even though the predominant microorganisms detected with the two culture-independent techniques correlated, a higher number of genera were detected by pyrosequencing compared to DGGE. Additionally, minor parts of the microbiota, i.e. comprising <10.0% of the operational taxonomic units (OTUs), were detected by pyrosequencing, resulting in more detailed information on the microbial succession. As expected, microbial profiles of the surface and the interior of the cheeses diverged. During cheese production pyrosequencing determined Lactococcus as the dominating genus on cheese surfaces, representing on average 94.7%±2.1% of the OTUs. At day 6 Lactococcus spp. declined to 10.0% of the OTUs, whereas Staphylococcus spp. went from 0.0% during cheese production to 75.5% of the OTUs at smearing. During ripening, i.e. from 4 to 18 weeks, Corynebacterium was the dominant genus on the cheese surface (55.1%±9.8% of the OTUs), with Staphylococcus (17.9%±11.2% of the OTUs) and Brevibacterium (10.4%±8.3% of the OTUs) being the second and third most abundant genera. Other detected bacterial genera included Clostridiisalibacter (5.0%±4.0% of the OTUs), as well as Pseudoclavibacter, Alkalibacterium and Marinilactibacillus, which represented <2% of the OTUs. At smearing, yeast counts were low with Debaryomyces being the dominant genus accounting for 46.5% of the OTUs. During ripening the yeast counts increased significantly with Debaryomyces being the predominant genus

  19. Environmental health and safety independent investigation of the in situ vitrification melt expulsion at the Oak Ridge National Laboratory, Oak Ridge, Tennessee

    SciTech Connect

    1996-07-01

    At about 6:12 pm EDT on April 21, 1996, steam and molten material were expelled from the Pit 1 in situ vitrification (ISV) project at the Oak Ridge National Laboratory (ORNL). At the request of the director of the Environmental Restoration (ER) Division, Department of Energy Oak Ridge Operations (DOE ORO), an independent investigation team was established on April 26, 1996. This team was tasked to determine the facts related to the ORNL Pit 1 melt expulsion event (MEE) in the areas of environmental safety and health, such as the adequacy of the ISV safety systems; operational control restrictions; emergency response planning/execution; and readiness review, and to report the investigation team's findings within 45 days from the date of the incident. These requirements were stated in the letter of appointment presented in Appendix A of this report. This investigation did not address the physical causes of the MEE. A separate investigation was conducted by ISV project personnel to determine the causes of the melt expulsion and the extent of the effects of this phenomenon. In response to this event, occurrence report ORO-LMES-X10ENVRES-1996-0006 (Appendix B) was filed. The investigation team did not address the occurrence reporting or event notification process. The project personnel (project team) examined the physical evidence at the Pit 1 ISV site (e.g., the ejected melt material and the ISV hood), reviewed documents such as the site-specific health and safety plan (HASP), and interviewed personnel involved in the event and/or the project. A listing of the personnel interviewed and evidence reviewed is provided in Appendix C.

  20. Investigation of island formation due to RMPs in DIII-D plasmas with the SIESTA resistive MHD equilibrium code

    NASA Astrophysics Data System (ADS)

    Hirshman, S. P.; Shafer, M. W.; Seal, S. K.; Canik, J. M.

    2016-04-01

    The SIESTA magnetohydrodynamic (MHD) equilibrium code has been used to compute a sequence of ideally stable equilibria resulting from numerical variation of the helical resonant magnetic perturbation (RMP) applied to an axisymmetric DIII-D plasma equilibrium. Increasing the perturbation strength at the dominant resonant surface leads to lower MHD energies and to increases in the equilibrium island widths at that surface and its sidebands, in agreement with theoretical expectations. Island overlap at large perturbation strengths leads to stochastic magnetic fields which correlate well with the experimentally inferred field structure. The magnitude and spatial phase (around the dominant rational surfaces) of the resonant (shielding) component of the parallel current are shown to change qualitatively with the magnetic island topology.

  1. Salam's independence

    NASA Astrophysics Data System (ADS)

    Fraser, Gordon

    2009-01-01

    In his kind review of my biography of the Nobel laureate Abdus Salam (December 2008 pp45-46), John W Moffat wrongly claims that Salam had "independently thought of the idea of parity violation in weak interactions".

  2. Independence of Internal Auditors.

    ERIC Educational Resources Information Center

    Montondon, Lucille; Meixner, Wilda F.

    1993-01-01

    A survey of 288 college and university auditors investigated patterns in their appointment, reporting, and supervisory practices as indicators of independence and objectivity. Results indicate a weakness in the positioning of internal auditing within institutions, possibly compromising auditor independence. Because the auditing function is…

  3. MORSE Monte Carlo code

    SciTech Connect

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  4. Tokamak Systems Code

    SciTech Connect

    Reid, R.L.; Barrett, R.J.; Brown, T.G.; Gorker, G.E.; Hooper, R.J.; Kalsi, S.S.; Metzler, D.H.; Peng, Y.K.M.; Roth, K.E.; Spampinato, P.T.

    1985-03-01

    The FEDC Tokamak Systems Code calculates tokamak performance, cost, and configuration as a function of plasma engineering parameters. This version of the code models experimental tokamaks. It does not currently consider tokamak configurations that generate electrical power or incorporate breeding blankets. The code has a modular (or subroutine) structure to allow independent modeling for each major tokamak component or system. A primary benefit of modularization is that a component module may be updated without disturbing the remainder of the systems code as long as the input to or output from the module remains unchanged.

  5. 3D PiC code simulations for a laboratory experimental investigation of Auroral Kilometric Radiation mechanisms

    NASA Astrophysics Data System (ADS)

    Gillespie, K. M.; Speirs, D. C.; Ronald, K.; McConville, S. L.; Phelps, A. D. R.; Bingham, R.; Cross, A. W.; Robertson, C. W.; Whyte, C. G.; He, W.; Vorgul, I.; Cairns, R. A.; Kellett, B. J.

    2008-12-01

    Auroral Kilometric Radiation (AKR) occurs naturally in the polar regions of the Earth's magnetosphere, where electrons are accelerated by electric fields into the increasing planetary magnetic dipole. Here, conservation of the magnetic moment converts axial to rotational momentum, forming a horseshoe distribution in velocity phase space. This distribution is unstable to cyclotron emission, with radiation emitted in the X-mode. In a scaled laboratory reproduction of this process, a 75-85 keV electron beam of 5-40 A was magnetically compressed by a system of solenoids, and emissions were observed at cyclotron frequencies of 4.42 GHz and 11.7 GHz, resonating with near-cutoff TE0,1 and TE0,3 modes, respectively. Here we compare these measurements with numerical predictions from the 3D PiC code KARAT. The 3D simulations accurately predicted the radiation modes and frequencies produced by the experiment. The predicted conversion efficiency between electron kinetic and wave field energy of around 1% is close to the experimental measurements and broadly consistent with quasi-linear theoretical analysis and geophysical observations.
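
    The quoted cyclotron frequencies follow from f_ce = eB/(2*pi*gamma*m_e), with gamma set by the beam energy; a small numeric check (the field value is chosen for illustration only):

      import math

      E, M_E, REST_KEV = 1.602176634e-19, 9.1093837015e-31, 510.99895

      def cyclotron_ghz(b_tesla, kinetic_kev):
          gamma = 1.0 + kinetic_kev / REST_KEV   # relativistic mass increase
          return E * b_tesla / (2 * math.pi * gamma * M_E) / 1e9

      print(cyclotron_ghz(0.18, 80.0))  # hypothetical field: ~4.4 GHz regime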

  6. Evaluation of EPICOR-II Resin/Liner lysimeter investigation data using "MIXBATH", a one-dimensional transport code

    SciTech Connect

    McConnell, J.W.; Rogers, R.D.; Brey, R.R.; Sullivan, T.M.

    1992-08-01

    The computer code MIXBATH has been applied to compare model predictions with six years of leachate collection data from five lysimeters located at Oak Ridge and five located at Argonne National Laboratories. The goal of this study was to critique the applicability of these data for use as a basis for the long-term prediction of release and transport of radionuclides contained in Portland type I-II cement and Dow vinyl ester-styrene waste forms loaded with EPICOR-II prefilter ion exchange resins. MIXBATH was useful in providing insight into information needs for long-term performance assessment. In most cases, the total activity released from the lysimeters over the test period was indistinguishable from background, indicating a need for longer-term data collection. In cases where there was both sufficient information available and activity released, MIXBATH was able to predict releases within an order of magnitude of those measured. Releases are extremely sensitive to the soil partition coefficient and waste form diffusion coefficient, and these were identified as the key data needs for long-term performance assessment.
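
    The sensitivity to the waste-form diffusion coefficient noted above can be seen in the standard semi-infinite-medium leaching approximation, in which the cumulative fraction released grows with the square root of time; a toy sketch with invented parameters:

      import math

      def cumulative_fraction_released(de, t, s_over_v):
          # Semi-infinite diffusion release: CFR = (S/V) * 2 * sqrt(De*t/pi),
          # valid while CFR is well below 1.
          return s_over_v * 2.0 * math.sqrt(de * t / math.pi)

      YEAR = 3.156e7  # seconds
      for de in (1e-9, 1e-11, 1e-13):   # invented diffusion coefficients, cm^2/s
          print(de, cumulative_fraction_released(de, 6 * YEAR, s_over_v=1.0))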

  7. Clinical coding. Code breakers.

    PubMed

    Mathieson, Steve

    2005-02-24

    --The advent of payment by results has seen the role of the clinical coder pushed to the fore in England. --Examinations for a clinical coding qualification began in 1999. In 2004, approximately 200 people took the qualification. --Trusts are attracting people to the role by offering training from scratch or through modern apprenticeships. PMID:15768716

  8. Prognostic investigations of B7-H1 and B7-H4 expression levels as independent predictor markers of renal cell carcinoma.

    PubMed

    Safaei, Hamid Reza; Rostamzadeh, Ayoob; Rahmani, Omid; Mohammadi, Mohsen; Ghaderi, Omar; Yahaghi, Hamid; Ahmadi, Koroosh

    2016-06-01

    In order to evaluate the correlation of B7-H4 and B7-H1 with renal cell carcinoma (RCC), we analyzed B7-H1 and B7-H4 expression and their clinical significance by an immunohistochemical method. Our results indicated that B7-H4-positive staining was detected in 58.13% of RCC tissues (25 tumor tissues), while 18 patients' tissues had no detectable B7-H4. Furthermore, 21 cases (48.83%) were B7-H1-positive. Positive tumor expression of B7-H4 and B7-H1 was markedly related to advanced TNM stage (P = 0.001; P = 0.014), high grade (P = 0.001; P = 0.002), and larger tumor size (P = 0.002; P = 0.024), compared with B7-H4-negative and B7-H1-negative RCC tissues. B7-H1- and B7-H4-positive expression was markedly correlated with the overall survival of the patients (P < 0.05), and these patients tended to have an increased risk of death when compared with the negative-expression groups. Univariate analysis showed that B7-H4 and B7-H1 expression, TNM stage, high grade, and tumor size were significantly related to the prognosis of RCC. Furthermore, multivariate analysis showed that B7-H4 and B7-H1 expression decreased overall survival. The adjusted HR for B7-H1 was 2.83 (95% CI 1.210-2.971; P = 0.031) and that for B7-H4 was 2.918 (95% CI 1.243-3.102; P = 0.006), showing that these markers were independent prognostic factors in RCC patients. The expression of B7-H1 and B7-H4 in RCC patients indicates that these markers may serve as predictors of tumor development and death risk. Further investigations would be helpful to confirm the roles of B7-H1 and B7-H4 as independent predictors of clinical RCC outcome. PMID:26687644
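
    The adjusted hazard ratios above come from a multivariate proportional-hazards fit. The sketch below shows that analysis with the third-party lifelines package on a synthetic data frame, not the study's data:

      import numpy as np
      import pandas as pd
      from lifelines import CoxPHFitter

      rng = np.random.default_rng(0)
      n = 120
      df = pd.DataFrame({
          "months": rng.exponential(30, n),    # follow-up time
          "death": rng.integers(0, 2, n),      # 1 = death observed
          "b7_h1_pos": rng.integers(0, 2, n),  # hypothetical marker status
          "b7_h4_pos": rng.integers(0, 2, n),
          "tnm_stage": rng.integers(1, 5, n),
      })

      cph = CoxPHFitter()
      cph.fit(df, duration_col="months", event_col="death")
      print(cph.summary[["exp(coef)", "p"]])   # exp(coef) = adjusted HR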

  10. Numerical investigations on pressurized AL-composite vessel response to hypervelocity impacts: Comparison between experimental works and a numerical code

    NASA Astrophysics Data System (ADS)

    Mespoulet, Jérôme; Plassard, Fabien; Hereil, Pierre-Louis

    2015-09-01

    The response of pressurized composite-Al vessels to the hypervelocity impact of aluminum spheres has been numerically investigated to evaluate the influence of initial pressure on the vulnerability of these vessels. The investigated tanks are carbon-fiber-overwrapped prestressed Al vessels. The explored internal air pressure ranges from 1 bar to 300 bar, and impact velocities are around 4400 m/s. Data obtained from experiments (X-ray radiographs, particle velocity measurements, and post-mortem vessels) have been compared with numerical results from LS-DYNA ALE-Lagrange-SPH fully coupled models. The simulations underestimate the debris cloud evolution and the shock wave propagation in pressurized air, but the main modes of damage/rupture of the vessels given by the simulations are consistent with the post-mortem vessels recovered from the experiments. The first results of this numerical work are promising, and further simulation investigations with additional experimental data will be performed to increase the reliability of the simulation model. The final aim of this combined work is to numerically explore a wide range of impact conditions (impact angle, projectile mass, impact velocity, initial pressure) that cannot be explored experimentally. These results will then provide rules of thumb for the definition of a vulnerability analytical model for a given pressurized vessel.

  11. On the Minimum Weight of Simple Full-Length Array LDPC Codes

    NASA Astrophysics Data System (ADS)

    Sugiyama, Kenji; Kaji, Yuichi

    We investigate the minimum weights of simple full-length array LDPC codes (SFA-LDPC codes). The SFA-LDPC codes are a subclass of LDPC codes constructed algebraically according to two integer parameters p and j. Mittelholzer and Yang et al. have studied the minimum weights of SFA-LDPC codes, but the exact minimum weights of the codes are not known except for some small p and j. In this paper, we show that the minimum weights of the SFA-LDPC codes with j=4 and j=5 are upper-bounded by 10 and 12, respectively, independently of the prime number p. Combining these results with Yang's lower bounds, we can conclude that the minimum weights of the SFA-LDPC codes with j=4 and p>7 are exactly 10, and those of the SFA-LDPC codes with j=5 are 10 or 12.
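
    For context, the parity-check matrix of these codes is commonly defined as a j x p array of p x p circulant permutation matrices P^(ik), with P the single-step cyclic shift and p prime. A minimal sketch of that standard construction follows (the paper may use an equivalent variant):

      import numpy as np

      def array_ldpc_parity_check(p, j):
          """Parity-check matrix of a simple full-length array LDPC code:
          a j x p array of p x p circulants P^(i*k), p prime, P a cyclic shift."""
          P = np.roll(np.eye(p, dtype=np.uint8), 1, axis=1)  # one-step cyclic shift
          blocks = [[np.linalg.matrix_power(P, (i * k) % p) for k in range(p)]
                    for i in range(j)]
          return np.block(blocks)

      H = array_ldpc_parity_check(p=11, j=4)                   # shape (j*p, p*p)
      print(H.shape, "column weight:", int(H.sum(axis=0)[0]))  # every column has weight j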

  12. System and method for investigating sub-surface features of a rock formation with acoustic sources generating coded signals

    SciTech Connect

    Vu, Cung Khac; Nihei, Kurt; Johnson, Paul A; Guyer, Robert; Ten Cate, James A; Le Bas, Pierre-Yves; Larmat, Carene S

    2014-12-30

    A system and a method for investigating rock formations include generating, by a first acoustic source, a first acoustic signal comprising a first plurality of pulses, each pulse including a first modulated signal at a central frequency, and generating, by a second acoustic source, a second acoustic signal comprising a second plurality of pulses. A receiver arranged within a borehole receives a detected signal that includes a signal generated by a non-linear mixing process from the first and second acoustic signals in a non-linear mixing zone within the intersection volume. The method also includes processing the received signal to extract the signal generated by the non-linear mixing process over noise or over signals generated by a linear interaction process, or both.
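
    A minimal sketch of what such a coded source signal might look like, a train of pulses each carrying a modulated signal at a central frequency, is given below. The Gaussian pulse shape, the timing, and the frequencies are illustrative assumptions and are not taken from the patent.

      import numpy as np

      def coded_pulse_train(fs, f_central, spacing, n_pulses, width):
          """Train of Gaussian-windowed sinusoidal pulses at a central frequency."""
          t = np.arange(0.0, n_pulses * spacing, 1.0 / fs)
          sig = np.zeros_like(t)
          for m in range(n_pulses):
              t0 = (m + 0.5) * spacing                       # pulse center
              env = np.exp(-0.5 * ((t - t0) / width) ** 2)   # Gaussian envelope
              sig += env * np.sin(2 * np.pi * f_central * (t - t0))
          return t, sig

      # Two sources at different central frequencies; their non-linear mixing in
      # the rock would generate sum/difference-frequency components (not modeled).
      t, s1 = coded_pulse_train(fs=1e6, f_central=5e4, spacing=2e-3, n_pulses=8, width=1e-4)
      _, s2 = coded_pulse_train(fs=1e6, f_central=8e4, spacing=2e-3, n_pulses=8, width=1e-4)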

  13. Understanding independence

    NASA Astrophysics Data System (ADS)

    Annan, James; Hargreaves, Julia

    2016-04-01

    In order to perform any Bayesian processing of a model ensemble, we need a prior over the ensemble members. In the case of multimodel ensembles such as CMIP, the historical approach of "model democracy" (i.e. equal weight for all models in the sample) is no longer credible (if it ever was) due to model duplication and inbreeding. The question of "model independence" is central to the question of prior weights. However, although this question has been repeatedly raised, it has not yet been satisfactorily addressed. Here I will discuss the issue of independence and present a theoretical foundation for understanding and analysing the ensemble in this context. I will also present some simple examples showing how these ideas may be applied and developed.

  14. Application of a multi-block CFD code to investigate the impact of geometry modeling on centrifugal compressor flow field predictions

    SciTech Connect

    Hathaway, M.D.; Wood, J.R.

    1997-10-01

    CFD codes capable of utilizing multi-block grids provide the capability to analyze the complete geometry of centrifugal compressors. Attendant with this increased capability are potentially increased grid setup time and more computational overhead, with a resultant increase in wall clock time to obtain a solution. If the increased difficulty of obtaining a solution significantly improves the solution over one obtained by modeling the features of the tip clearance flow or the typical bluntness of a centrifugal compressor's trailing edge, then the additional burden is worthwhile. However, if the additional information obtained is of marginal use, then modeling certain features of the geometry may provide reasonable solutions for designers to make comparative choices when pursuing a new design. In this spirit, a sequence of grids was generated to study the relative importance of modeling versus detailed gridding of the tip gap and blunt trailing edge regions of the NASA large low-speed centrifugal compressor, for which considerable detailed internal laser anemometry data are available for comparison. The results indicate: (1) There is no significant difference in predicted tip clearance mass flow rate whether the tip gap is gridded or modeled. (2) Gridding rather than modeling the trailing edge results in better predictions of some flow details downstream of the impeller, but otherwise appears to offer no great benefits. (3) The pitchwise variation of absolute flow angle decreases rapidly up to 8% impeller radius ratio and much more slowly thereafter. Although some improvements in prediction of flow field details are realized as a result of analyzing the actual geometry, there is no clear consensus that any of the grids investigated produced superior results in every case when compared with the measurements. However, if a multi-block code is available, it should be used, as it has the propensity to enable better predictions than a single-block code.

  15. Investigations on Sawtooth Reconnection in ASDEX Upgrade Tokamak Discharges Using the 3D Non-linear Two-fluid MHD Code M3D-C1

    NASA Astrophysics Data System (ADS)

    Krebs, Isabel; Jardin, Stephen C.; Igochine, Valentin; Guenter, Sibylle; Hoelzl, Matthias; ASDEX Upgrade Team

    2014-10-01

    We study sawtooth reconnection in ASDEX Upgrade tokamak plasmas by means of 3D non-linear two-fluid MHD simulations in toroidal geometry using the high-order finite element code M3D-C1. The parameters and equilibrium of the simulations are based on typical sawtoothing ASDEX Upgrade discharges. The simulation results are compared to features of the experimental observations, such as the sawtooth crash time and frequency, the evolution of the safety factor profile, and the 3D evolution of the temperature. 2D ECE imaging measurements during sawtooth crashes in ASDEX Upgrade indicate that the heat is transported out of the core through a narrow, poloidally localized region. We also investigate whether incomplete sawtooth reconnection can be seen in the simulations; such incomplete reconnection is suggested by soft X-ray tomography measurements in ASDEX Upgrade, which show that an (m = 1, n = 1) perturbation typically survives the sawtooth crash and approximately maintains its radial position.

  16. Developing Research Skills: Independent Research Projects on Animals and Plants for Building the Research Skills of Report Writing, Mind Mapping, and Investigating through Inquiries. Revised Edition.

    ERIC Educational Resources Information Center

    Banks, Janet Caudill

    This book presents a collection of motivating, independent activities that involve animals and plants for use in developing the research skills of students in grades 2-6. Projects included in the book cover various levels of difficulty and are designed to promote higher-level thinking skills. Research components included in the activities in the…

  17. 'Independence' Panorama

    NASA Technical Reports Server (NTRS)

    2005-01-01


    This is the Spirit 'Independence' panorama, acquired on martian days, or sols, 536 to 543 (July 6 to 13, 2005), from a position in the 'Columbia Hills' near the summit of 'Husband Hill.' The summit of 'Husband Hill' is the peak near the right side of this panorama and is about 100 meters (328 feet) away from the rover and about 30 meters (98 feet) higher in elevation. The rocky outcrops downhill and on the left side of this mosaic include 'Larry's Lookout' and 'Cumberland Ridge,' which Spirit explored in April, May, and June of 2005.

    The panorama spans 360 degrees and consists of 108 individual images, each acquired with five filters of the rover's panoramic camera. The approximate true color of the mosaic was generated using the camera's 750-, 530-, and 480-nanometer filters. During the 8 martian days, or sols, that it took to acquire this image, the lighting varied considerably, partly because of imaging at different times of sol, and partly because of small sol-to-sol variations in the dustiness of the atmosphere. These slight changes produced some image seams and rock shadows. These seams have been eliminated from the sky portion of the mosaic to better simulate the vista a person standing on Mars would see. However, it is often not possible or practical to smooth out such seams for regions of rock, soil, rover tracks or solar panels. Such is the nature of acquiring and assembling large panoramas from the rovers.

  18. An extended version of the SERPENT-2 code to investigate fuel burn-up and core material evolution of the Molten Salt Fast Reactor

    NASA Astrophysics Data System (ADS)

    Aufiero, M.; Cammi, A.; Fiorina, C.; Leppänen, J.; Luzzi, L.; Ricotti, M. E.

    2013-10-01

    In this work, the Monte Carlo burn-up code SERPENT-2 has been extended and employed to study the material isotopic evolution of the Molten Salt Fast Reactor (MSFR). This promising GEN-IV nuclear reactor concept features peculiar characteristics such as the on-line fuel reprocessing, which prevents the use of commonly available burn-up codes. Besides, the presence of circulating nuclear fuel and radioactive streams from the core to the reprocessing plant requires a precise knowledge of the fuel isotopic composition during plant operation. The developed extension of SERPENT-2 directly takes into account the effects of on-line fuel reprocessing on burn-up calculations and features a reactivity control algorithm. It is here assessed against a dedicated version of the deterministic ERANOS-based EQL3D procedure (PSI-Switzerland) and adopted to analyze the MSFR fuel salt isotopic evolution. Particular attention is devoted to studying the effects of reprocessing time constants and efficiencies on the conversion ratio and the molar concentration of elements relevant for solubility issues (e.g., trivalent actinides and lanthanides). Quantities of interest for fuel handling and safety issues are investigated, including decay heat and activities of hazardous isotopes (neutron and high energy gamma emitters) in the core and in the reprocessing stream. The radiotoxicity generation is also analyzed for the MSFR nominal conditions. The production of helium and the depletion in tungsten content due to nuclear reactions are calculated for the nickel-based alloy selected as the reactor structural material of the MSFR. These preliminary evaluations can be helpful in studying the radiation damage of both the primary salt container and the axial reflectors.
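
    The essential extension, continuous on-line reprocessing, can be pictured as an extra removal term added to each nuclide's depletion equation. The single-nuclide sketch below illustrates the idea; the rates are hypothetical and nothing here reproduces SERPENT-2 internals.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Single-nuclide depletion with an on-line reprocessing removal term:
      #   dN/dt = production - lambda*N - (eps/tau)*N
      # eps = extraction efficiency, tau = reprocessing cycle time (hypothetical).
      production = 1.0e15                 # atoms/s
      lam = 1.0e-9                        # decay constant, 1/s
      eps, tau = 0.99, 20 * 86400.0       # efficiency and cycle time, s

      def rhs(t, N):
          return production - (lam + eps / tau) * N

      sol = solve_ivp(rhs, (0.0, 5 * 365.25 * 86400.0), [0.0], rtol=1e-8)
      N_eq = production / (lam + eps / tau)   # analytic equilibrium inventory
      print(f"final inventory: {sol.y[0, -1]:.3e}  equilibrium: {N_eq:.3e}")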

  19. Speech coding

    NASA Astrophysics Data System (ADS)

    Gersho, Allen

    1990-05-01

    Recent advances in algorithms and techniques for speech coding now permit high quality voice reproduction at remarkably low bit rates. The advent of powerful single-chip signal processors has made it cost effective to implement these new and sophisticated speech coding algorithms for many important applications in voice communication and storage. Some of the main ideas underlying the algorithms of major interest today are reviewed. The concept of removing redundancy by linear prediction is reviewed, first in the context of predictive quantization or DPCM. Then linear predictive coding, adaptive predictive coding, and vector quantization are discussed. The concepts of excitation coding via analysis-by-synthesis, vector sum excitation codebooks, and adaptive postfiltering are explained. The main ideas of vector excitation coding (VXC) or code-excited linear prediction (CELP) are presented. Finally, low-delay VXC coding and phonetic segmentation for VXC are described.
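
    As a pointer to the core technique, the sketch below fits linear-prediction coefficients to one frame via the Levinson-Durbin recursion, the step underlying DPCM, LPC, and CELP alike. The frame length, model order, and toy signal are arbitrary choices.

      import numpy as np

      def lpc(frame, order):
          """Linear-prediction coefficients via the Levinson-Durbin recursion."""
          n = len(frame)
          r = np.correlate(frame, frame, mode="full")[n - 1:n + order]  # autocorrelation
          a = np.zeros(order + 1)
          a[0], err = 1.0, r[0]
          for i in range(1, order + 1):
              k = -(r[i] + a[1:i] @ r[i - 1:0:-1]) / err   # reflection coefficient
              a[1:i] = a[1:i] + k * a[i - 1:0:-1]
              a[i] = k
              err *= 1.0 - k * k                           # residual energy update
          return a, err

      # A decaying sinusoid is predicted almost perfectly by a low-order model.
      n = np.arange(400)
      frame = np.exp(-n / 200.0) * np.sin(2 * np.pi * 0.07 * n)
      a, err = lpc(frame, order=10)
      print("residual energy fraction:", err / np.dot(frame, frame))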

  20. Synthesizing Certified Code

    NASA Technical Reports Server (NTRS)

    Whalen, Michael; Schumann, Johann; Fischer, Bernd

    2002-01-01

    Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
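
    To make the notion of an annotation concrete, the toy sketch below writes a loop invariant and a postcondition as executable Python assertions. A certification system of the kind described would instead discharge these as first-order proof obligations; the example is illustrative and is not drawn from AUTOBAYES.

      def sum_of_squares(xs):
          """Toy example of certifiable annotations, checked at runtime here."""
          total, i = 0.0, 0
          while i < len(xs):
              # Loop invariant: total equals the sum of squares of xs[0..i-1].
              assert abs(total - sum(x * x for x in xs[:i])) < 1e-9
              total += xs[i] * xs[i]
              i += 1
          assert total >= 0.0   # postcondition in the spirit of "operator safety"
          return total

      print(sum_of_squares([1.0, 2.0, 3.0]))  # 14.0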

  1. Uplink Coding

    NASA Technical Reports Server (NTRS)

    Pollara, Fabrizio; Hamkins, Jon; Dolinar, Sam; Andrews, Ken; Divsalar, Dariush

    2006-01-01

    This viewgraph presentation reviews uplink coding. The purpose and goals of the briefing are to: (1) show a plan for using uplink coding and describe its benefits; (2) define possible solutions and their applicability to different types of uplink, including emergency uplink; (3) concur with our conclusions so we can embark on a plan to use the proposed uplink system; (4) identify the need for the development of appropriate technology and its infusion into the DSN; and (5) gain advocacy to implement uplink coding in flight projects. Action Item EMB04-1-14 -- Show a plan for using uplink coding, including showing where it is useful or not (include discussion of emergency uplink coding).

  2. Investigation of plant control strategies for the supercritical C0{sub 2}Brayton cycle for a sodium-cooled fast reactor using the plant dynamics code.

    SciTech Connect

    Moisseytsev, A.; Sienicki, J.

    2011-04-12

    The development of a control strategy for the supercritical CO{sub 2} (S-CO{sub 2}) Brayton cycle has been extended to the investigation of alternate control strategies for a Sodium-Cooled Fast Reactor (SFR) nuclear power plant incorporating a S-CO{sub 2} Brayton cycle power converter. The SFR assumed is the 400 MWe (1000 MWt) ABR-1000 preconceptual design incorporating metallic fuel. Three alternative idealized schemes for controlling the reactor side of the plant, in combination with the existing automatic control strategy for the S-CO{sub 2} Brayton cycle, are explored using the ANL Plant Dynamics Code together with the SAS4A/SASSYS-1 Liquid Metal Reactor (LMR) Analysis Code System, coupled using the iterative coupling formulation previously developed and implemented in the Plant Dynamics Code. The first option assumes that the reactor side can be ideally controlled through movement of control rods and changing the speeds of both the primary and intermediate coolant system sodium pumps, such that the intermediate sodium flow rate and inlet temperature to the sodium-to-CO{sub 2} heat exchanger (RHX) remain unvarying, while the intermediate sodium outlet temperature changes as the load demand from the electric grid changes and the S-CO{sub 2} cycle conditions adjust according to the S-CO{sub 2} cycle control strategy. For this option, the reactor plant follows an assumed change in load demand from 100 to 0 % nominal at 5 % reduction per minute in a suitable fashion. The second option allows the reactor core power and primary and intermediate coolant system sodium pump flow rates to change autonomously in response to the strong reactivity feedbacks of the metallic-fueled core and assumed constant pump torques representing unchanging output from the pump electric motors. The plant behavior under the assumed load demand reduction is surprisingly close to that calculated for the first option. The only negative result observed is a slight increase in the intermediate

  3. True uniaxial compressive strengths of rock or coal specimens are independent of diameter-to-length ratios. Report of Investigations/1990

    SciTech Connect

    Babcock, C.O.

    1990-01-01

    The compressive strength of a test specimen of rock or coal in the laboratory, or of a pillar in a mine, comes partly from the material's intrinsic strength and partly from the constraint provided by the loading stresses. Much confusion in pillar design comes from assigning the total strength change to geometry, as evidenced by the many pillar design equations with width-to-height ratio as the primary variable. In tests by the U.S. Bureau of Mines, compressive strengths for cylindrical specimens of limestone, marble, sandstone, and coal were independent of the specimen test geometry when the end friction was removed. A conventional uniaxial compressive strength test between two steel platens is actually a uniaxial force test and not a uniaxial stress test. The biaxial or triaxial state of stress for much of the test volume changes with the geometry of the test specimen. Removing the end friction supplied by the steel platens produces a more nearly uniaxial stress state in the specimen, independent of the specimen geometry. Pillar design is a constraint and physical property problem rather than a geometry problem. Roof and floor constraint are major factors in pillar design and strength.

  4. An investigative study of multispectral data compression for remotely-sensed images using vector quantization and difference-mapped shift-coding

    NASA Technical Reports Server (NTRS)

    Jaggi, S.

    1993-01-01

    A study is conducted to investigate the effects and advantages of data compression techniques on multispectral imagery data acquired by NASA's airborne scanners at the Stennis Space Center. The first technique used was vector quantization. The vector is defined in the multispectral imagery context as an array of pixels from the same location from each channel. The error obtained in substituting the reconstructed images for the original set is compared for different compression ratios. Also, the eigenvalues of the covariance matrix obtained from the reconstructed data set are compared with the eigenvalues of the original set. The effects of varying the size of the vector codebook on the quality of the compression and on subsequent classification are also presented. The output data from the vector quantization algorithm were further compressed by a lossless technique called difference-mapped shift-extended Huffman coding. The overall compression for 7 channels of data acquired by the Calibrated Airborne Multispectral Scanner (CAMS) was 195:1 (0.041 bpp) with an RMS error of 15.8 pixels, and 18:1 (0.447 bpp) with an RMS error of 3.6 pixels. The algorithms were implemented in software and interfaced, with the help of dedicated image processing boards, to an 80386 PC-compatible computer. Modules were developed for the task of image compression and image analysis. Also, supporting software to perform image processing for visual display and interpretation of the compressed/classified images was developed.
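
    The vector-quantization stage described above treats the co-located pixels from all channels as a single vector. The sketch below builds such a codebook with k-means on synthetic 7-channel data; the codebook size, image size, and data are all illustrative assumptions.

      import numpy as np
      from sklearn.cluster import KMeans

      # Synthetic 7-channel image standing in for multispectral data.
      rng = np.random.default_rng(0)
      img = rng.random((7, 64, 64)).astype(np.float32)   # (channels, rows, cols)

      # One vector = the array of co-located pixels from all channels.
      vectors = img.reshape(7, -1).T                     # (n_pixels, 7)

      # 256-entry codebook: each pixel position is then stored as one 8-bit index.
      kmeans = KMeans(n_clusters=256, n_init=4, random_state=0).fit(vectors)
      indices = kmeans.predict(vectors)
      reconstructed = kmeans.cluster_centers_[indices].T.reshape(img.shape)

      rms = float(np.sqrt(np.mean((img - reconstructed) ** 2)))
      print(f"RMS error: {rms:.4f}  rate: 8 bits per 7-channel vector")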

  5. Scalable video coding in frequency domain

    NASA Astrophysics Data System (ADS)

    Civanlar, Mehmet R.; Puri, Atul

    1992-11-01

    Scalable video coding is important in a number of applications where video needs to be decoded and displayed at a variety of resolution scales. It is more efficient than simulcasting, in which all desired resolution scales are coded independently of one another within the constraint of a fixed available bandwidth. In this paper, we focus on scalability using the frequency domain approach. We employ the framework proposed for the ongoing second phase of the Motion Picture Experts Group (MPEG-2) standard to study the performance of one such scheme and investigate improvements aimed at increasing its efficiency. Practical issues related to multiplexing of encoded data of various resolution scales to facilitate decoding are considered. Simulations are performed to investigate the potential of a chosen frequency domain scheme. Various prospects and limitations are also discussed.
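
    One way to picture frequency-domain scalability: the low-frequency transform coefficients form a base layer, and the remaining bands form enhancement layers, so a decoder can stop after the base layer and still show a coarse picture. The single-frame DCT sketch below is illustrative only; the actual MPEG-2 frequency-scalable syntax is more involved.

      import numpy as np
      from scipy.fft import dctn, idctn

      def frequency_layers(frame, base=8):
          """Split a frame's 2-D DCT into a low-frequency base layer plus an
          enhancement layer holding the remaining coefficients."""
          C = dctn(frame, norm="ortho")
          base_coeffs = np.zeros_like(C)
          base_coeffs[:base, :base] = C[:base, :base]    # low-frequency block
          return idctn(base_coeffs, norm="ortho"), idctn(C, norm="ortho")

      frame = np.random.default_rng(1).random((64, 64))
      coarse, full = frequency_layers(frame)
      print("base-layer MSE:", float(np.mean((frame - coarse) ** 2)))
      print("base+enhancement round-trip max error:", float(np.max(np.abs(frame - full))))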

  6. Extension of the supercritical carbon dioxide brayton cycle to low reactor power operation: investigations using the coupled anl plant dynamics code-SAS4A/SASSYS-1 liquid metal reactor code system.

    SciTech Connect

    Moisseytsev, A.; Sienicki, J. J.

    2012-05-10

    Significant progress has been made on the development of a control strategy for the supercritical carbon dioxide (S-CO{sub 2}) Brayton cycle enabling removal of power from an autonomous load following Sodium-Cooled Fast Reactor (SFR) down to decay heat levels such that the S-CO{sub 2} cycle can be used to cool the reactor until decay heat can be removed by the normal shutdown heat removal system or a passive decay heat removal system such as Direct Reactor Auxiliary Cooling System (DRACS) loops with DRACS in-vessel heat exchangers. This capability of the new control strategy eliminates the need for use of a separate shutdown heat removal system which might also use supercritical CO{sub 2}. It has been found that this capability can be achieved by introducing a new control mechanism involving shaft speed control for the common shaft joining the turbine and two compressors following reduction of the load demand from the electrical grid to zero. Following disconnection of the generator from the electrical grid, heat is removed from the intermediate sodium circuit through the sodium-to-CO{sub 2} heat exchanger, the turbine solely drives the two compressors, and heat is rejected from the cycle through the CO{sub 2}-to-water cooler. To investigate the effectiveness of shaft speed control, calculations are carried out using the coupled Plant Dynamics Code-SAS4A/SASSYS-1 code for a linear load reduction transient for a 1000 MWt metallic-fueled SFR with autonomous load following. No deliberate motion of control rods or adjustment of sodium pump speeds is assumed to take place. It is assumed that the S-CO{sub 2} turbomachinery shaft speed linearly decreases from 100 to 20% nominal following reduction of grid load to zero. The reactor power is calculated to autonomously decrease down to 3% nominal providing a lengthy window in time for the switchover to the normal shutdown heat removal system or for a passive decay heat removal system to become effective. However, the

  7. Phonological coding during reading

    PubMed Central

    Leinenger, Mallorie

    2014-01-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early (pre-lexical) or that phonological codes come online late (post-lexical)) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eyetracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model (Van Orden, 1987), dual-route model (e.g., Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001), parallel distributed processing model (Seidenberg & McClelland, 1989)) are discussed. PMID:25150679

  8. Phonological coding during reading.

    PubMed

    Leinenger, Mallorie

    2014-11-01

    The exact role that phonological coding (the recoding of written, orthographic information into a sound based code) plays during silent reading has been extensively studied for more than a century. Despite the large body of research surrounding the topic, varying theories as to the time course and function of this recoding still exist. The present review synthesizes this body of research, addressing the topics of time course and function in tandem. The varying theories surrounding the function of phonological coding (e.g., that phonological codes aid lexical access, that phonological codes aid comprehension and bolster short-term memory, or that phonological codes are largely epiphenomenal in skilled readers) are first outlined, and the time courses that each maps onto (e.g., that phonological codes come online early [prelexical] or that phonological codes come online late [postlexical]) are discussed. Next the research relevant to each of these proposed functions is reviewed, discussing the varying methodologies that have been used to investigate phonological coding (e.g., response time methods, reading while eye-tracking or recording EEG and MEG, concurrent articulation) and highlighting the advantages and limitations of each with respect to the study of phonological coding. In response to the view that phonological coding is largely epiphenomenal in skilled readers, research on the use of phonological codes in prelingually, profoundly deaf readers is reviewed. Finally, implications for current models of word identification (activation-verification model, Van Orden, 1987; dual-route model, e.g., M. Coltheart, Rastle, Perry, Langdon, & Ziegler, 2001; parallel distributed processing model, Seidenberg & McClelland, 1989) are discussed. PMID:25150679

  9. Contemporary accuracy of death certificates for coding prostate cancer as a cause of death: Is reliance on death certification good enough? A comparison with blinded review by an independent cause of death evaluation committee

    PubMed Central

    Turner, Emma L; Metcalfe, Chris; Donovan, Jenny L; Noble, Sian; Sterne, Jonathan A C; Lane, J Athene; I Walsh, Eleanor; Hill, Elizabeth M; Down, Liz; Ben-Shlomo, Yoav; Oliver, Steven E; Evans, Simon; Brindle, Peter; Williams, Naomi J; Hughes, Laura J; Davies, Charlotte F; Ng, Siaw Yein; Neal, David E; Hamdy, Freddie C; Albertsen, Peter; Reid, Colette M; Oxley, Jon; McFarlane, John; Robinson, Mary C; Adolfsson, Jan; Zietman, Anthony; Baum, Michael; Koupparis, Anthony; Martin, Richard M

    2016-01-01

    Background: Accurate cause of death assignment is crucial for prostate cancer epidemiology and trials reporting prostate cancer-specific mortality outcomes. Methods: We compared death certificate information with independent cause of death evaluation by an expert committee within a prostate cancer trial (2002–2015). Results: Of 1236 deaths assessed, expert committee evaluation attributed 523 (42%) to prostate cancer, agreeing with death certificate cause of death in 1134 cases (92%, 95% CI: 90%, 93%). The sensitivity of death certificates in identifying prostate cancer deaths as classified by the committee was 91% (95% CI: 89%, 94%); specificity was 92% (95% CI: 90%, 94%). Sensitivity and specificity were lower where death occurred within 1 year of diagnosis, and where there was another primary cancer diagnosis. Conclusions: UK death certificates accurately identify cause of death in men with prostate cancer, supporting their use in routine statistics. Possible differential misattribution by trial arm supports independent evaluation in randomised trials. PMID:27253172
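
    For reference, the reported sensitivity and specificity follow from the usual 2 x 2 comparison of certificate cause against the committee classification. The cell counts below are hypothetical but chosen to be consistent with the reported totals (1236 deaths, 523 committee-attributed to prostate cancer, 1134 agreements).

      def sensitivity_specificity(tp, fn, tn, fp):
          """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
          return tp / (tp + fn), tn / (tn + fp)

      tp, fn = 478, 45   # certificate confirms / misses a committee PCa death
      tn, fp = 656, 57   # certificate agrees non-PCa / wrongly assigns PCa
      sens, spec = sensitivity_specificity(tp, fn, tn, fp)
      print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # ~91%, ~92%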

  10. Material-dependent and material-independent selection processes in the frontal and parietal lobes: an event-related fMRI investigation of response competition

    NASA Technical Reports Server (NTRS)

    Hazeltine, Eliot; Bunge, Silvia A.; Scanlon, Michael D.; Gabrieli, John D E.

    2003-01-01

    The present study used the flanker task [Percept. Psychophys. 16 (1974) 143] to identify neural structures that support response selection processes, and to determine which of these structures respond differently depending on the type of stimulus material associated with the response. Participants performed two versions of the flanker task while undergoing event-related functional magnetic resonance imaging (fMRI). Both versions of the task required participants to respond to a central stimulus regardless of the responses associated with simultaneously presented flanking stimuli, but one used colored circle stimuli and the other used letter stimuli. Competition-related activation was identified by comparing Incongruent trials, in which the flanker stimuli indicated a different response than the central stimulus, to Neutral stimuli, in which the flanker stimuli indicated no response. A region within the right inferior frontal gyrus exhibited significantly more competition-related activation for the color stimuli, whereas regions within the middle frontal gyri of both hemispheres exhibited more competition-related activation for the letter stimuli. The border of the right middle frontal and inferior frontal gyri and the anterior cingulate cortex (ACC) were significantly activated by competition for both types of stimulus materials. Posterior foci demonstrated a similar pattern: left inferior parietal cortex showed greater competition-related activation for the letters, whereas right parietal cortex was significantly activated by competition for both materials. These findings indicate that the resolution of response competition invokes both material-dependent and material-independent processes.

  11. Computer Code

    NASA Technical Reports Server (NTRS)

    1985-01-01

    COSMIC MINIVER, a computer code developed by NASA for analyzing aerodynamic heating and heat transfer on the Space Shuttle, has been used by Marquardt Company to analyze heat transfer on Navy/Air Force missile bodies. The code analyzes heat transfer by four different methods which can be compared for accuracy. MINIVER saved Marquardt three months in computer time and $15,000.

  12. Is ADHD a Risk Factor Independent of Conduct Disorder for Illicit Substance Use? A Meta-Analysis and Meta-Regression Investigation

    ERIC Educational Resources Information Center

    Serra-Pinheiro, Maria Antonia; Coutinho, Evandro S. F.; Souza, Isabella S.; Pinna, Camilla; Fortes, Didia; Araujo, Catia; Szobot, Claudia M.; Rohde, Luis A.; Mattos, Paulo

    2013-01-01

    Objective: To investigate meta-analytically if the association between ADHD and illicit substance use (ISU) is maintained when controlling for conduct disorder/oppositional-defiant disorder (CD/ODD). Method: A systematic literature review was conducted through Medline from 1980 to 2008. Data extracted and selections made by one author were…

  13. An Evaluation of Two Different Methods of Assessing Independent Investigations in an Operational Pre-University Level Examination in Biology in England.

    ERIC Educational Resources Information Center

    Brown, Chris

    1998-01-01

    Explored aspects of the assessment of an extended investigation ("project") as practiced in the operational examinations of the University of Cambridge Local Examinations Syndicate (UCLES), from the perspective of construct validity. Samples of the 1993 (n=333) and 1996 (n=259) biology test results reveal two methods of assessing the project. (MAK)

  14. A dynamic population model to investigate effects of climate and climate-independent factors on the lifecycle of the tick Amblyomma americanum (Acari: Ixodidae)

    USGS Publications Warehouse

    Ludwig, Antoinette; Ginsberg, Howard; Hickling, Graham J.; Ogden, Nicholas H.

    2015-01-01

    The lone star tick, Amblyomma americanum, is a disease vector of significance for human and animal health throughout much of the eastern United States. To model the potential effects of climate change on this tick, a better understanding is needed of the relative roles of temperature-dependent and temperature-independent (day-length-dependent behavioral or morphogenetic diapause) processes acting on the tick lifecycle. In this study, we explored the roles of these processes by simulating seasonal activity patterns using models with site-specific temperature and day-length-dependent processes. We first modeled the transitions from engorged larvae to feeding nymphs, engorged nymphs to feeding adults, and engorged adult females to feeding larvae. The simulated seasonal patterns were compared against field observations at three locations in the United States. Simulations suggested that 1) during the larva-to-nymph transition, some larvae undergo no diapause while others undergo morphogenetic diapause of engorged larvae; 2) molted adults undergo behavioral diapause during the transition from nymph-to-adult; and 3) there is no diapause during the adult-to-larva transition. A model constructed to simulate the full lifecycle of A. americanum successfully predicted observed tick activity at the three U.S. study locations. Some differences between observed and simulated seasonality patterns were observed, however, identifying the need for research to refine some model parameters. In simulations run using temperature data for Montreal, deterministic die-out of A. americanum populations did not occur, suggesting the possibility that current climate in parts of southern Canada is suitable for survival and reproduction of this tick.

  15. A Dynamic Population Model to Investigate Effects of Climate and Climate-Independent Factors on the Lifecycle of Amblyomma americanum (Acari: Ixodidae).

    PubMed

    Ludwig, Antoinette; Ginsberg, Howard S; Hickling, Graham J; Ogden, Nicholas H

    2016-01-01

    The lone star tick, Amblyomma americanum, is a disease vector of significance for human and animal health throughout much of the eastern United States. To model the potential effects of climate change on this tick, a better understanding is needed of the relative roles of temperature-dependent and temperature-independent (day-length-dependent behavioral or morphogenetic diapause) processes acting on the tick lifecycle. In this study, we explored the roles of these processes by simulating seasonal activity patterns using models with site-specific temperature and day-length-dependent processes. We first modeled the transitions from engorged larvae to feeding nymphs, engorged nymphs to feeding adults, and engorged adult females to feeding larvae. The simulated seasonal patterns were compared against field observations at three locations in the United States. Simulations suggested that 1) during the larva-to-nymph transition, some larvae undergo no diapause while others undergo morphogenetic diapause of engorged larvae; 2) molted adults undergo behavioral diapause during the transition from nymph-to-adult; and 3) there is no diapause during the adult-to-larva transition. A model constructed to simulate the full lifecycle of A. americanum successfully predicted observed tick activity at the three U.S. study locations. Some differences between observed and simulated seasonality patterns were observed, however, identifying the need for research to refine some model parameters. In simulations run using temperature data for Montreal, deterministic die-out of A. americanum populations did not occur, suggesting the possibility that current climate in parts of southern Canada is suitable for survival and reproduction of this tick. PMID:26502753

  16. Suboptimum decoding of block codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao

    1991-01-01

    This paper investigates a class of decomposable codes and their distance and structural properties. It is shown that this class includes several classes of well-known and efficient codes as subclasses. Several methods for constructing decomposable codes or decomposing codes are presented. A two-stage soft-decision decoding scheme for decomposable codes, their translates, or unions of translates is devised. This two-stage soft-decision decoding is suboptimum and provides an excellent trade-off between error performance and decoding complexity for codes of moderate and long block length.

  17. Functional Investigation of a Non-coding Variant Associated with Adolescent Idiopathic Scoliosis in Zebrafish: Elevated Expression of the Ladybird Homeobox Gene Causes Body Axis Deformation

    PubMed Central

    Guo, Long; Yamashita, Hiroshi; Kou, Ikuyo; Takimoto, Aki; Meguro-Horike, Makiko; Horike, Shin-ichi; Sakuma, Tetsushi; Miura, Shigenori; Adachi, Taiji; Yamamoto, Takashi; Ikegawa, Shiro; Hiraki, Yuji; Shukunami, Chisa

    2016-01-01

    Previously, we identified an adolescent idiopathic scoliosis susceptibility locus near human ladybird homeobox 1 (LBX1) and FLJ41350 by a genome-wide association study. Here, we characterized the associated non-coding variant and investigated the function of these genes. A chromosome conformation capture assay revealed that the genome region with the most significantly associated single nucleotide polymorphism (rs11190870) physically interacted with the promoter region of LBX1-FLJ41350. The promoter in the direction of LBX1, combined with a 590-bp region including rs11190870, had higher transcriptional activity with the risk allele than that with the non-risk allele in HEK 293T cells. The ubiquitous overexpression of human LBX1 or any of the zebrafish lbx genes (lbx1a, lbx1b, and lbx2), but not FLJ41350, in zebrafish embryos caused body curvature followed by death prior to vertebral column formation. Such body axis deformation was not observed in transcription activator-like effector nuclease (TALEN)-mediated knockout zebrafish of lbx1b or lbx2. Mosaic expression of lbx1b driven by the GATA2 minimal promoter and the lbx1b enhancer in zebrafish significantly alleviated the embryonic lethal phenotype to allow observation of the later onset of the spinal curvature with or without vertebral malformation. Deformation of the embryonic body axis by lbx1b overexpression was associated with defects in convergent extension, which is a component of the main axis-elongation machinery in gastrulating embryos. In embryos overexpressing lbx1b, wnt5b, a ligand of the non-canonical Wnt/planar cell polarity (PCP) pathway, was significantly downregulated. Injection of mRNA for wnt5b or RhoA, a key downstream effector of Wnt/PCP signaling, rescued the defective convergent extension phenotype and attenuated the lbx1b-induced curvature of the body axis. Thus, our study presents a novel pathological feature of LBX1 and its zebrafish homologs in body axis deformation at various stages of

  18. Independence Generalizing Monotone and Boolean Independences

    NASA Astrophysics Data System (ADS)

    Hasebe, Takahiro

    2011-01-01

    We define conditionally monotone independence in two states which interpolates monotone and Boolean ones. This independence is associative, and therefore leads to a natural probability theory in a non-commutative algebra.

  19. Coded aperture computed tomography

    NASA Astrophysics Data System (ADS)

    Choi, Kerkil; Brady, David J.

    2009-08-01

    Diverse physical measurements can be modeled by X-ray transforms. While X-ray tomography is the canonical example, reference structure tomography (RST) and coded aperture snapshot spectral imaging (CASSI) are examples of physically unrelated but mathematically equivalent sensor systems. Historically, most X-ray-transform-based systems sample continuous distributions and apply analytical inversion processes. On the other hand, RST and CASSI generate discrete multiplexed measurements implemented with coded apertures. This multiplexing of coded measurements allows for compression of measurements from a compressed sensing perspective. Compressed sensing (CS) is the insight that if the object has a sparse representation in some basis, then a certain number of random projections, typically far fewer than prescribed by the Shannon sampling rate, captures enough information for a highly accurate reconstruction of the object. This paper investigates the role of coded apertures in x-ray transform measurement systems (XTMs) in terms of data efficiency and reconstruction fidelity from a CS perspective. To conduct this analysis, we construct a unified framework using RST and CASSI measurement models. Also, we propose a novel compressive x-ray tomography measurement scheme which also exploits coding and multiplexing, and hence shares the analysis of the other two XTMs. Using this analysis, we perform a qualitative study on how coded apertures can be exploited to implement physical random projections by "regularizing" the measurement systems. Numerical studies and simulation results demonstrate several examples of the impact of coding.
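
    A minimal sketch of the compressed-sensing claim: random projections of a sparse object, recovered by iterative soft thresholding (ISTA). The problem sizes, the identity sparsifying basis, and the threshold are toy choices, not the paper's measurement systems.

      import numpy as np

      rng = np.random.default_rng(0)
      n, m, k = 256, 80, 8                      # object size, measurements, sparsity
      x_true = np.zeros(n)
      x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)

      A = rng.normal(size=(m, n)) / np.sqrt(m)  # random projections (m << n)
      y = A @ x_true                            # multiplexed measurements

      # ISTA: x <- soft(x + A^T (y - A x) / L, lam / L), with L >= ||A||_2^2
      L = np.linalg.norm(A, 2) ** 2
      lam, x = 0.01, np.zeros(n)
      for _ in range(500):
          z = x + A.T @ (y - A @ x) / L
          x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

      print("relative recovery error:",
            np.linalg.norm(x - x_true) / np.linalg.norm(x_true))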

  20. Coding for reliable satellite communications

    NASA Technical Reports Server (NTRS)

    Gaarder, N. T.; Lin, S.

    1986-01-01

    This research project was set up to study various kinds of coding techniques for error control in satellite and space communications for NASA Goddard Space Flight Center. During the project period, researchers investigated the following areas: (1) decoding of Reed-Solomon codes in terms of dual basis; (2) concatenated and cascaded error control coding schemes for satellite and space communications; (3) use of hybrid coding schemes (error correction and detection incorporated with retransmission) to improve system reliability and throughput in satellite communications; (4) good codes for simultaneous error correction and error detection, and (5) error control techniques for ring and star networks.

  1. QR Codes

    ERIC Educational Resources Information Center

    Lai, Hsin-Chih; Chang, Chun-Yen; Li, Wen-Shiane; Fan, Yu-Lin; Wu, Ying-Tien

    2013-01-01

    This study presents an m-learning method that incorporates Integrated Quick Response (QR) codes. This learning method not only achieves the objectives of outdoor education, but it also increases applications of Cognitive Theory of Multimedia Learning (CTML) (Mayer, 2001) in m-learning for practical use in a diverse range of outdoor locations. When…

  2. Getting Students to be Successful, Independent Investigators

    ERIC Educational Resources Information Center

    Thomas, Jeffrey D.

    2010-01-01

    Middle school students often struggle when writing testable problems, planning valid and reliable procedures, and drawing meaningful evidence-based conclusions. To address this issue, the author created a student-centered lab handout to facilitate the inquiry process for students. This handout has reduced students' frustration and helped them…

  3. Independent Peer Reviews

    SciTech Connect

    2012-03-16

    Independent Assessments: DOE's Systems Integrator convenes independent technical reviews to gauge progress toward meeting specific technical targets and to provide technical information necessary for key decisions.

  4. 32 CFR 635.19 - Offense codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 32 (National Defense), Law Enforcement Reporting, Offense Reporting, § 635.19 Offense codes: (a) The offense code describes, as nearly as possible, the complaint or offense by using an alphanumeric code. Appendix C of AR...

  5. Cyclic unequal error protection codes constructed from cyclic codes of composite length

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1987-01-01

    The distance structure of cyclic codes of composite length was investigated. A lower bound on the minimum distance for this class of codes is derived. In many cases, the lower bound gives the true minimum distance of a code. The distance structure of the direct sum of two cyclic codes of composite length was then investigated. It was shown that, under certain conditions, the direct-sum code provides two levels of error-correcting capability, and hence is a two-level unequal error protection (UEP) code. Finally, a class of two-level UEP cyclic direct-sum codes and a decoding algorithm for a subclass of these codes are presented.

  6. Covariance Matrix Evaluations for Independent Mass Fission Yields

    NASA Astrophysics Data System (ADS)

    Terranova, N.; Serot, O.; Archier, P.; De Saint Jean, C.; Sumini, M.

    2015-01-01

    Recent needs for more accurate fission product yields include covariance information to allow improved uncertainty estimations of the parameters used by design codes. The aim of this work is to investigate the possibility to generate more reliable and complete uncertainty information on independent mass fission yields. Mass yields covariances are estimated through a convolution between the multi-Gaussian empirical model based on Brosa's fission modes, which describe the pre-neutron mass yields, and the average prompt neutron multiplicity curve. The covariance generation task has been approached using the Bayesian generalized least squared method through the CONRAD code. Preliminary results on mass yields variance-covariance matrix will be presented and discussed from physical grounds in the case of 235U(nth, f) and 239Pu(nth, f) reactions.
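
    The covariance-generation step can be sketched generically: sample the multi-Gaussian (Brosa-mode) parameters, evaluate the mass-yield curve for each draw, and take the empirical covariance of the results. The toy model below uses hypothetical mode parameters and omits the CONRAD/Bayesian machinery entirely.

      import numpy as np

      masses = np.arange(70, 171)

      def mass_yield(A, mu, sigma):
          """Toy two-mode yield curve: Gaussians placed symmetrically about A=118."""
          g = lambda c: np.exp(-0.5 * ((A - c) / sigma) ** 2)
          y = g(118.0 - mu) + g(118.0 + mu)
          return y / y.sum()

      # Monte Carlo propagation of (hypothetical) parameter uncertainties.
      rng = np.random.default_rng(0)
      samples = np.array([mass_yield(masses,
                                     rng.normal(21.0, 0.5),    # mode position
                                     rng.normal(5.5, 0.2))     # mode width
                          for _ in range(2000)])

      cov = np.cov(samples, rowvar=False)    # variance-covariance of mass yields
      print("covariance matrix shape:", cov.shape)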

  7. Covariance Matrix Evaluations for Independent Mass Fission Yields

    SciTech Connect

    Terranova, N.; Serot, O.; Archier, P.; De Saint Jean, C.

    2015-01-15

    Recent needs for more accurate fission product yields include covariance information to allow improved uncertainty estimations of the parameters used by design codes. The aim of this work is to investigate the possibility to generate more reliable and complete uncertainty information on independent mass fission yields. Mass yields covariances are estimated through a convolution between the multi-Gaussian empirical model based on Brosa's fission modes, which describe the pre-neutron mass yields, and the average prompt neutron multiplicity curve. The covariance generation task has been approached using the Bayesian generalized least squared method through the CONRAD code. Preliminary results on mass yields variance-covariance matrix will be presented and discussed from physical grounds in the case of {sup 235}U(n{sub th}, f) and {sup 239}Pu(n{sub th}, f) reactions.

  8. Experimental investigation of neutronic characteristics of the IR-8 reactor to confirm the results of calculations by MCU-PTR code

    SciTech Connect

    Surkov, A. V. Kochkin, V. N.; Pesnya, Yu. E.; Nasonov, V. A.; Vihrov, V. I.; Erak, D. Yu.

    2015-12-15

    A comparison of measured and calculated neutronic characteristics (fast neutron flux and fission rate of {sup 235}U) in the core and reflector of the IR-8 reactor is presented. The irradiation devices equipped with neutron activation detectors were prepared. The determination of fast neutron flux was performed using the {sup 54}Fe (n, p) and {sup 58}Ni (n, p) reactions. The {sup 235}U fission rate was measured using uranium dioxide with 10% enrichment in {sup 235}U. The determination of specific activities of detectors was carried out by measuring the intensity of characteristic gamma peaks using the ORTEC gamma spectrometer. Neutron fields in the core and reflector of the IR-8 reactor were calculated using the MCU-PTR code.

  9. Experimental investigation of neutronic characteristics of the IR-8 reactor to confirm the results of calculations by MCU-PTR code

    NASA Astrophysics Data System (ADS)

    Surkov, A. V.; Kochkin, V. N.; Pesnya, Yu. E.; Nasonov, V. A.; Vihrov, V. I.; Erak, D. Yu.

    2015-12-01

    A comparison of measured and calculated neutronic characteristics (fast neutron flux and fission rate of 235U) in the core and reflector of the IR-8 reactor is presented. The irradiation devices equipped with neutron activation detectors were prepared. The determination of fast neutron flux was performed using the 54Fe (n, p) and 58Ni (n, p) reactions. The 235U fission rate was measured using uranium dioxide with 10% enrichment in 235U. The determination of specific activities of detectors was carried out by measuring the intensity of characteristic gamma peaks using the ORTEC gamma spectrometer. Neutron fields in the core and reflector of the IR-8 reactor were calculated using the MCU-PTR code.

  10. Multiple wavelet-tree-based image coding and robust transmission

    NASA Astrophysics Data System (ADS)

    Cao, Lei; Chen, Chang Wen

    2004-10-01

    In this paper, we present techniques based on multiple wavelet-tree coding for robust image transmission. The algorithm of set partitioning in hierarchical trees (SPIHT) is a state-of-the-art technique for image compression. This variable length coding (VLC) technique, however, is extremely sensitive to channel errors. To improve the error resilience capability and in the meantime to keep the high source coding efficiency through VLC, we propose to encode each wavelet tree or a group of wavelet trees using the SPIHT algorithm independently. Instead of encoding the entire image as one bitstream, multiple bitstreams are generated. Therefore, error propagation is limited within an individual bitstream. Two methods based on subsampling and human visual sensitivity are proposed to group the wavelet trees. The multiple bitstreams are further protected by rate-compatible punctured convolutional (RCPC) codes. Unequal error protection is provided both for different bitstreams and for different bit segments inside each bitstream. We also investigate the improvement of error resilience through error-resilient entropy coding (EREC) and wavelet-tree coding when channels are slightly corruptive. A simple post-processing technique is also proposed to alleviate the effect of residual errors. We demonstrate through simulations that systems with these techniques can achieve much better performance than systems transmitting a single bitstream in noisy environments.
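
    The subsampling-based grouping can be pictured as partitioning the image into polyphase components, each coded as an independent bitstream, so that a channel error corrupts only one of them and its loss can be concealed from the surviving neighbors. The sketch below shows just that split and concealment on a smooth test image; it is a plain polyphase illustration, not the wavelet-tree bookkeeping itself.

      import numpy as np

      def polyphase_split(img):
          """Partition an image into 4 subsampled components; each component
          would be wavelet-transformed and SPIHT-coded independently."""
          return [img[0::2, 0::2], img[0::2, 1::2], img[1::2, 0::2], img[1::2, 1::2]]

      def polyphase_merge(parts, shape):
          out = np.empty(shape)
          out[0::2, 0::2], out[0::2, 1::2], out[1::2, 0::2], out[1::2, 1::2] = parts
          return out

      x = np.linspace(0.0, np.pi, 64)
      img = np.outer(np.sin(x), np.cos(2 * x))            # smooth test image
      parts = polyphase_split(img)
      parts[3] = (parts[0] + parts[1] + parts[2]) / 3.0   # conceal one lost stream
      recovered = polyphase_merge(parts, img.shape)
      print("concealment MSE:", float(np.mean((img - recovered) ** 2)))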

  11. Seals Code Development Workshop

    NASA Technical Reports Server (NTRS)

    Hendricks, Robert C. (Compiler); Liang, Anita D. (Compiler)

    1996-01-01

    The 1995 Seals Workshop industrial code (INDSEAL) release includes ICYL, GCYLT, IFACE, GFACE, SPIRALG, SPIRALI, DYSEAL, and KTK. The scientific code (SCISEAL) release includes conjugate heat transfer and multidomain modeling with rotordynamic capability. Several seals and bearings codes (e.g., HYDROFLEX, HYDROTRAN, HYDROB3D, FLOWCON1, FLOWCON2) are presented and their results compared. Current computational and experimental emphasis includes multiple connected cavity flows, with the goals of reducing parasitic losses and gas ingestion. Labyrinth seals continue to play a significant role in sealing, with face, honeycomb, and new sealing concepts under investigation for advanced engine concepts in view of strict environmental constraints. The clean-sheet approach to engine design is advocated, with program directions and anticipated percentage SFC reductions cited. Future activities center on engine applications with coupled seal/power/secondary flow streams.

  12. Independent Schools - Independent Thinking - Independent Art: Testing Assumptions.

    ERIC Educational Resources Information Center

    Carnes, Virginia

    This study consists of a review of selected educational reform issues from the past 10 years that deal with changing attitudes towards art and art instruction in the context of independent private sector schools. The major focus of the study is in visual arts and examines various programs and initiatives with an art focus. Programs include…

  13. Implementation issues in source coding

    NASA Technical Reports Server (NTRS)

    Sayood, Khalid; Chen, Yun-Chung; Hadenfeldt, A. C.

    1989-01-01

    An edge-preserving image coding scheme that can be operated in both a lossy and a lossless manner was developed. The technique is an extension of the lossless encoding algorithm developed for the Mars Observer spectral data. It can also be viewed as a modification of the DPCM algorithm. A packet video simulator was also developed from an existing modified packet network simulator. The coding scheme for this system is a modification of the mixture block coding (MBC) scheme described in the last report. Coding algorithms for packet video were also investigated.
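
    Since the scheme is described as a modification of DPCM, the sketch below shows plain DPCM with a uniform quantizer: on integer input a step of 1 makes the loop lossless, and larger steps trade error for rate. This is generic DPCM, not the authors' edge-preserving variant.

      import numpy as np

      def dpcm(signal, step):
          """Plain DPCM: quantize the prediction residual; the encoder tracks
          the decoder's reconstruction so errors do not accumulate."""
          prev, codes, recon = 0.0, [], []
          for s in signal:
              q = int(np.round((s - prev) / step))  # quantized residual
              prev = prev + q * step                # decoder-side reconstruction
              codes.append(q)
              recon.append(prev)
          return np.array(codes), np.array(recon)

      x = np.round(100 * np.sin(np.linspace(0, 4, 200)))  # integer test signal
      for step in (1, 4):                                 # step=1 -> lossless here
          _, r = dpcm(x, step)
          print(f"step={step}: max reconstruction error = {np.abs(x - r).max():.1f}")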

  14. Parallelization of the SIR code

    NASA Astrophysics Data System (ADS)

    Thonhofer, S.; Bellot Rubio, L. R.; Utz, D.; Jurčak, J.; Hanslmeier, A.; Piantschitsch, I.; Pauritsch, J.; Lemmerer, B.; Guttenbrunner, S.

    A high-resolution 3-dimensional model of the photospheric magnetic field is essential for the investigation of small-scale solar magnetic phenomena. The SIR code is an advanced Stokes-inversion code that deduces physical quantities, e.g. magnetic field vector, temperature, and LOS velocity, from spectropolarimetric data. We extended this code by the capability of directly using large data sets and inverting the pixels in parallel. Due to this parallelization it is now feasible to apply the code directly on extensive data sets. Besides, we included the possibility to use different initial model atmospheres for the inversion, which enhances the quality of the results.
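
    Because each pixel's inversion is independent of the others, the parallelization is of the embarrassingly parallel kind. The sketch below illustrates the pattern with Python's multiprocessing; the per-pixel function is a dummy stand-in for the actual SIR inversion.

      import numpy as np
      from multiprocessing import Pool

      def invert_pixel(stokes_profiles):
          """Stand-in for a single-pixel Stokes inversion (the real code fits
          temperature, magnetic field vector, LOS velocity, etc.)."""
          return float(np.abs(stokes_profiles).sum())

      if __name__ == "__main__":
          # Hypothetical data cube: 64 x 64 pixels, 4 Stokes parameters, 100 wavelengths.
          cube = np.random.default_rng(0).random((64, 64, 4, 100))
          pixels = [cube[iy, ix] for iy in range(64) for ix in range(64)]
          with Pool() as pool:                          # pixels are independent, so a
              results = pool.map(invert_pixel, pixels)  # plain map parallelizes them
          result_map = np.array(results).reshape(64, 64)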

  15. The Integrated TIGER Series Codes

    Energy Science and Technology Software Center (ESTSC)

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  16. The Integrated TIGER Series Codes

    SciTech Connect

    Kensek, Ronald P.; Franke, Brian C.; Laub, Thomas W.

    2006-01-15

    ITS is a powerful and user-friendly software package permitting state-of-the-art Monte Carlo solution of linear time-independent coupled electron/photon radiation transport problems, with or without the presence of macroscopic electric and magnetic fields of arbitrary spatial dependence. Our goal has been to simultaneously maximize operational simplicity and physical accuracy. Through a set of preprocessor directives, the user selects one of the many ITS codes. The ease with which the makefile system is applied combines with an input scheme based on order-independent descriptive keywords that makes maximum use of defaults and internal error checking to provide experimentalists and theorists alike with a method for the routine but rigorous solution of sophisticated radiation transport problems. Physical rigor is provided by employing accurate cross sections, sampling distributions, and physical models for describing the production and transport of the electron/photon cascade from 1.0 GeV down to 1.0 keV. The availability of source code permits the more sophisticated user to tailor the codes to specific applications and to extend the capabilities of the codes to more complex applications. Version 5.0, the latest version of ITS, contains (1) improvements to the ITS 3.0 continuous-energy codes, (2) multigroup codes with adjoint transport capabilities, (3) parallel implementations of all ITS codes, (4) a general purpose geometry engine for linking with CAD or other geometry formats, and (5) the Cholla facet geometry library. Moreover, the general user friendliness of the software has been enhanced through increased internal error checking and improved code portability.

  17. A preliminary investigation of Large Eddy Simulation (LES) of the flow around a cylinder at ReD = 3900 using a commercial CFD code

    SciTech Connect

    Paschkewitz, J S

    2006-02-14

    Engineering fluid mechanics simulations at high Reynolds numbers have traditionally been performed using the Reynolds-Averaged Navier Stokes (RANS) equations and a turbulence model. The RANS methodology has well-documented shortcomings in the modeling of separated or bluff body wake flows that are characterized by unsteady vortex shedding. The resulting turbulence statistics are strongly influenced by the detailed structure and dynamics of the large eddies, which are poorly captured using RANS models (Rodi 1997; Krishnan et al. 2004). The Large Eddy Simulation (LES) methodology offers the potential to more accurately simulate these flows as it resolves the large-scale unsteady motions and entails modeling of only the smallest-scale turbulence structures. Commercial computational fluid dynamics products are beginning to offer LES capability, allowing practicing engineers an opportunity to apply this turbulence modeling technique to a much wider array of problems than in dedicated research codes. Here, we present a preliminary evaluation of the LES capability in the commercial CFD solver StarCD by simulating the flow around a cylinder at a Reynolds number based on the cylinder diameter, D, of 3900 using the constant coefficient Smagorinsky LES model. The results are compared to both the experimental and computational results provided in Kravchenko & Moin (2000). We find that StarCD provides predictions of lift and drag coefficients that are within 15% of the experimental values. Reasonable agreement is obtained between the time-averaged velocity statistics and the published data. The differences in these metrics may be due to the use of a truncated domain in the spanwise direction and the short time-averaging period used for the statistics presented here. The instantaneous flow field visualizations show a coarser, larger-scale structure than the study of Kravchenko & Moin (2000), which may be a product of the LES implementation or of the domain and resolution used.
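
    For reference, the constant-coefficient Smagorinsky model used in this study closes the subgrid-scale stresses with an eddy viscosity built from the resolved strain-rate tensor (standard textbook definitions, not values taken from the report):

        \nu_t = (C_s \Delta)^2 \, |\bar{S}|, \qquad
        |\bar{S}| = \sqrt{2\, \bar{S}_{ij} \bar{S}_{ij}}, \qquad
        \bar{S}_{ij} = \frac{1}{2} \left( \frac{\partial \bar{u}_i}{\partial x_j}
                       + \frac{\partial \bar{u}_j}{\partial x_i} \right)

    where \Delta is the filter width (typically the local grid scale) and C_s is the model constant.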

  18. Content Independence in Multimedia Databases.

    ERIC Educational Resources Information Center

    de Vries, Arjen P.

    2001-01-01

    Investigates the role of data management in multimedia digital libraries, and its implications for the design of database management systems. Introduces the notions of content abstraction and content independence. Proposes a blueprint of a new class of database technology, which supports the basic functionality for the management of both content…

  19. Codes with special correlation.

    NASA Technical Reports Server (NTRS)

    Baumert, L. D.

    1964-01-01

    Uniform binary codes with special correlation, including transorthogonality and simplex codes, Hadamard matrices, and difference sets.

  20. CONTAIN independent peer review

    SciTech Connect

    Boyack, B.E.; Corradini, M.L.; Denning, R.S.; Khatib-Rahbar, M.; Loyalka, S.K.; Smith, P.N.

    1995-01-01

    The CONTAIN code was developed by Sandia National Laboratories under the sponsorship of the US Nuclear Regulatory Commission (NRC) to provide integrated analyses of containment phenomena. It is used to predict nuclear reactor containment loads, radiological source terms, and associated physical phenomena for a range of accident conditions encompassing both design-basis and severe accidents. The code's targeted applications include support for containment-related experimental programs, light water and advanced light water reactor plant analysis, and analytical support for resolution of specific technical issues such as direct containment heating. The NRC decided that a broad technical review of the code should be performed by technical experts to determine its overall technical adequacy. For this purpose, a six-member CONTAIN Peer Review Committee was organized and a peer review was conducted. While the review was in progress, the NRC issued a draft "Revised Severe Accident Code Strategy" that incorporated revised design objectives and targeted applications for the CONTAIN code. The committee continued its effort to develop findings relative to the original NRC statement of design objectives and targeted applications. However, the revised CONTAIN design objectives and targeted applications were considered by the Committee in assigning priorities to the Committee's recommendations. The Committee determined some improvements are warranted and provided recommendations in five code-related areas: (1) documentation, (2) user guidance, (3) modeling capability, (4) code assessment, and (5) technical assessment.

  1. Coding Long Contour Shapes of Binary Objects

    NASA Astrophysics Data System (ADS)

    Sánchez-Cruz, Hermilo; Rodríguez-Díaz, Mario A.

    This is an extension of the paper that appeared in [15]. This time, we compare four methods: arithmetic coding applied to the 3OT chain code (Arith-3OT), arithmetic coding applied to the DFCCE chain code (Arith-DFCCE), and Huffman coding applied to the DFCCE chain code (Huff-DFCCE); to measure the efficiency of the chain codes, we compare these methods with JBIG, which constitutes an international standard. In the search for a suitable and better representation of contour shapes, our tests suggest that 3OT is a sound method to represent contour shapes, because arithmetic coding applied to it gives the best results relative to JBIG, independently of the perimeter of the contour shapes.
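
    As background on the entropy-coding step, a Huffman table for a chain-code symbol stream can be built as below. This is a sketch of Huffman coding in general, not the authors' Huff-DFCCE implementation; the toy alphabet {0, 1, 2} merely mimics a 3OT-like stream.

        import heapq
        from collections import Counter

        def huffman_code(symbols):
            """Build a Huffman code table for a symbol stream."""
            freq = Counter(symbols)
            # Heap entries: [weight, tie-break id, {symbol: codeword}].
            heap = [[n, i, {s: ""}] for i, (s, n) in enumerate(freq.items())]
            heapq.heapify(heap)
            next_id = len(heap)
            while len(heap) > 1:
                lo = heapq.heappop(heap)
                hi = heapq.heappop(heap)
                merged = {s: "0" + c for s, c in lo[2].items()}
                merged.update({s: "1" + c for s, c in hi[2].items()})
                heapq.heappush(heap, [lo[0] + hi[0], next_id, merged])
                next_id += 1
            return heap[0][2]

        chain = "0001020010"                      # toy 3OT-like symbol stream
        table = huffman_code(chain)
        bits = "".join(table[s] for s in chain)   # encoded bitstring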

  2. On the error probability of general tree and trellis codes with applications to sequential decoding

    NASA Technical Reports Server (NTRS)

    Johannesson, R.

    1973-01-01

    An upper bound on the average error probability for maximum-likelihood decoding of the ensemble of random binary tree codes is derived and shown to be independent of the length of the tree. An upper bound on the average error probability for maximum-likelihood decoding of the ensemble of random L-branch binary trellis codes of rate R = 1/n is derived which separates the effects of the tail length T and the memory length M of the code. It is shown that the bound is independent of the length L of the information sequence. This implication is investigated by computer simulations of sequential decoding utilizing the stack algorithm. These simulations confirm the implication and further suggest an empirical formula for the true undetected decoding error probability with sequential decoding.

  3. Heuristic dynamic complexity coding

    NASA Astrophysics Data System (ADS)

    Škorupa, Jozef; Slowack, Jürgen; Mys, Stefaan; Lambert, Peter; Van de Walle, Rik

    2008-04-01

    Distributed video coding is a new video coding paradigm that shifts the computational intensive motion estimation from encoder to decoder. This results in a lightweight encoder and a complex decoder, as opposed to the predictive video coding scheme (e.g., MPEG-X and H.26X) with a complex encoder and a lightweight decoder. Both schemas, however, do not have the ability to adapt to varying complexity constraints imposed by encoder and decoder, which is an essential ability for applications targeting a wide range of devices with different complexity constraints or applications with temporary variable complexity constraints. Moreover, the effect of complexity adaptation on the overall compression performance is of great importance and has not yet been investigated. To address this need, we have developed a video coding system with the possibility to adapt itself to complexity constraints by dynamically sharing the motion estimation computations between both components. On this system we have studied the effect of the complexity distribution on the compression performance. This paper describes how motion estimation can be shared using heuristic dynamic complexity and how distribution of complexity affects the overall compression performance of the system. The results show that the complexity can indeed be shared between encoder and decoder in an efficient way at acceptable rate-distortion performance.

  4. Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code

    SciTech Connect

    Maniaci, D. C.; Li, Y.

    2011-10-01

    This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn, respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. This paper summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30° of yaw.

  5. Investigating the Influence of the Added Mass Effect to Marine Hydrokinetic Horizontal-Axis Turbines Using a General Dynamic Wake Wind Turbine Code: Preprint

    SciTech Connect

    Maniaci, D. C.; Li, Y.

    2012-04-01

    This paper describes a recent study to investigate the applicability of a horizontal-axis wind turbine (HAWT) structural dynamics and unsteady aerodynamics analysis program (FAST and AeroDyn, respectively) to modeling the forces on marine hydrokinetic (MHK) turbines. It summarizes the added mass model that has been added to AeroDyn. The added mass model only includes flow acceleration perpendicular to the rotor disc, and ignores added mass forces caused by blade deflection. A model of the National Renewable Energy Laboratory's (NREL) Unsteady Aerodynamics Experiment (UAE) Phase VI wind turbine was analyzed using FAST and AeroDyn with sea water conditions and the new added mass model. The results of this analysis exhibited a 3.6% change in thrust for a rapid pitch case and a slight change in amplitude and phase of thrust for a case with 30 degrees of yaw.

  6. Production code control system for hydrodynamics simulations

    SciTech Connect

    Slone, D.M.

    1997-08-18

    We describe how the Production Code Control System (PCCS), written in Perl, has been used to control and monitor the execution of a large hydrodynamics simulation code in a production environment. We have been able to integrate new, disparate, and often independent applications into the PCCS framework without the need to modify any of our existing application codes. Both users and code developers see a consistent interface to the simulation code and associated applications regardless of the physical platform, whether an MPP, SMP, server, or desktop workstation. We also describe our use of Perl to develop a configuration management system for the simulation code, as well as a code usage database and report generator. We used Perl to write a backplane that allows us to plug in preprocessors, the hydrocode, postprocessors, visualization tools, persistent storage requests, and other codes. We need only teach PCCS a minimal amount about any new tool or code to essentially plug it in and make it usable to the hydrocode. PCCS has made it easier to link together disparate codes, since using Perl has removed the need to learn the idiosyncrasies of system or RPC programming. The text handling in Perl makes it easy to teach PCCS about new codes, or changes to existing codes.

  7. Longwave infrared (LWIR) coded aperture dispersive spectrometer.

    PubMed

    Fernandez, C; Guenther, B D; Gehm, M E; Brady, D J; Sullivan, M E

    2007-04-30

    We describe a static aperture-coded, dispersive longwave infrared (LWIR) spectrometer that uses a microbolometer array at the detector plane. The two-dimensional aperture code is based on a row-doubled Hadamard mask with transmissive and opaque openings. The independent column code nature of the matrix makes for a mathematically well-defined pattern that spatially and spectrally maps the source information to the detector plane. Post-processing techniques on the data provide spectral estimates of the source. Comparative experimental results between a slit and coded aperture for emission spectroscopy from a CO2 laser are demonstrated. PMID:19532832
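
    One common way to obtain a transmissive/opaque (0/1) mask from a ±1 Hadamard matrix is to split each row into two complementary binary rows. The sketch below follows that generic construction; it is an illustration under that assumption, not necessarily the exact mask layout used by the authors.

        import numpy as np
        from scipy.linalg import hadamard

        def row_doubled_mask(n):
            """0/1 mask from an n x n Hadamard matrix (n a power of 2).

            Each +/-1 row is split into two complementary binary rows, so
            the physical mask needs only transmissive (1) and opaque (0)
            openings and no negative weights.
            """
            H = hadamard(n)
            rows = []
            for h in H:
                rows.append((h > 0).astype(int))   # where the row is +1
                rows.append((h < 0).astype(int))   # complementary openings
            return np.array(rows)                  # shape (2n, n)

        mask = row_doubled_mask(8)                 # 16 x 8 binary code pattern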

  8. Coding in pigeons: Multiple-coding versus single-code/default strategies.

    PubMed

    Pinto, Carlos; Machado, Armando

    2015-05-01

    To investigate the coding strategies that pigeons may use in temporal discrimination tasks, pigeons were trained on a matching-to-sample procedure with three sample durations (2s, 6s and 18s) and two comparisons (red and green hues). One comparison was correct following 2-s samples and the other was correct following both 6-s and 18-s samples. Tests were then run to contrast the predictions of two hypotheses concerning the pigeons' coding strategies, the multiple-coding and the single-code/default. According to the multiple-coding hypothesis, three response rules are acquired, one for each sample. According to the single-code/default hypothesis, only two response rules are acquired, one for the 2-s sample and a "default" rule for any other duration. In retention interval tests, pigeons preferred the "default" key, a result predicted by the single-code/default hypothesis. In no-sample tests, pigeons preferred the key associated with the 2-s sample, a result predicted by multiple-coding. Finally, in generalization tests, when the sample duration equaled 3.5s, the geometric mean of 2s and 6s, pigeons preferred the key associated with the 6-s and 18-s samples, a result predicted by the single-code/default hypothesis. The pattern of results suggests the need for models that take into account multiple sources of stimulus control. PMID:25894105

  9. Investigating the impact of parental status and depression symptoms on the early perceptual coding of infant faces: an event-related potential study.

    PubMed

    Noll, Laura K; Mayes, Linda C; Rutherford, Helena J V

    2012-01-01

    Infant faces are highly salient social stimuli that appear to elicit intuitive parenting behaviors in healthy adult women. Behavioral and observational studies indicate that this effect may be modulated by experiences of reproduction, caregiving, and psychiatric symptomatology that affect normative attention and reward processing of infant cues. However, relatively little is known about the neural correlates of these effects. Using the event-related potential (ERP) technique, this study investigated the impact of parental status (mother, non-mother) and depression symptoms on early visual processing of infant faces in a community sample of adult women. Specifically, the P1 and N170 ERP components elicited in response to infant face stimuli were examined. While characteristics of the N170 were not modulated by parental status, a statistically significant positive correlation was observed between depression symptom severity and N170 amplitude. This relationship was not observed for the P1. These results suggest that depression symptoms may modulate early neurophysiological responsiveness to infant cues, even at sub-clinical levels. PMID:22435403

  10. An Efficient Variable Length Coding Scheme for an IID Source

    NASA Technical Reports Server (NTRS)

    Cheung, K. -M.

    1995-01-01

    A scheme is examined for using two alternating Huffman codes to encode a discrete independent and identically distributed source with a dominant symbol. This combined strategy, or alternating runlength Huffman (ARH) coding, was found to be more efficient than ordinary coding in certain circumstances.
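
    The run-length step underlying such schemes maps a stream with a dominant symbol into (run, symbol) pairs that an entropy coder can then compress. A minimal sketch of that step follows; it illustrates the general idea, not the ARH codes themselves.

        def runlength_dominant(seq, dominant="0"):
            """Map a stream with a dominant symbol into (run, symbol) pairs.

            Each pair is a run of the dominant symbol followed by the next
            non-dominant symbol; the pairs can then be Huffman-coded.
            """
            out, run = [], 0
            for s in seq:
                if s == dominant:
                    run += 1
                else:
                    out.append((run, s))
                    run = 0
            if run:
                out.append((run, None))   # trailing run with no terminator
            return out

        pairs = runlength_dominant("0001000010100000", dominant="0")
        # -> [(3, '1'), (4, '1'), (1, '1'), (5, None)]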

  11. Homological stabilizer codes

    SciTech Connect

    Anderson, Jonas T.

    2013-03-15

    In this paper we define homological stabilizer codes on qubits which encompass codes such as Kitaev's toric code and the topological color codes. These codes are defined solely by the graphs they reside on. This feature allows us to use properties of topological graph theory to determine the graphs which are suitable as homological stabilizer codes. We then show that all toric codes are equivalent to homological stabilizer codes on 4-valent graphs. We show that the topological color codes and toric codes correspond to two distinct classes of graphs. We define the notion of label set equivalencies and show that under a small set of constraints the only homological stabilizer codes without local logical operators are equivalent to Kitaev's toric code or to the topological color codes. Highlights: • We show that Kitaev's toric codes are equivalent to homological stabilizer codes on 4-valent graphs. • We show that toric codes and color codes correspond to homological stabilizer codes on distinct graphs. • We find and classify all 2D homological stabilizer codes. • We find optimal codes among the homological stabilizer codes.

  12. American Independence. Fifth Grade.

    ERIC Educational Resources Information Center

    Crosby, Annette

    This fifth grade teaching unit covers early conflicts between the American colonies and Britain, battles of the American Revolutionary War, and the Declaration of Independence. Knowledge goals address the pre-revolutionary acts enforced by the British, the concepts of conflict and independence, and the major events and significant people from the…

  13. Fostering Musical Independence

    ERIC Educational Resources Information Center

    Shieh, Eric; Allsup, Randall Everett

    2016-01-01

    Musical independence has always been an essential aim of musical instruction. But this objective can refer to everything from high levels of musical expertise to more student choice in the classroom. While most conceptualizations of musical independence emphasize the demonstration of knowledge and skills within particular music traditions, this…

  14. Centering on Independent Study.

    ERIC Educational Resources Information Center

    Miller, Stephanie

    Independent study is an instructional approach that can have enormous power in the classroom. It can be used successfully with students at all ability levels, even though it is often associated with gifted students. Independent study is an opportunity for students to study a subject of their own choosing under the guidance of a teacher. The…

  15. Energy efficient rateless codes for high speed data transfer over free space optical channels

    NASA Astrophysics Data System (ADS)

    Prakash, Geetha; Kulkarni, Muralidhar; Acharya, U. S.

    2015-03-01

    Terrestrial Free Space Optical (FSO) links transmit information by using the atmosphere (free space) as a medium. In this paper, we have investigated the use of Luby Transform (LT) codes as a means to mitigate the effects of data corruption induced by an imperfect channel, which usually takes the form of lost or corrupted packets. LT codes, which are a class of Fountain codes, can be used independently of the channel rate, and as many code words as required can be generated to recover all the message bits irrespective of the channel performance. Achieving error-free high data rates with limited energy resources is possible with FSO systems if error correction codes with minimal power overheads are used. We also employ a combination of Binary Phase Shift Keying (BPSK) with provision for modification of the threshold, and optimized LT codes with belief propagation decoding. These techniques provide additional protection even under strong turbulence regimes. Automatic Repeat Request (ARQ) is another method of improving link reliability; its performance is limited by the number of retransmissions and the corresponding time delay. We show through theoretical computations and simulations that LT codes consume less energy per bit, and we validate the feasibility of using energy-efficient LT codes over ARQ for FSO links in optical wireless sensor networks within eye safety limits.
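
    The LT encoding rule itself is simple: sample a degree d from a degree distribution, pick d distinct source blocks uniformly, and XOR them together. A minimal sketch follows; the toy degree distribution is illustrative only (practical systems use a robust soliton distribution), and blocks are modeled as small integers for brevity.

        import random

        def lt_encode_symbol(blocks, degree_dist, rng=random):
            """Generate one LT-coded packet from a list of source blocks.

            Returns the chosen block indices (which must accompany the
            packet, e.g. via a shared PRNG seed) and the XOR payload.
            """
            degrees, probs = zip(*degree_dist)
            d = rng.choices(degrees, weights=probs, k=1)[0]
            chosen = rng.sample(range(len(blocks)), d)
            payload = 0
            for i in chosen:
                payload ^= blocks[i]
            return chosen, payload

        dist = [(1, 0.1), (2, 0.5), (3, 0.4)]   # toy degree distribution
        source = [0b1010, 0b0110, 0b1100, 0b0011]
        packets = [lt_encode_symbol(source, dist) for _ in range(8)]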

  16. The investigation of tethered satellite system dynamics

    NASA Technical Reports Server (NTRS)

    Lorenzini, E. C.

    1986-01-01

    The analysis of the rotational dynamics of the satellite was focused on the increase of the satellite's rotational amplitude, with respect to the tether, during retrieval. The dependence of the rotational amplitude upon the tether tension variation to the power 1/4 was thoroughly investigated. The damping of rotational oscillations achievable by reel control was also quantified, while an alternative solution that makes use of a lever arm attached with a universal joint to the satellite was proposed. Comparison simulations of retrieval maneuvers between the Smithsonian Astrophysical Observatory and Martin Marietta (MMA) computer codes were also carried out. The agreement between the two completely independent codes was extremely close, demonstrating the reliability of the models. The slack tether dynamics during reel jams was analytically investigated in order to identify the limits of applicability of the SLACK3 computer code to this particular case. Test runs with SLACK3 were also carried out.

  17. Exceptional error minimization in putative primordial genetic codes

    PubMed Central

    2009-01-01

    Background The standard genetic code is redundant and has a highly non-random structure. Codons for the same amino acids typically differ only by the nucleotide in the third position, whereas similar amino acids are encoded, mostly, by codon series that differ by a single base substitution in the third or the first position. As a result, the code is highly albeit not optimally robust to errors of translation, a property that has been interpreted either as a product of selection directed at the minimization of errors or as a non-adaptive by-product of evolution of the code driven by other forces. Results We investigated the error-minimization properties of putative primordial codes that consisted of 16 supercodons, with the third base being completely redundant, using a previously derived cost function and the error minimization percentage as the measure of a code's robustness to mistranslation. It is shown that, when the 16-supercodon table is populated with 10 putative primordial amino acids, inferred from the results of abiotic synthesis experiments and other evidence independent of the code's evolution, and with minimal assumptions used to assign the remaining supercodons, the resulting 2-letter codes are nearly optimal in terms of the error minimization level. Conclusion The results of the computational experiments with putative primordial genetic codes that contained only two meaningful letters in all codons and encoded 10 to 16 amino acids indicate that such codes are likely to have been nearly optimal with respect to the minimization of translation errors. This near-optimality could be the outcome of extensive early selection during the co-evolution of the code with the primordial, error-prone translation system, or a result of a unique, accidental event. Under this hypothesis, the subsequent expansion of the code resulted in a decrease of the error minimization level that became sustainable owing to the evolution of a high-fidelity translation system

  18. Non-White, No More: Effect Coding as an Alternative to Dummy Coding with Implications for Higher Education Researchers

    ERIC Educational Resources Information Center

    Mayhew, Matthew J.; Simonoff, Jeffrey S.

    2015-01-01

    The purpose of this article is to describe effect coding as an alternative quantitative practice for analyzing and interpreting categorical, race-based independent variables in higher education research. Unlike indicator (dummy) codes that imply that one group will be a reference group, effect codes use average responses as a means for…
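
    The contrast is easy to see in a design matrix. In the sketch below (with illustrative data), dummy coding gives the reference group all-zero rows, whereas effect coding gives that group rows of -1, so each coefficient is interpreted against the grand mean rather than against the reference group.

        import pandas as pd

        groups = pd.Series(["White", "Black", "Asian", "White", "Black"])

        # Dummy (indicator) coding: the first category (here "Asian")
        # is dropped and becomes the implicit reference group.
        dummy = pd.get_dummies(groups, drop_first=True).astype(int)

        # Effect coding: same columns, but the reference group's rows
        # are set to -1 instead of 0.
        effect = dummy.copy()
        reference = dummy.sum(axis=1) == 0
        effect.loc[reference, :] = -1

        print(pd.concat({"dummy": dummy, "effect": effect}, axis=1))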

  19. Coding of Neuroinfectious Diseases.

    PubMed

    Barkley, Gregory L

    2015-12-01

    Accurate coding is an important function of neurologic practice. This contribution to Continuum is part of an ongoing series that presents helpful coding information along with examples related to the issue topic. Tips for diagnosis coding, Evaluation and Management coding, procedure coding, or a combination are presented, depending on which is most applicable to the subject area of the issue. PMID:26633789

  20. Model Children's Code.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. American Indian Law Center.

    The Model Children's Code was developed to provide a legally correct model code that American Indian tribes can use to enact children's codes that fulfill their legal, cultural and economic needs. Code sections cover the court system, jurisdiction, juvenile offender procedures, minor-in-need-of-care, and termination. Almost every Code section is…

  1. Memory modulates journey-dependent coding in the rat hippocampus

    PubMed Central

    Ferbinteanu, J.; Shirvalkar, P.; Shapiro, M. L.

    2011-01-01

    Neurons in the rat hippocampus signal current location by firing in restricted areas called place fields. During goal-directed tasks in mazes, place fields can also encode past and future positions through journey-dependent activity, which could guide hippocampus-dependent behavior and underlie other temporally extended memories, such as autobiographical recollections. The relevance of journey-dependent activity for hippocampal-dependent memory, however, is not well understood. To further investigate the relationship between hippocampal journey-dependent activity and memory we compared neural firing in rats performing two mnemonically distinct but behaviorally identical tasks in the plus maze: a hippocampus-dependent spatial navigation task, and a hippocampus-independent cue response task. While place, prospective, and retrospective coding reflected temporally extended behavioral episodes in both tasks, memory strategy altered coding differently before and after the choice point. Before the choice point, when discriminative selection of memory strategy was critical, a switch between the tasks elicited a change in a field’s coding category, so that a field that signaled current location in one task coded pending journeys in the other task. After the choice point, however, when memory strategy became irrelevant, the fields preserved coding categories across tasks, so that the same field consistently signaled either current location or the recent journeys. Additionally, on the start arm firing rates were affected at comparable levels by task and journey, while on the goal arm firing rates predominantly encoded journey. The data demonstrate a direct link between journey-dependent coding and memory, and suggest that episodes are encoded by both population and firing rate coding. PMID:21697365

  2. The Proteomic Code: a molecular recognition code for proteins

    PubMed Central

    Biro, Jan C

    2007-01-01

    Background The Proteomic Code is a set of rules by which information in genetic material is transferred into the physico-chemical properties of amino acids. It determines how individual amino acids interact with each other during folding and in specific protein-protein interactions. The Proteomic Code is part of the redundant Genetic Code. Review The 25-year-old history of this concept is reviewed from the first independent suggestions by Biro and Mekler, through the works of Blalock, Root-Bernstein, Siemion, Miller and others, followed by the discovery of a Common Periodic Table of Codons and Nucleic Acids in 2003 and culminating in the recent conceptualization of partial complementary coding of interacting amino acids as well as the theory of the nucleic acid-assisted protein folding. Methods and conclusions A novel cloning method for the design and production of specific, high-affinity-reacting proteins (SHARP) is presented. This method is based on the concept of proteomic codes and is suitable for large-scale, industrial production of specifically interacting peptides. PMID:17999762

  3. Pulsed Inductive Thruster (PIT): Modeling and Validation Using the MACH2 Code

    NASA Technical Reports Server (NTRS)

    Schneider, Steven (Technical Monitor); Mikellides, Pavlos G.

    2003-01-01

    Numerical modeling of the Pulsed Inductive Thruster with the magnetohydrodynamics code MACH2 aims to provide bilateral validation of the thruster's measured performance and of the code's capability to capture the pertinent physical processes. Computed impulse values for helium and argon propellants demonstrate excellent correlation with the experimental data for a range of energy levels and propellant-mass values. The effects of the vacuum tank wall and the mass-injection scheme were investigated and shown to produce trivial changes in the overall performance. An idealized model for these energy levels and propellants deduces that the energy expended in the internal energy modes and plasma dissipation processes is independent of the propellant type, mass, and energy level.

  4. Minimizing correlation effect using zero cross correlation code in spectral amplitude coding optical code division multiple access

    NASA Astrophysics Data System (ADS)

    Safar, Anuar Mat; Aljunid, Syed Alwee; Arief, Amir Razif; Nordin, Junita; Saad, Naufal

    2012-01-01

    The use of minimal multiple access interference (MAI) in code design is investigated. Applying projection and mapping techniques, a code that has zero cross correlation (ZCC) between users in optical code division multiple access (OCDMA) is presented in this paper. The system is based on an incoherent light source (LED), spectral amplitude coding (SAC), and direct detection at the receiver. Using the power spectral density (PSD) function and a Gaussian approximation, we obtain the signal-to-noise ratio (SNR) and the bit-error rate (BER) to measure the code performance. Comparing with other existing codes, e.g., Hadamard, MFH, and MDW codes, we show that our code performs better at a BER of 10^-9 in terms of the number of simultaneous users. We also compare the theoretical and simulation analyses, and the results are close to one another.
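
    A relation commonly used in the SAC-OCDMA literature maps SNR to BER under the Gaussian approximation as BER = (1/2) erfc(sqrt(SNR/8)); the snippet below is a sketch under that assumption, not the paper's exact analysis.

        from math import erfc, sqrt

        def ber_gaussian_approx(snr):
            """BER from electrical SNR under the Gaussian approximation
            often used in SAC-OCDMA analyses: 0.5 * erfc(sqrt(SNR/8))."""
            return 0.5 * erfc(sqrt(snr / 8.0))

        # A BER of 1e-9 then corresponds to an SNR of roughly 144 (~21.6 dB).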

  5. Data Machine Independence

    Energy Science and Technology Software Center (ESTSC)

    1994-12-30

    Data-machine independence achieved by using four technologies (ASN.1, XDR, SDS, and ZEBRA) has been evaluated by encoding two different applications in each of the above; and their results compared against the standard programming method using C.

  6. Media independent interface

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The work done on the Media Independent Interface (MII) Interface Control Document (ICD) program is described, and recommendations based on it are made. Explanations and rationale for the content of the ICD itself are presented.

  7. AEST: Adaptive Eigenvalue Stability Code

    NASA Astrophysics Data System (ADS)

    Zheng, L.-J.; Kotschenreuther, M.; Waelbroeck, F.; van Dam, J. W.; Berk, H.

    2002-11-01

    An adaptive eigenvalue linear stability code is developed. The aim is, on one hand, to include non-ideal MHD effects in the global MHD stability calculation for both low- and high-n modes and, on the other hand, to resolve the numerical difficulty involving the MHD singularity on rational surfaces at marginal stability. Our code follows parts of the philosophy of DCON by abandoning relaxation methods based on radial finite element expansion in favor of an efficient shooting procedure with adaptive gridding. The δW criterion is replaced by the shooting procedure and a subsequent matrix eigenvalue problem. Since the technique of expanding a general solution into a summation of independent solutions is employed, the rank of the matrices involved is only a few hundred. This makes it easier to solve the eigenvalue problem with non-ideal MHD effects, such as FLR or even full kinetic effects, as well as the plasma rotation effect, taken into account. To include kinetic effects, the approach of solving for the distribution function as a local eigenvalue ω problem as in the GS2 code will be employed in the future. Comparison of the ideal MHD version of the code with DCON, PEST, and GATO will be discussed. The non-ideal MHD version of the code will be employed to study, as an application, the transport barrier physics in tokamak discharges.

  8. Layered Low-Density Generator Matrix Codes for Super High Definition Scalable Video Coding System

    NASA Astrophysics Data System (ADS)

    Tonomura, Yoshihide; Shirai, Daisuke; Nakachi, Takayuki; Fujii, Tatsuya; Kiya, Hitoshi

    In this paper, we introduce layered low-density generator matrix (layered-LDGM) codes for super high definition (SHD) scalable video systems. The layered-LDGM codes maintain the correspondence between layers from the encoder side to the decoder side; the resulting structure supports partial decoding. Furthermore, the proposed layered-LDGM codes create highly efficient forward error correcting (FEC) data by considering the relationships between scalable components, and thereby raise the probability of restoring the important components. Simulations show that the proposed layered-LDGM codes offer better error resiliency than the existing method, which creates FEC data for each scalable component independently. The proposed layered-LDGM codes support partial decoding and raise the probability of restoring the base component. These characteristics are very suitable for scalable video coding systems.

  9. Utilizing sequence intrinsic composition to classify protein-coding and long non-coding transcripts.

    PubMed

    Sun, Liang; Luo, Haitao; Bu, Dechao; Zhao, Guoguang; Yu, Kuntao; Zhang, Changhai; Liu, Yuanning; Chen, Runsheng; Zhao, Yi

    2013-09-01

    It is a challenge to classify protein-coding or non-coding transcripts, especially those re-constructed from high-throughput sequencing data of poorly annotated species. This study developed and evaluated a powerful signature tool, the Coding-Non-Coding Index (CNCI), which profiles adjoining nucleotide triplets to effectively distinguish protein-coding and non-coding sequences independent of known annotations. CNCI is effective for classifying incomplete transcripts and sense-antisense pairs. The implementation of CNCI offered highly accurate classification of transcripts assembled from whole-transcriptome sequencing data in a cross-species manner, demonstrated gene evolutionary divergence between vertebrates and invertebrates, or between plants, and provided a long non-coding RNA catalog of the orangutan. CNCI software is available at http://www.bioinfo.org/software/cnci. PMID:23892401
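
    As a simplified illustration of the kind of composition feature CNCI profiles, the sketch below computes adjoining (non-overlapping) nucleotide-triplet frequencies in one reading frame. It is a stand-in for the feature extraction step only; the actual CNCI scoring is more elaborate.

        from collections import Counter
        from itertools import product

        TRIPLETS = ["".join(t) for t in product("ACGT", repeat=3)]

        def triplet_profile(seq, frame=0):
            """Frequencies of adjoining nucleotide triplets in one frame."""
            seq = seq.upper()
            counts = Counter(seq[i:i + 3]
                             for i in range(frame, len(seq) - 2, 3))
            total = sum(counts[t] for t in TRIPLETS) or 1
            return [counts[t] / total for t in TRIPLETS]

        features = triplet_profile("ATGGCGTACGATTAG")   # 64-dim feature vector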

  10. Bitplane Image Coding With Parallel Coefficient Processing.

    PubMed

    Auli-Llinas, Francesc; Enfedaque, Pablo; Moure, Juan C; Sanchez, Victor

    2016-01-01

    Image coding systems have been traditionally tailored for multiple instruction, multiple data (MIMD) computing. In general, they partition the (transformed) image into codeblocks that can be coded in the cores of MIMD-based processors. Each core executes a sequential flow of instructions to process the coefficients in the codeblock, independently and asynchronously from the other cores. Bitplane coding is a common strategy to code such data, and most of its mechanisms require sequential processing of the coefficients. Recent years have seen the rise of processing accelerators with enhanced computational performance and power efficiency whose architecture is mainly based on the single instruction, multiple data (SIMD) principle. SIMD computing refers to the execution of the same instruction on multiple data in a lockstep, synchronous way. Unfortunately, current bitplane coding strategies cannot fully profit from such processors due to their inherently sequential coding tasks. This paper presents bitplane image coding with parallel coefficient (BPC-PaCo) processing, a coding method that can process many coefficients within a codeblock in parallel and synchronously. To this end, the scanning order, the context formation, the probability model, and the arithmetic coder of the coding engine have been re-formulated. The experimental results suggest that the penalty in coding performance of BPC-PaCo with respect to the traditional strategies is almost negligible. PMID:26441420

  11. Accumulate repeat accumulate codes

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative channel coding scheme called 'Accumulate Repeat Accumulate' (ARA) codes. This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes, so belief propagation can be used for iterative decoding of ARA codes on a graph. The encoder structure for this class can be viewed as a precoded Repeat Accumulate (RA) code or as a precoded Irregular Repeat Accumulate (IRA) code, where an accumulator is simply chosen as the precoder. Thus ARA codes have a simple and very fast encoder structure when they represent LDPC codes. Based on density evolution for LDPC codes, we show through some examples of ARA codes that for a maximum variable node degree of 5, a minimum bit SNR as low as 0.08 dB from channel capacity for rate 1/2 can be achieved as the block size goes to infinity. Thus, for a fixed low maximum variable node degree, its threshold outperforms not only the RA and IRA codes but also the best known LDPC codes with the same maximum node degree. Furthermore, by puncturing the accumulators, any desired high-rate code close to rate 1 can be obtained, with thresholds that stay uniformly close to the channel capacity thresholds. Iterative decoding simulation results are provided. The ARA codes also have a projected graph or protograph representation that allows for high-speed decoder implementation.
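
    The accumulator at the heart of RA/IRA/ARA encoders is just a running mod-2 sum (the rate-1 convolutional code 1/(1+D)). A minimal sketch of the ARA encoding chain follows, with the interleaver omitted for brevity:

        def accumulate(bits):
            """Rate-1 accumulator: y[i] = x[i] XOR y[i-1], i.e. 1/(1+D)."""
            out, acc = [], 0
            for b in bits:
                acc ^= b
                out.append(acc)
            return out

        def repeat(bits, q):
            """Repeat each bit q times."""
            return [b for b in bits for _ in range(q)]

        # ARA flavour: precode with an accumulator, repeat,
        # (interleave,) then accumulate again.
        u = [1, 0, 1, 1]
        codeword = accumulate(repeat(accumulate(u), 3))   # interleaver omitted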

  12. Random coding strategies for minimum entropy

    NASA Technical Reports Server (NTRS)

    Posner, E. C.

    1975-01-01

    This paper proves that there exists a fixed random coding strategy for block coding a memoryless information source to achieve the absolute epsilon entropy of the source. That is, the strategy can be chosen independent of the block length. The principal new tool is an easy result on the semicontinuity of the relative entropy functional of one probability distribution with respect to another. The theorem generalizes a result from rate-distortion theory to the 'zero-infinity' case.
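
    The functional in question is the relative entropy (Kullback-Leibler divergence) of one distribution with respect to another; in the discrete case,

        D(P \,\|\, Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)},

    a quantity known to be lower semicontinuous in the pair (P, Q), which is the flavor of semicontinuity result the paper relies on.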

  13. Concatenated Coding Using Trellis-Coded Modulation

    NASA Technical Reports Server (NTRS)

    Thompson, Michael W.

    1997-01-01

    In the late seventies and early eighties a technique known as Trellis Coded Modulation (TCM) was developed for providing spectrally efficient error correction coding. Instead of adding redundant information in the form of parity bits, redundancy is added at the modulation stage, thereby increasing bandwidth efficiency. A digital communications system can be designed to use bandwidth-efficient multilevel/phase modulation such as Amplitude Shift Keying (ASK), Phase Shift Keying (PSK), Differential Phase Shift Keying (DPSK) or Quadrature Amplitude Modulation (QAM). Performance gain can be achieved by increasing the number of signals over the corresponding uncoded system to compensate for the redundancy introduced by the code. A considerable amount of research and development has been devoted toward developing good TCM codes for severely bandlimited applications. More recently, the use of TCM for satellite and deep space communications applications has received increased attention. This report describes the general approach of using a concatenated coding scheme that features TCM and RS coding. Results have indicated that substantial (6-10 dB) performance gains can be achieved with this approach with comparatively little bandwidth expansion. Since all of the bandwidth expansion is due to the RS code, TCM-based concatenated coding results in roughly 10-50% bandwidth expansion, compared to 70-150% expansion for a similar concatenated scheme that uses a convolutional code. We stress that combined coding and modulation optimization is important for achieving performance gains while maintaining spectral efficiency.

  14. Coset Codes Viewed as Terminated Convolutional Codes

    NASA Technical Reports Server (NTRS)

    Fossorier, Marc P. C.; Lin, Shu

    1996-01-01

    In this paper, coset codes are considered as terminated convolutional codes. Based on this approach, three new general results are presented. First, it is shown that the iterative squaring construction can equivalently be defined from a convolutional code whose trellis terminates. This convolutional code determines a simple encoder for the coset code considered, and the state and branch labelings of the associated trellis diagram become straightforward. Also, from the generator matrix of the code in its convolutional code form, much information about the trade-off between the state connectivity and complexity at each section, and the parallel structure of the trellis, is directly available. Based on this generator matrix, it is shown that the parallel branches in the trellis diagram of the convolutional code represent the same coset code C(sub 1), of smaller dimension and shorter length. Utilizing this fact, a two-stage optimum trellis decoding method is devised. The first stage decodes C(sub 1), while the second stage decodes the associated convolutional code, using the branch metrics delivered by stage 1. Finally, a bidirectional decoding of each received block starting at both ends is presented. If about the same number of computations is required, this approach remains very attractive from a practical point of view as it roughly doubles the decoding speed. This fact is particularly interesting whenever the second half of the trellis is the mirror image of the first half, since the same decoder can be implemented for both parts.

  15. Coding Strategies and Implementations of Compressive Sensing

    NASA Astrophysics Data System (ADS)

    Tsai, Tsung-Han

    information from a noisy environment. Using engineering efforts to accomplish the same task usually requires multiple detectors, advanced computational algorithms, or artificial intelligence systems. Compressive acoustic sensing incorporates acoustic metamaterials in compressive sensing theory to emulate the abilities of sound localization and selective attention. This research investigates and optimizes the sensing capacity and the spatial sensitivity of the acoustic sensor. The well-modeled acoustic sensor allows localizing multiple speakers in both stationary and dynamic auditory scenes, and distinguishing mixed conversations from independent sources with a high audio recognition rate.

  16. Discussion on LDPC Codes and Uplink Coding

    NASA Technical Reports Server (NTRS)

    Andrews, Ken; Divsalar, Dariush; Dolinar, Sam; Moision, Bruce; Hamkins, Jon; Pollara, Fabrizio

    2007-01-01

    This slide presentation reviews the progress of the workgroup on Low-Density Parity-Check (LDPC) codes for space link coding. The workgroup is tasked with developing and recommending new error correcting codes for near-Earth, Lunar, and deep space applications. Included in the presentation is a summary of the technical progress of the workgroup. Charts that show the LDPC decoder sensitivity to symbol scaling errors are reviewed, as well as a chart showing the performance of several frame synchronizer algorithms compared to that of some good codes and LDPC decoder tests at ESTL. Also reviewed is a study on Coding, Modulation, and Link Protocol (CMLP), and the recommended codes. A design for the Pseudo-Randomizer with LDPC Decoder and CRC is also reviewed. A chart that summarizes the three proposed coding systems is also presented.

  17. Transionospheric Propagation Code (TIPC)

    SciTech Connect

    Roussel-Dupre, R.; Kelley, T.A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Lab to perform certain tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in Fortran 77, runs interactively and was designed to be as machine independent as possible. A menu format in which the user is prompted to supply appropriate parameters for a given task has been adopted for the input while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, DTOA study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise or spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of delta-times-of-arrival (DTOAs) vs TECs for a specified pair of receivers.

  18. Transionospheric Propagation Code (TIPC)

    NASA Astrophysics Data System (ADS)

    Roussel-Dupre, Robert; Kelley, Thomas A.

    1990-10-01

    The Transionospheric Propagation Code is a computer program developed at Los Alamos National Lab to perform certain tasks related to the detection of VHF signals following propagation through the ionosphere. The code is written in FORTRAN 77, runs interactively and was designed to be as machine independent as possible. A menu format in which the user is prompted to supply appropriate parameters for a given task has been adopted for the input while the output is primarily in the form of graphics. The user has the option of selecting from five basic tasks, namely transionospheric propagation, signal filtering, signal processing, delta times of arrival (DTOA) study, and DTOA uncertainty study. For the first task a specified signal is convolved against the impulse response function of the ionosphere to obtain the transionospheric signal. The user is given a choice of four analytic forms for the input pulse or of supplying a tabular form. The option of adding Gaussian-distributed white noise or spectral noise to the input signal is also provided. The deterministic ionosphere is characterized to first order in terms of a total electron content (TEC) along the propagation path. In addition, a scattering model parameterized in terms of a frequency coherence bandwidth is also available. In the second task, detection is simulated by convolving a given filter response against the transionospheric signal. The user is given a choice of a wideband filter or a narrowband Gaussian filter. It is also possible to input a filter response. The third task provides for quadrature detection, envelope detection, and three different techniques for time-tagging the arrival of the transionospheric signal at specified receivers. The latter algorithms can be used to determine a TEC and thus take out the effects of the ionosphere to first order. Task four allows the user to construct a table of DTOAs vs TECs for a specified pair of receivers.
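
    The detection step described in both entries is a convolution of the received signal with a filter impulse response. A minimal numpy sketch follows; all waveforms and parameters here are illustrative assumptions, not values from TIPC.

        import numpy as np

        fs = 1.0e8                                  # sample rate, Hz (assumed)
        t = np.arange(0, 5e-6, 1 / fs)
        # Toy received envelope: a Gaussian pulse standing in for a
        # transionospheric VHF signal.
        signal = np.exp(-((t - 2e-6) ** 2) / (2 * (2e-7) ** 2))
        h = np.ones(32) / 32                        # crude wideband filter stand-in
        detected = np.convolve(signal, h, mode="same")
        envelope = np.abs(detected)                 # envelope detection (real toy case)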

  19. Manually operated coded switch

    DOEpatents

    Barnette, Jon H.

    1978-01-01

    The disclosure relates to a manually operated recodable coded switch in which a code may be inserted, tried and used to actuate a lever controlling an external device. After attempting a code, the switch's code wheels must be returned to their zero positions before another try is made.

  20. Independent NOAA considered

    NASA Astrophysics Data System (ADS)

    Richman, Barbara T.

    A proposal to pull the National Oceanic and Atmospheric Administration (NOAA) out of the Department of Commerce and make it an independent agency was the subject of a recent congressional hearing. Supporters within the science community and in Congress said that an independent NOAA will benefit by being more visible and by not being tied to a cabinet-level department whose main concerns lie elsewhere. The proposal's critics, however, cautioned that making NOAA independent could make it even more vulnerable to the budget axe and would sever the agency's direct access to the President.The separation of NOAA from Commerce was contained in a June 1 proposal by President Ronald Reagan that also called for all federal trade functions under the Department of Commerce to be reorganized into a new Department of International Trade and Industry (DITI).

  1. Independent technical review, handbook

    SciTech Connect

    Not Available

    1994-02-01

    Purpose: Provide an independent engineering review of the major projects being funded by the Department of Energy, Office of Environmental Restoration and Waste Management. The independent engineering review will address whether the engineering practice is sufficiently developed to the point where a major project can be executed without significant technical problems. The independent review will focus on questions related to: (1) adequacy of development of the technical base of understanding; (2) status of development and availability of technology among the various alternatives; (3) status and availability of the industrial infrastructure to support project design, equipment fabrication, facility construction, and process and program/project operation; (4) adequacy of the design effort to provide a sound foundation to support execution of the project; (5) ability of the organization to fully integrate the system, and to direct, manage, and control the execution of a complex major project.

  2. NERO- a post-maximum supernova radiation transport code

    NASA Astrophysics Data System (ADS)

    Maurer, I.; Jerkstrand, A.; Mazzali, P. A.; Taubenberger, S.; Hachinger, S.; Kromer, M.; Sim, S.; Hillebrandt, W.

    2011-12-01

    The interpretation of supernova (SN) spectra is essential for deriving SN ejecta properties such as density and composition, which in turn can tell us about their progenitors and the explosion mechanism. A very large number of atomic processes are important for spectrum formation. Several tools for calculating SN spectra exist, but they mainly focus on the very early or late epochs. The intermediate phase, which requires a non-local thermodynamic equilibrium (NLTE) treatment of radiation transport has rarely been studied. In this paper, we present a new SN radiation transport code, NERO, which can look at those epochs. All the atomic processes are treated in full NLTE, under a steady-state assumption. This is a valid approach between roughly 50 and 500 days after the explosion depending on SN type. This covers the post-maximum photospheric and the early and the intermediate nebular phase. As a test, we compare NERO to the radiation transport code of Jerkstrand, Fransson & Kozma and to the nebular code of Mazzali et al. All three codes have been developed independently and a comparison provides a valuable opportunity to investigate their reliability. Currently, NERO is one-dimensional and can be used for predicting spectra of synthetic explosion models or for deriving SN properties by spectral modelling. To demonstrate this, we study the spectra of the 'normal' Type Ia supernova (SN Ia) 2005cf between 50 and 350 days after the explosion and identify most of the common SN Ia line features at post-maximum epochs.

  3. Code-Switching: L1-Coded Mediation in a Kindergarten Foreign Language Classroom

    ERIC Educational Resources Information Center

    Lin, Zheng

    2012-01-01

    This paper is based on a qualitative inquiry that investigated the role of teachers' mediation in three different modes of coding in a kindergarten foreign language classroom in China (i.e. L2-coded intralinguistic mediation, L1-coded cross-lingual mediation, and L2-and-L1-mixed mediation). Through an exploratory examination of the varying effects…

  4. Algebraic geometric codes

    NASA Technical Reports Server (NTRS)

    Shahshahani, M.

    1991-01-01

    The performance characteristics are discussed of certain algebraic geometric codes. Algebraic geometric codes have good minimum distance properties. On many channels they outperform other comparable block codes; therefore, one would expect them eventually to replace some of the block codes used in communications systems. It is suggested that it is unlikely that they will become useful substitutes for the Reed-Solomon codes used by the Deep Space Network in the near future. However, they may be applicable to systems where the signal to noise ratio is sufficiently high so that block codes would be more suitable than convolutional or concatenated codes.

  5. Supporting independent inventors

    SciTech Connect

    Bernard, M.J. III; Whalley, P.; Loyola Univ., Chicago, IL . Dept. of Sociology)

    1989-01-01

    Independent inventors contribute products to the marketplace despite the well-financed brain trusts at corporate, university, and federal R and D laboratories. But given the environment in which the basement/garage inventor labors, transferring a worthwhile invention into a commercial product is quite difficult. There is a growing effort by many state and local agencies and organizations to improve the inventor's working environment and to begin to routinize the process of developing the ideas and inventions of independent inventors into commercial products. 4 refs.

  6. ARA type protograph codes

    NASA Technical Reports Server (NTRS)

    Divsalar, Dariush (Inventor); Abbasfar, Aliazam (Inventor); Jones, Christopher R. (Inventor); Dolinar, Samuel J. (Inventor); Thorpe, Jeremy C. (Inventor); Andrews, Kenneth S. (Inventor); Yao, Kung (Inventor)

    2008-01-01

    An apparatus and method for encoding low-density parity check codes. Together with a repeater, an interleaver and an accumulator, the apparatus comprises a precoder, thus forming accumulate-repeat-accumulate (ARA) codes. Protographs representing various types of ARA codes, including AR3A, AR4A and ARJA codes, are described. High performance is obtained when compared to the performance of current repeat-accumulate (RA) or irregular-repeat-accumulate (IRA) codes.

  7. QR Codes 101

    ERIC Educational Resources Information Center

    Crompton, Helen; LaFrance, Jason; van 't Hooft, Mark

    2012-01-01

    A QR (quick-response) code is a two-dimensional scannable code, similar in function to a traditional bar code that one might find on a product at the supermarket. The main difference between the two is that, while a traditional bar code can hold a maximum of only 20 digits, a QR code can hold up to 7,089 characters, so it can contain much more…
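
    A hedged illustration of the capacity difference: the snippet below generates a QR code in Python. It assumes the third-party "qrcode" package (installable as qrcode[pil]); the URL is invented for the example.

      # Sketch: encode a payload far longer than a bar code's ~20 digits.
      import qrcode

      data = "https://www.example.org/lessons/qr-codes-101?section=intro"
      img = qrcode.make(data)        # build the 2-D symbol as a PIL image
      img.save("qr_example.png")     # the result scans from any direction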

  8. Groundwater flow code verification "benchmarking" activity (COVE-2A): Analysis of participants' work

    SciTech Connect

    Dykhuizen, R.C.; Barnard, R.W.

    1992-02-01

    The Nuclear Waste Repository Technology Department at Sandia National Laboratories (SNL) is investigating the suitability of Yucca Mountain as a potential site for underground burial of nuclear wastes. One element of the investigations is to assess the potential long-term effects of groundwater flow on the integrity of a potential repository. A number of computer codes are being used to model groundwater flow through geologic media in which the potential repository would be located. These codes compute numerical solutions for problems that are usually analytically intractable. Consequently, independent confirmation of the correctness of the solution is often not possible. Code verification is a process that permits the determination of the numerical accuracy of codes by comparing the results of several numerical solutions for the same problem. The international nuclear waste research community uses benchmarking for intercomparisons that partially satisfy the Nuclear Regulatory Commission (NRC) definition of code verification. This report presents the results from the COVE-2A (Code Verification) project, which is a subset of the COVE project.

  9. Certifying Auto-Generated Flight Code

    NASA Technical Reports Server (NTRS)

    Denney, Ewen

    2008-01-01

    Model-based design and automated code generation are being used increasingly at NASA. Many NASA projects now use MathWorks Simulink and Real-Time Workshop for at least some of their modeling and code development. However, there are substantial obstacles to more widespread adoption of code generators in safety-critical domains. Since code generators are typically not qualified, there is no guarantee that their output is correct, and consequently the generated code still needs to be fully tested and certified. Moreover, the regeneration of code can require complete recertification, which offsets many of the advantages of using a generator. Indeed, manual review of autocode can be more challenging than for hand-written code. Since the direct V&V of code generators is too laborious and complicated due to their complex (and often proprietary) nature, we have developed a generator plug-in to support the certification of the auto-generated code. Specifically, the AutoCert tool supports certification by formally verifying that the generated code is free of different safety violations, by constructing an independently verifiable certificate, and by explaining its analysis in a textual form suitable for code reviews. The generated documentation also contains substantial tracing information, allowing users to trace between model, code, documentation, and V&V artifacts. This enables missions to obtain assurance about the safety and reliability of the code without excessive manual V&V effort and, as a consequence, eases the acceptance of code generators in safety-critical contexts. The generation of explicit certificates and textual reports is particularly well-suited to supporting independent V&V. The primary contribution of this approach is the combination of human-friendly documentation with formal analysis. The key technical idea is to exploit the idiomatic nature of auto-generated code in order to automatically infer logical annotations. The annotation inference algorithm…

  10. Postcard from Independence, Mo.

    ERIC Educational Resources Information Center

    Archer, Jeff

    2004-01-01

    This article reports results showing that the Independence, Missori school district failed to meet almost every one of its improvement goals under the No Child Left Behind Act. The state accreditation system stresses improvement over past scores, while the federal law demands specified amounts of annual progress toward the ultimate goal of 100…

  11. Touchstones of Independence.

    ERIC Educational Resources Information Center

    Roha, Thomas Arden

    1999-01-01

    Foundations affiliated with public higher education institutions can avoid having to open records for public scrutiny, by having independent boards of directors, occupying leased office space or paying market value for university space, using only foundation personnel, retaining legal counsel, being forthcoming with information and use of public…

  12. Independent Human Studies.

    ERIC Educational Resources Information Center

    Kaplan, Suzanne; Wilson, Gordon

    1978-01-01

    The Independent Human Studies program at Schoolcraft College offers an alternative method of earning academic credits. Students delineate an area of study, pose research questions, gather resources, synthesize the information, state the thesis, choose the method of presentation, set schedules, and take responsibility for meeting deadlines. (MB)

  13. Independence and Survival.

    ERIC Educational Resources Information Center

    James, H. Thomas

    Independent schools that are of viable size, well managed, and strategically located to meet competition will survive and prosper past the current financial crisis. We live in a complex technological society with insatiable demands for knowledgeable people to keep it running. The future will be marked by the orderly selection of qualified people,…

  14. Caring about Independent Lives

    ERIC Educational Resources Information Center

    Christensen, Karen

    2010-01-01

    With the rhetoric of independence, new cash for care systems were introduced in many developed welfare states at the end of the 20th century. These systems allow local authorities to pay people who are eligible for community care services directly, to enable them to employ their own careworkers. Despite the obvious importance of the careworker's…

  15. Independence, Disengagement, and Discipline

    ERIC Educational Resources Information Center

    Rubin, Ron

    2012-01-01

    School disengagement is linked to a lack of opportunities for students to fulfill their needs for independence and self-determination. Young people have little say about what, when, where, and how they will learn, the criteria used to assess their success, and the content of school and classroom rules. Traditional behavior management discourages…

  16. Molecular cloning of canine co-chaperone small glutamine-rich tetratricopeptide repeat-containing protein α (SGTA) and investigation of its ability to suppress androgen receptor signalling in androgen-independent prostate cancer.

    PubMed

    Kato, Yuiko; Ochiai, Kazuhiko; Michishita, Masaki; Azakami, Daigo; Nakahira, Rei; Morimatsu, Masami; Ishiguro-Oonuma, Toshina; Yoshikawa, Yasunaga; Kobayashi, Masato; Bonkobara, Makoto; Kobayashi, Masanori; Takahashi, Kimimasa; Watanabe, Masami; Omi, Toshinori

    2015-11-01

    Although the morbidity of canine prostate cancer is low, the majority of cases present with resistance to androgen therapy and poor clinical outcomes. These pathological conditions are similar to the signs of the terminal stage of human androgen-independent prostate cancer. The co-chaperone small glutamine-rich tetratricopeptide repeat-containing protein α (SGTA) is known to be overexpressed in human androgen-independent prostate cancer. However, there is little information about the structure and function of canine SGTA. In this study, canine SGTA was cloned and analysed for its ability to suppress androgen receptor signalling. The full-length open reading frame (ORF) of the canine SGTA gene was amplified by RT-PCR using primers designed from canine-expressed sequence tags that were homologous to human SGTA. The canine SGTA ORF has high homology with the corresponding human (89%) and mouse (81%) sequences. SGTA dimerisation region and tetratricopeptide repeat (TPR) domains are conserved across the three species. The ability of canine SGTA to undergo homodimerisation was demonstrated by a mammalian two-hybrid system and a pull-down assay. The negative impact of canine SGTA on androgen receptor (AR) signalling was demonstrated using a reporter assay in androgen-independent human prostate cancer cell lines. Pathological analysis showed overexpression of SGTA in canine prostate cancer, but not in hyperplasia. A reporter assay in prostate cells demonstrated suppression of AR signalling by canine SGTA. Altogether, these results suggest that canine SGTA may play an important role in the acquisition of androgen independence by canine prostate cancer cells. PMID:26346258

  17. Reusable State Machine Code Generator

    NASA Astrophysics Data System (ADS)

    Hoffstadt, A. A.; Reyes, C.; Sommer, H.; Andolfato, L.

    2010-12-01

    The State Machine model is frequently used to represent the behaviour of a system, allowing one to express and execute this behaviour in a deterministic way. A graphical representation such as a UML State Chart diagram tames the complexity of the system, thus facilitating changes to the model and communication between developers and domain experts. We present a reusable state machine code generator, developed by the Universidad Técnica Federico Santa María and the European Southern Observatory. The generator itself is based on the open source project architecture, and uses UML State Chart models as input. This allows for a modular design and a clean separation between generator and generated code. The generated state machine code has well-defined interfaces that are independent of the implementation artefacts such as the middle-ware. This allows using the generator in the substantially different observatory software of the Atacama Large Millimeter Array and the ESO Very Large Telescope. A project-specific mapping layer for event and transition notification connects the state machine code to its environment, which can be the Common Software of these projects, or any other project. This approach even allows tests for a generated state machine to be created automatically, using techniques from software testing, such as path-coverage.

  18. Coded continuous wave meteor radar

    NASA Astrophysics Data System (ADS)

    Vierinen, Juha; Chau, Jorge L.; Pfeffer, Nico; Clahsen, Matthias; Stober, Gunter

    2016-03-01

    The concept of a coded continuous wave specular meteor radar (SMR) is described. The radar uses a continuously transmitted pseudorandom phase-modulated waveform, which has several advantages compared to conventional pulsed SMRs. The coding avoids range and Doppler aliasing, which are in some cases problematic with pulsed radars. Continuous transmissions maximize pulse compression gain, allowing operation at lower peak power than a pulsed system. With continuous coding, the temporal and spectral resolution are not dependent on the transmit waveform and they can be fairly flexibly changed after performing a measurement. The low signal-to-noise ratio before pulse compression, combined with independent pseudorandom transmit waveforms, allows multiple geographically separated transmitters to be used in the same frequency band simultaneously without significantly interfering with each other. Because the same frequency band can be used by multiple transmitters, the same interferometric receiver antennas can be used to receive multiple transmitters at the same time. The principles of the signal processing are discussed, along with several practical ways to increase computation speed and to optimally detect meteor echoes. Measurements from a campaign performed with a coded continuous wave SMR are shown and compared with two standard pulsed SMR measurements. The type of meteor radar described in this paper would be suited for use in a large-scale multi-static network of meteor radar transmitters and receivers. Such a system would be useful for increasing the number of meteor detections to obtain improved meteor radar data products.
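
    The pulse-compression idea at the heart of this design can be sketched compactly. The snippet below is not the authors' code; it simulates a weak echo of a pseudorandom binary phase code buried in noise and recovers the delay by correlation, using only numpy. All sizes and amplitudes are invented.

      import numpy as np

      rng = np.random.default_rng(0)
      n = 4096
      code = rng.choice([-1.0, 1.0], size=n)      # pseudorandom BPSK waveform

      true_delay = 137                            # assumed echo delay (samples)
      rx = 0.1 * np.roll(code, true_delay) + rng.normal(size=n)   # SNR << 1

      # Pulse compression: correlate the received signal with the known code.
      xc = [abs(np.dot(rx, np.roll(code, k))) for k in range(n)]
      print("estimated delay:", int(np.argmax(xc)))   # -> 137 (w.h.p.)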

  19. Coding for spread-spectrum communications networks

    NASA Astrophysics Data System (ADS)

    Kim, Bal G.

    1987-03-01

    The multiple-access capability of a frequency-hop packet radio network is investigated from a coding point of view. The achievable region of code rate and channel traffic and the normalized throughput are considered as performance measures. We model the communication system from the modulator input to the demodulator output as an I-user interference channel, and evaluate the asymptotic performance of various coding schemes for channels with perfect side information, no side information, and imperfect side information. The coding schemes being considered are Reed-Solomon codes, concatenated codes, and parallel decoding schemes. We derive the optimal code rate and the optimal channel traffic at which the normalized throughput is maximized, and from these optimum values the asymptotic maximum normalized throughput is derived. The results are then compared with channel capacities.

  20. Decoder for 3-D color codes

    NASA Astrophysics Data System (ADS)

    Hsu, Kung-Chuan; Brun, Todd

    Transversal circuits are important components of fault-tolerant quantum computation. Several classes of quantum error-correcting codes are known to have transversal implementations of any logical Clifford operation. However, to achieve universal quantum computation, it would be helpful to have high-performance error-correcting codes that have a transversal implementation of some logical non-Clifford operation. The 3-D color codes are a class of topological codes that permit transversal implementation of the logical π/8 gate. The decoding problem of a 3-D color code can be understood as a graph-matching problem on a three-dimensional lattice. Whether this class of codes will be useful in terms of performance is still an open question. We investigate the decoding problem of 3-D color codes and analyze the performance of some possible decoders.

  1. Asymmetric quantum convolutional codes

    NASA Astrophysics Data System (ADS)

    La Guardia, Giuliano G.

    2016-01-01

    In this paper, we construct the first families of asymmetric quantum convolutional codes (AQCCs). These new AQCCs are constructed by means of the CSS-type construction applied to suitable families of classical convolutional codes, which are also constructed here. The new codes have non-catastrophic generator matrices, and they have great asymmetry. Since our constructions are performed algebraically, i.e. we develop general algebraic methods and properties to perform the constructions, it is possible to derive several families of such codes and not only codes with specific parameters. Additionally, several different types of such codes are obtained.

  2. The independent medical examination.

    PubMed

    Ameis, Arthur; Zasler, Nathan D

    2002-05-01

    The physiatrist, owing to expertise in impairment and disability analysis, is able to offer the medicolegal process considerable assistance. This chapter describes the scope and process of the independent medical examination (IME) and provides an overview of its component parts. Practical guidelines are provided for performing a physiatric IME of professional standard, and for serving as an impartial, expert witness. Caveats are described regarding testifying and medicolegal ethical issues along with practice management advice. PMID:12122847

  3. Reviewing the Challenges and Opportunities Presented by Code Switching and Mixing in Bangla

    ERIC Educational Resources Information Center

    Hasan, Md. Kamrul; Akhand, Mohd. Moniruzzaman

    2014-01-01

    This paper investigates the issues related to code-switching/code-mixing in an ESL context. Some preliminary data on Bangla-English code-switching/code-mixing has been analyzed in order to determine which structural pattern of code-switching/code-mixing is predominant in different social strata. This study also explores the relationship of…

  4. Reviewing the Challenges and Opportunities Presented by Code Switching and Mixing in Bangla

    ERIC Educational Resources Information Center

    Hasan, Md. Kamrul; Akhand, Mohd. Moniruzzaman

    2015-01-01

    This paper investigates the issues related to code-switching/code-mixing in an ESL context. Some preliminary data on Bangla-English code-switching/code-mixing has been analyzed in order to determine which structural pattern of code-switching/code-mixing is predominant in different social strata. This study also explores the relationship of…

  5. Agent independent task planning

    NASA Technical Reports Server (NTRS)

    Davis, William S.

    1990-01-01

    Agent-Independent Planning is a technique that allows the construction of activity plans without regard to the agent that will perform them. Once generated, a plan is then validated and translated into instructions for a particular agent, whether a robot, crewmember, or software-based control system. Because Space Station Freedom (SSF) is planned for orbital operations for approximately thirty years, it will almost certainly experience numerous enhancements and upgrades, including upgrades in robotic manipulators. Agent-Independent Planning provides the capability to construct plans for SSF operations, independent of specific robotic systems, by combining techniques of object oriented modeling, nonlinear planning and temporal logic. Since a plan is validated using the physical and functional models of a particular agent, new robotic systems can be developed and integrated with existing operations in a robust manner. This technique also provides the capability to generate plans for crewmembers with varying skill levels, and later apply these same plans to more sophisticated robotic manipulators made available by evolutions in technology.

  6. Hardware independence checkout software

    NASA Technical Reports Server (NTRS)

    Cameron, Barry W.; Helbig, H. R.

    1990-01-01

    ACSI has developed a program utilizing CLIPS to assess compliance with various programming standards. Essentially the program parses C code to extract the names of all function calls. These are asserted as CLIPS facts which also include information about line numbers, source file names, and called functions. Rules have been devised to establish functions called that have not been defined in any of the source parsed. These are compared against lists of standards (represented as facts) using rules that check intersections and/or unions of these. By piping the output into other processes the source is appropriately commented by generating and executing parsed scripts.
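
    The extraction-and-comparison idea is easy to illustrate outside CLIPS. The Python sketch below mimics only the first stage: it pulls called function names out of C source with a crude regex and flags calls that are absent from a sanctioned list. The list and the source are invented.

      import re

      SANCTIONED = {"printf", "malloc", "free", "memcpy"}   # stand-in standard
      KEYWORDS = {"if", "for", "while", "switch", "return", "sizeof"}

      def called_functions(c_source):
          # Crude parse: any identifier followed by '(' counts as a call.
          names = set(re.findall(r"\b([A-Za-z_]\w*)\s*\(", c_source))
          return names - KEYWORDS

      src = """
      int main(void) {
          char *p = malloc(16);
          custom_io_write(p, 16);   /* not in the sanctioned list */
          free(p);
          return 0;
      }
      """
      # Drop functions defined in the parsed source itself (here, main).
      print(called_functions(src) - SANCTIONED - {"main"})  # {'custom_io_write'}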

  7. Independence among People with Disabilities: II. Personal Independence Profile.

    ERIC Educational Resources Information Center

    Nosek, Margaret A.; And Others

    1992-01-01

    Developed Personal Independence Profile (PIP) as an instrument to measure aspects of independence beyond physical and cognitive functioning in people with diverse disabilities. PIP was tested for reliability and validity with 185 subjects from 10 independent living centers. Findings suggest that the Personal Independence Profile measures the…

  8. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-01-01

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  9. Cellulases and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2001-02-20

    The present invention provides three fungal cellulases, their coding sequences, recombinant DNA molecules comprising the cellulase coding sequences, recombinant host cells and methods for producing same. The present cellulases are from Orpinomyces PC-2.

  10. Multiple Turbo Codes

    NASA Technical Reports Server (NTRS)

    Divsalar, D.; Pollara, F.

    1995-01-01

    A description is given of multiple turbo codes and a suitable decoder structure derived from an approximation to the maximum a posteriori probability (MAP) decision rule, which is substantially different from the decoder for two-code-based encoders.

  11. QR Code Mania!

    ERIC Educational Resources Information Center

    Shumack, Kellie A.; Reilly, Erin; Chamberlain, Nik

    2013-01-01

    …space, has error-correction capacity, and can be read from any direction. These codes are used in manufacturing, shipping, and marketing, as well as in education. QR codes can be created to produce…

  12. New binary quantum stabilizer codes from the binary extremal self-dual code

    NASA Astrophysics Data System (ADS)

    Wang, WeiLiang; Fan, YangYu; Li, RuiHu

    2015-08-01

    This paper is devoted to constructing binary quantum stabilizer codes from a binary extremal self-dual code via Steane's construction. First, we provide an explicit generator matrix for the unique self-dual code, viewing it as a one-generator quasi-cyclic code, and obtain six optimal self-orthogonal codes with dual distances from 11 to 7 by puncturing the code. Second, a special type of subcode structure for self-orthogonal codes is investigated, and ten derived dual chains are designed. Third, twelve binary quantum codes are constructed from the derived dual pairs within these chains using Steane's construction. Ten of them achieve parameters as good as the best known codes with comparable lengths and dimensions. The two remaining codes are record-breaking in the sense that they improve on the best known codes with the same lengths and dimensions in terms of distance.

  13. STEEP32 computer code

    NASA Technical Reports Server (NTRS)

    Goerke, W. S.

    1972-01-01

    A manual is presented as an aid in using the STEEP32 code. The code is the EXEC 8 version of the STEEP code (STEEP is an acronym for shock two-dimensional Eulerian elastic plastic). The major steps in a STEEP32 run are illustrated in a sample problem. There is a detailed discussion of the internal organization of the code, including a description of each subroutine.

  14. [Evolutionary deviations from the universal genetic code in ciliates].

    PubMed

    Lukashenko, N P

    2009-04-01

    The review surveys the information, including the most recent data, on the evolution of the genetic code in ciliates, which is among the few codes deviating from the universal one. We discuss the cases of recurrent, independently arising deviations from the assignments of standard codons of polypeptide chain termination in the mitochondrial and nuclear genomes of ciliates and some other protozoans. Possible molecular mechanisms are considered that underlie the reassignment of standard termination codons to code for glutamine (codons UAA and UAG) and cysteine or tryptophan (codon UGA) in the nuclear genome. Critical analysis of the main hypotheses on the evolution of secondary deviations from the universal code in ciliates is presented. PMID:19507697

  15. Progress in The Semantic Analysis of Scientific Code

    NASA Technical Reports Server (NTRS)

    Stewart, Mark

    2000-01-01

    This paper concerns a procedure that analyzes aspects of the meaning or semantics of scientific and engineering code. This procedure involves taking a user's existing code, adding semantic declarations for some primitive variables, and parsing this annotated code using multiple, independent expert parsers. These semantic parsers encode domain knowledge and recognize formulae in different disciplines including physics, numerical methods, mathematics, and geometry. The parsers will automatically recognize and document some static, semantic concepts and help locate some program semantic errors. These techniques may apply to a wider range of scientific codes. If so, the techniques could reduce the time, risk, and effort required to develop and modify scientific codes.

  16. Color code identification in coded structured light.

    PubMed

    Zhang, Xu; Li, Youfu; Zhu, Limin

    2012-08-01

    Color code is widely employed in coded structured light to reconstruct the three-dimensional shape of objects. Before determining the correspondence, a very important step is to identify the color code. Until now, the lack of an effective evaluation standard has hindered the progress in this unsupervised classification. In this paper, we propose a framework based on the benchmark to explore the new frontier. Two basic facets of the color code identification are discussed, including color feature selection and clustering algorithm design. First, we adopt analysis methods to evaluate the performance of different color features, and the order of these color features in the discriminating power is concluded after a large number of experiments. Second, in order to overcome the drawback of K-means, a decision-directed method is introduced to find the initial centroids. Quantitative comparisons affirm that our method is robust with high accuracy, and it can find or closely approach the global peak. PMID:22859022
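
    A minimal sketch of the clustering step, assuming scikit-learn: K-means is seeded with explicit initial centroids, one per projected stripe colour, standing in for the paper's decision-directed initialisation that addresses K-means' sensitivity to random starts. The palette and noise model are invented.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      palette = np.array([[255, 0, 0], [0, 255, 0], [0, 0, 255], [255, 255, 0]])
      # Simulated observed pixels: palette colours perturbed by sensor noise.
      pixels = np.repeat(palette, 200, axis=0) + rng.normal(0, 20, (800, 3))

      # Seed each cluster at its nominal palette colour instead of at random.
      km = KMeans(n_clusters=4, init=palette.astype(float), n_init=1).fit(pixels)
      print(km.cluster_centers_.round())   # centroids land near the palette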

  17. Independent component analysis of parameterized ECG signals.

    PubMed

    Tanskanen, Jarno M A; Viik, Jari J; Hyttinen, Jari A K

    2006-01-01

    Independent component analysis (ICA) of measured signals yields the independent sources, provided certain requirements are fulfilled. Properly parameterized signals provide a better view of the system aspects under consideration, while reducing the amount of data. It is little acknowledged that appropriately parameterized signals may be subjected to ICA, yielding independent components (ICs) that display more clearly the investigated properties of the sources. In this paper, we propose ICA of parameterized signals, and demonstrate the concept with ICA of ST and R parameterizations of electrocardiogram (ECG) signals from ECG exercise test measurements from two coronary artery disease (CAD) patients. PMID:17945912
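
    A minimal sketch of the proposal, assuming scikit-learn's FastICA: ICA is applied to parameter series derived from the waveforms rather than to the raw ECG itself. The two latent sources and their mixing below are invented for illustration.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(2)
      t = np.linspace(0, 10, 1000)
      s1 = np.sign(np.sin(3 * t))          # latent source 1 (e.g., ST drift)
      s2 = rng.laplace(size=t.size)        # latent source 2 (e.g., R variation)
      S = np.c_[s1, s2]

      A = np.array([[1.0, 0.4], [0.6, 1.0]])   # unknown mixing of parameters
      X = S @ A.T                              # observed parameter series

      ica = FastICA(n_components=2, random_state=0)
      S_est = ica.fit_transform(X)             # recovered independent components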

  18. An independent hydrogen source

    SciTech Connect

    Kobzenko, G.F.; Chubenko, M.V.; Kobzenko, N.S.; Senkevich, A.I.; Shkola, A.A.

    1985-10-01

    Descriptions are given of the design and operation of an independent hydrogen source used in purifying and storing hydrogen. If LaNi₅ or TiFe is used as the sorbent, one can store about 500 liters of chemically bound hydrogen in a vessel of 0.9 liter. Molecular purification of the desorbed hydrogen is used. The IHS is a safe hydrogen source, since the hydrogen is trapped in the sorbent in the chemically bound state and in equilibrium with LaNi₅Hx at room temperature. If necessary, the IHS can serve as a compressor and provide higher hydrogen pressures. The device is compact and transportable.

  19. Employee vs independent contractor.

    PubMed

    Kolender, Ellen

    2012-01-01

    Finding qualified personnel for the cancer registry department has become increasingly difficult, as experienced abstractors retire and cancer diagnoses increase. Faced with hiring challenges, managers turn to teleworkers to fill positions and accomplish work in a timely manner. Suddenly, the hospital hires new legal staff and all telework agreements are disrupted. The question arises: Are teleworkers employees or independent contractors? Creating telework positions requires approval from the legal department and human resources. Caught off-guard in the last quarter of the year, I found myself again faced with hiring challenges. PMID:23599033

  20. Software Certification - Coding, Code, and Coders

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Holzmann, Gerard J.

    2011-01-01

    We describe a certification approach for software development that has been adopted at our organization. JPL develops robotic spacecraft for the exploration of the solar system. The flight software that controls these spacecraft is considered to be mission critical. We argue that the goal of a software certification process cannot be the development of "perfect" software, i.e., software that can be formally proven to be correct under all imaginable and unimaginable circumstances. More realistically, the goal is to guarantee a software development process that is conducted by knowledgeable engineers, who follow generally accepted procedures to control known risks, while meeting agreed upon standards of workmanship. We target three specific issues that must be addressed in such a certification procedure: the coding process, the code that is developed, and the skills of the coders. The coding process is driven by standards (e.g., a coding standard) and tools. The code is mechanically checked against the standard with the help of state-of-the-art static source code analyzers. The coders, finally, are certified in on-site training courses that include formal exams.

  1. Cary Potter on Independent Education

    ERIC Educational Resources Information Center

    Potter, Cary

    1978-01-01

    Cary Potter was President of the National Association of Independent Schools from 1964-1978. As he leaves NAIS he gives his views on education, on independence, on the independent school, on public responsibility, on choice in a free society, on educational change, and on the need for collective action by independent schools. (Author/RK)

  2. Myth or Truth: Independence Day.

    ERIC Educational Resources Information Center

    Gardner, Traci

    Most Americans think of the Fourth of July as Independence Day, but is it really the day the U.S. declared and celebrated independence? By exploring myths and truths surrounding Independence Day, this lesson asks students to think critically about commonly believed stories regarding the beginning of the Revolutionary War and the Independence Day…

  3. Applications of Coding in Network Communications

    ERIC Educational Resources Information Center

    Chang, Christopher SungWook

    2012-01-01

    This thesis uses the tool of network coding to investigate fast peer-to-peer file distribution, anonymous communication, robust network construction under uncertainty, and prioritized transmission. In a peer-to-peer file distribution system, we use a linear optimization approach to show that the network coding framework significantly simplifies…

  4. Source Code Plagiarism--A Student Perspective

    ERIC Educational Resources Information Center

    Joy, M.; Cosma, G.; Yau, J. Y.-K.; Sinclair, J.

    2011-01-01

    This paper considers the problem of source code plagiarism by students within the computing disciplines and reports the results of a survey of students in Computing departments in 18 institutions in the U.K. This survey was designed to investigate how well students understand the concept of source code plagiarism and to discover what, if any,…

  5. Description of ground motion data processing codes: Volume 3

    SciTech Connect

    Sanders, M.L.

    1988-02-01

    Data processing codes developed to process ground motion at the Nevada Test Site for the Weapons Test Seismic Investigations Project are used today as part of the program to process ground motion records for the Nevada Nuclear Waste Storage Investigations Project. The work contained in this report documents and lists the codes and verifies the "PSRV" code. 39 figs.

  6. Independent task Fourier filters

    NASA Astrophysics Data System (ADS)

    Caulfield, H. John

    2001-11-01

    Since the early 1960s, a major part of optical computing systems has been Fourier pattern recognition, which takes advantage of high speed filter changes to enable powerful nonlinear discrimination in 'real time.' Because each filter has a task quite independent of the tasks of the other filters, they can be applied and evaluated in parallel or, in a simple approach I describe, in sequence very rapidly. Thus I use the name ITFF (independent task Fourier filter). These filters can also break very complex discrimination tasks into easily handled parts, so the wonderful space invariance properties of Fourier filtering need not be sacrificed to achieve high discrimination and good generalizability even for ultracomplex discrimination problems. The training procedure proceeds sequentially, as the task for a given filter is defined a posteriori by declaring it to be the discrimination of particular members of set A from all members of set B with sufficient margin. That is, we set the threshold to achieve the desired margin and note the A members discriminated by that threshold. Discriminating those A members from all members of B becomes the task of that filter. Those A members are then removed from the set A, so no other filter will be asked to perform that already accomplished task.

  7. XSOR codes users manual

    SciTech Connect

    Jow, Hong-Nian; Murfin, W.B.; Johnson, J.D.

    1993-11-01

    This report describes the source term estimation codes, XSORs. The codes are written for three pressurized water reactors (Surry, Sequoyah, and Zion) and two boiling water reactors (Peach Bottom and Grand Gulf). The ensemble of codes has been named "XSOR". The purpose of the XSOR codes is to estimate the source terms which would be released to the atmosphere in severe accidents. A source term includes the release fractions of several radionuclide groups, the timing and duration of releases, the rates of energy release, and the elevation of releases. The codes have been developed by Sandia National Laboratories for the US Nuclear Regulatory Commission (NRC) in support of the NUREG-1150 program. The XSOR codes are fast-running parametric codes and are used as surrogates for detailed mechanistic codes. The XSOR codes also provide the capability to explore the phenomena and their uncertainty which are not currently modeled by the mechanistic codes. The uncertainty distributions of input parameters may be used by an XSOR code to estimate the uncertainty of source terms.

  8. High-speed Viterbi decoding with overlapping code sequences

    NASA Technical Reports Server (NTRS)

    Ross, Michael D.; Osborne, William P.

    1993-01-01

    The Viterbi Algorithm for decoding convolutional codes and Trellis Coded Modulation is suited to VLSI implementation but contains a feedback loop which limits the speed of pipelined architectures. The feedback loop is circumvented by decoding independent sequences simultaneously, resulting in a 5-9 fold speed-up with a two-fold hardware expansion.

  9. On the role of code comparisons in verification and validation.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2003-08-01

    This report presents a perspective on the role of code comparison activities in verification and validation. We formally define the act of code comparison as the Code Comparison Principle (CCP) and investigate its application in both verification and validation. One of our primary conclusions is that the use of code comparisons for validation is improper and dangerous. We also conclude that while code comparisons may be argued to provide a beneficial component in code verification activities, there are higher quality code verification tasks that should take precedence. Finally, we provide a process for application of the CCP that we believe is minimal for achieving benefit in verification processes.

  10. Astronomy education and the Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, Alice; Nemiroff, Robert J.

    2016-01-01

    The Astrophysics Source Code Library (ASCL) is an online registry of source codes used in refereed astrophysics research. It currently lists nearly 1,200 codes and covers all aspects of computational astrophysics. How can this resource be of use to educators and to the graduate students they mentor? The ASCL serves as a discovery tool for codes that can be used for one's own research. Graduate students can also investigate existing codes to see how common astronomical problems are approached numerically in practice, and use these codes as benchmarks for their own solutions to these problems. Further, they can deepen their knowledge of software practices and techniques through examination of others' codes.

  11. DLLExternalCode

    SciTech Connect

    Greg Flach, Frank Smith

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.
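
    The wrapper pattern described here (take inputs, write an input file, run the external application, read its outputs back) is easy to sketch. The Python below only illustrates that loop and is not the DLL itself; the file names and the external_code executable are hypothetical.

      import subprocess
      from pathlib import Path

      def run_external(inputs):
          # 1. Create the input file the external application expects.
          Path("model.inp").write_text("\n".join(str(v) for v in inputs))
          # 2. Run the external code and wait for it to finish.
          subprocess.run(["external_code", "model.inp"], check=True)
          # 3. Read back the outputs the external application wrote.
          return [float(tok) for tok in Path("model.out").read_text().split()]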

  12. DLLExternalCode

    Energy Science and Technology Software Center (ESTSC)

    2014-05-14

    DLLExternalCode is a general dynamic-link library (DLL) interface for linking GoldSim (www.goldsim.com) with external codes. The overall concept is to use GoldSim as top-level modeling software with interfaces to external codes for specific calculations. The DLLExternalCode DLL that performs the linking function is designed to take a list of code inputs from GoldSim, create an input file for the external application, run the external code, and return a list of outputs, read from files created by the external application, back to GoldSim. Instructions for creating the input file, running the external code, and reading the output are contained in an instructions file that is read and interpreted by the DLL.

  13. Bit allocation for joint coding of multiple video programs

    NASA Astrophysics Data System (ADS)

    Wang, Limin; Vincent, Andre

    1997-01-01

    By dynamically distributing the channel capacity among video programs according to their respective scene complexities, joint coding has been shown to be more efficient than independent coding for compression of multiple video programs. This paper examines the bit allocation issue for joint coding of multiple video programs and provides a bit allocation strategy that results in uniform picture quality among programs as well as within a program.
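
    The core allocation rule is simple to state in code: give each program a share of the channel in proportion to its scene complexity, so that harder material receives more bits and picture quality stays roughly uniform. A minimal sketch with invented complexity figures:

      def allocate_bits(total_rate_kbps, complexities):
          # Share the multiplex in proportion to each program's complexity.
          total = sum(complexities)
          return [total_rate_kbps * c / total for c in complexities]

      # Three programs sharing a 15 Mbit/s multiplex: sport, drama, news.
      print(allocate_bits(15000, [5.0, 2.5, 1.0]))
      # -> [8823.5..., 4411.7..., 1764.7...]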

  14. Frame independent cosmological perturbations

    SciTech Connect

    Prokopec, Tomislav; Weenink, Jan E-mail: j.g.weenink@uu.nl

    2013-09-01

    We compute the third order gauge invariant action for scalar-graviton interactions in the Jordan frame. We demonstrate that the gauge invariant action for scalar and tensor perturbations on one physical hypersurface only differs from that on another physical hypersurface via terms proportional to the equation of motion and boundary terms, such that the evolution of non-Gaussianity may be called unique. Moreover, we demonstrate that the gauge invariant curvature perturbation and graviton on uniform field hypersurfaces in the Jordan frame are equal to their counterparts in the Einstein frame. These frame independent perturbations are therefore particularly useful in relating results in different frames at the perturbative level. On the other hand, the field perturbation and graviton on uniform curvature hypersurfaces in the Jordan and Einstein frame are non-linearly related, as are their corresponding actions and n-point functions.

  15. cncRNAs: Bi-functional RNAs with protein coding and non-coding functions

    PubMed Central

    Kumari, Pooja; Sampath, Karuna

    2015-01-01

    For many decades, the major function of mRNA was thought to be to provide protein-coding information embedded in the genome. The advent of high-throughput sequencing has led to the discovery of pervasive transcription of eukaryotic genomes and opened the world of RNA-mediated gene regulation. Many regulatory RNAs have been found to be incapable of protein coding and are hence termed as non-coding RNAs (ncRNAs). However, studies in recent years have shown that several previously annotated non-coding RNAs have the potential to encode proteins, and conversely, some coding RNAs have regulatory functions independent of the protein they encode. Such bi-functional RNAs, with both protein coding and non-coding functions, which we term as ‘cncRNAs’, have emerged as new players in cellular systems. Here, we describe the functions of some cncRNAs identified from bacteria to humans. Because the functions of many RNAs across genomes remains unclear, we propose that RNAs be classified as coding, non-coding or both only after careful analysis of their functions. PMID:26498036

  16. [Quality of coding in acute inpatient care].

    PubMed

    Stausberg, J

    2007-08-01

    Routine data in the electronic patient record are frequently used for secondary purposes. Core elements of the electronic patient record are diagnoses and procedures, coded with the mandatory classifications. Despite the important role of routine data for reimbursement, quality management and health care statistics, there is currently no systematic analysis of coding quality in Germany. The respective concepts and investigations share the difficulty of deciding what is right and what is wrong at the end of the long process of medical decision making. Therefore, a relevant amount of disagreement has to be accepted; in the case of the principal diagnosis, this could affect half of the patients. The plausibility of coding looks much better. After an optimization period in hospitals, regular and complete coding can be expected. Whether coding matches reality, as a prerequisite for further use of the data in medicine and health politics, should be investigated in controlled trials in the future. PMID:17676418

  17. Adaptive entropy coded subband coding of images.

    PubMed

    Kim, Y H; Modestino, J W

    1992-01-01

    The authors describe a design approach, called 2-D entropy-constrained subband coding (ECSBC), based upon recently developed 2-D entropy-constrained vector quantization (ECVQ) schemes. The output indexes of the embedded quantizers are further compressed by use of noiseless entropy coding schemes, such as Huffman or arithmetic codes, resulting in variable-rate outputs. Depending upon the specific configurations of the ECVQ and the ECPVQ over the subbands, many different types of SBC schemes can be derived within the generic 2-D ECSBC framework. Among these, the authors concentrate on three representative types of 2-D ECSBC schemes and provide relative performance evaluations. They also describe an adaptive buffer instrumented version of 2-D ECSBC, called 2-D ECSBC/AEC, for use with fixed-rate channels which completely eliminates buffer overflow/underflow problems. This adaptive scheme achieves performance quite close to the corresponding ideal 2-D ECSBC system. PMID:18296138

  18. Generating code adapted for interlinking legacy scalar code and extended vector code

    DOEpatents

    Gschwind, Michael K

    2013-06-04

    Mechanisms for intermixing code are provided. Source code is received for compilation using an extended Application Binary Interface (ABI) that extends a legacy ABI and uses a different register configuration than the legacy ABI. First compiled code is generated based on the source code, the first compiled code comprising code for accommodating the difference in register configurations used by the extended ABI and the legacy ABI. The first compiled code and second compiled code are intermixed to generate intermixed code, the second compiled code being compiled code that uses the legacy ABI. The intermixed code comprises at least one call instruction that is one of a call from the first compiled code to the second compiled code or a call from the second compiled code to the first compiled code. The code for accommodating the difference in register configurations is associated with the at least one call instruction.

  19. Bit-wise arithmetic coding for data compression

    NASA Technical Reports Server (NTRS)

    Kiely, A. B.

    1994-01-01

    This article examines the problem of compressing a uniformly quantized independent and identically distributed (IID) source. We present a new compression technique, bit-wise arithmetic coding, that assigns fixed-length codewords to the quantizer output and uses arithmetic coding to compress the codewords, treating the codeword bits as independent. We examine the performance of this method and evaluate the overhead required when used block-adaptively. Simulation results are presented for Gaussian and Laplacian sources. This new technique could be used as the entropy coder in a transform or subband coding system.
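
    The setup is easy to reproduce numerically. The sketch below, which is not the article's code, quantizes an IID Gaussian source to 4-bit codewords, models the codeword bits as independent, and computes the ideal arithmetic-coding rate implied by the per-bit statistics; a real coder would emit bits at close to this rate.

      import numpy as np

      rng = np.random.default_rng(3)
      x = rng.normal(size=100_000)                    # IID Gaussian source
      levels = (np.clip(np.round(x / 0.25), -8, 7) + 8).astype(int)   # 4-bit quantizer

      bits = (levels[:, None] >> np.arange(4)) & 1    # codeword bits, LSB first
      p1 = bits.mean(axis=0)                          # P(bit = 1) per position
      h = -(p1 * np.log2(p1) + (1 - p1) * np.log2(1 - p1)).sum()
      print(f"{h:.2f} bits/sample vs 4 raw bits")     # gain from biased bits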

  20. Mechanical code comparator

    DOEpatents

    Peter, Frank J.; Dalton, Larry J.; Plummer, David W.

    2002-01-01

    A new class of mechanical code comparators is described which have broad potential for application in safety, surety, and security applications. These devices can be implemented as micro-scale electromechanical systems that isolate a secure or otherwise controlled device until an access code is entered. This access code is converted into a series of mechanical inputs to the mechanical code comparator, which compares the access code to a pre-input combination, entered previously into the mechanical code comparator by an operator at the system security control point. These devices provide extremely high levels of robust security. Being totally mechanical in operation, an access control system properly based on such devices cannot be circumvented by software attack alone.

  1. Theory of epigenetic coding.

    PubMed

    Elder, D

    1984-06-01

    The logic of genetic control of development may be based on a binary epigenetic code. This paper revises the author's previous scheme dealing with the numerology of annelid metamerism in these terms. Certain features of the code had been deduced to be combinatorial, others not. This paradoxical contrast is resolved here by the interpretation that these features relate to different operations of the code: the combinatorial to coding the identity of units, the non-combinatorial to coding the production of units. Consideration of a second paradox in the theory of epigenetic coding leads to a new solution which further provides a basis for epimorphic regeneration, and may in particular throw light on the "regeneration-duplication" phenomenon. A possible test of the model is also put forward. PMID:6748695

  2. Updating the Read Codes

    PubMed Central

    Robinson, David; Schulz, Erich; Brown, Philip; Price, Colin

    1997-01-01

    The Read Codes are a hierarchically-arranged controlled clinical vocabulary introduced in the early 1980s and now consisting of three maintained versions of differing complexity. The code sets are dynamic, and are updated quarterly in response to requests from users including clinicians in both primary and secondary care, software suppliers, and advice from a network of specialist healthcare professionals. The codes' continual evolution of content, both across and within versions, highlights tensions between different users and uses of coded clinical data. Internal processes, external interactions and new structural features implemented by the NHS Centre for Coding and Classification (NHSCCC) for user-interactive maintenance of the Read Codes are described, and over 2000 items of user feedback received over a 15-month period are analysed. PMID:9391934

  3. Doubled Color Codes

    NASA Astrophysics Data System (ADS)

    Bravyi, Sergey

    Combining protection from noise and computational universality is one of the biggest challenges in fault-tolerant quantum computing. Topological stabilizer codes such as the 2D surface code can tolerate a high level of noise, but implementing logical gates, especially non-Clifford ones, requires a prohibitively large overhead due to the need of state distillation. In this talk I will describe a new family of 2D quantum error correcting codes that enable a transversal implementation of all logical gates required for universal quantum computing. Transversal logical gates (TLG) are encoded operations that can be realized by applying some single-qubit rotation to each physical qubit. TLG are highly desirable since they introduce no overhead and do not spread errors. It has been known before that a quantum code can have only a finite number of TLGs, which rules out computational universality. Our scheme circumvents this no-go result by combining TLGs of two different quantum codes using the gauge-fixing method pioneered by Paetznick and Reichardt. The first code, closely related to the 2D color code, enables a transversal implementation of all single-qubit Clifford gates such as the Hadamard gate and the π/2 phase shift. The second code, which we call a doubled color code, provides a transversal T-gate, where T is the π/4 phase shift. The Clifford+T gate set is known to be computationally universal. The two codes can be laid out on the honeycomb lattice with two qubits per site such that the code conversion requires parity measurements for six-qubit Pauli operators supported on faces of the lattice. I will also describe numerical simulations of logical Clifford+T circuits encoded by the distance-3 doubled color code. Based on joint work with Andrew Cross.

  4. Bar Code Labels

    NASA Technical Reports Server (NTRS)

    1988-01-01

    American Bar Codes, Inc. developed special bar code labels for inventory control of space shuttle parts and other space system components. ABC labels are made in a company-developed anodized-aluminum process and consecutively marked with bar code symbology and human-readable numbers. They offer extreme abrasion resistance and indefinite resistance to ultraviolet radiation, capable of withstanding 700 degree temperatures without deterioration and up to 1400 degrees with special designs. They offer high resistance to salt spray, cleaning fluids and mild acids. ABC is now producing these bar code labels commercially for industrial customers who also need labels to resist harsh environments.

  5. FAA Smoke Transport Code

    SciTech Connect

    Domino, Stefan; Luketa-Hanlin, Anay; Gallegos, Carlos

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  6. Expander chunked codes

    NASA Astrophysics Data System (ADS)

    Tang, Bin; Yang, Shenghao; Ye, Baoliu; Yin, Yitong; Lu, Sanglu

    2015-12-01

    Chunked codes are efficient random linear network coding (RLNC) schemes with low computational cost, where the input packets are encoded into small chunks (i.e., subsets of the coded packets). During the network transmission, RLNC is performed within each chunk. In this paper, we first introduce a simple transfer matrix model to characterize the transmission of chunks and derive some basic properties of the model to facilitate the performance analysis. We then focus on the design of overlapped chunked codes, a class of chunked codes whose chunks are non-disjoint subsets of input packets, which are of special interest since they can be encoded with negligible computational cost and in a causal fashion. We propose expander chunked (EC) codes, the first class of overlapped chunked codes that have an analyzable performance, where the construction of the chunks makes use of regular graphs. Numerical and simulation results show that in some practical settings, EC codes can achieve rates within 91 to 97% of the optimum and outperform the state-of-the-art overlapped chunked codes significantly.
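
    The per-chunk operation is ordinary RLNC over GF(2) and is easy to sketch; the expander-graph construction of the chunks themselves is not reproduced here. In the invented example below, a chunk of 8 packets is recoded into random GF(2) combinations, and a receiver checks decodability by computing the GF(2) rank of the packets it has collected.

      import numpy as np

      rng = np.random.default_rng(4)
      chunk = rng.integers(0, 2, size=(8, 64), dtype=np.uint8)  # 8 packets x 64 bits

      def recode(packets):
          # One coded packet: a random GF(2) combination of the chunk.
          coeffs = rng.integers(0, 2, size=packets.shape[0], dtype=np.uint8)
          return (coeffs @ packets) % 2

      def gf2_rank(rows):
          # Gaussian elimination over GF(2).
          m, rank = rows.copy().astype(np.uint8), 0
          for col in range(m.shape[1]):
              pivot = next((r for r in range(rank, m.shape[0]) if m[r, col]), None)
              if pivot is None:
                  continue
              m[[rank, pivot]] = m[[pivot, rank]]
              for r in range(m.shape[0]):
                  if r != rank and m[r, col]:
                      m[r] ^= m[rank]
              rank += 1
          return rank

      coded = np.array([recode(chunk) for _ in range(10)])   # 10 received packets
      print("decodable:", gf2_rank(coded) == 8)              # True w.h.p.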

  7. Wear Independent Similarity.

    PubMed

    Steele, Adam; Davis, Alexander; Kim, Joohyung; Loth, Eric; Bayer, Ilker S

    2015-06-17

    This study presents a new factor that can be used to design materials where desired surface properties must be retained under in-system wear and abrasion. To demonstrate this factor, a synthetic nonwetting coating is presented that retains chemical and geometric performance as material is removed under multiple wear conditions: a coarse vitrified abradant (similar to sanding), a smooth abradant (similar to rubbing), and a mild abradant (a blend of sanding and rubbing). With this approach, such a nonwetting material displays unprecedented mechanical durability while maintaining desired performance under a range of demanding conditions. This performance, herein termed wear independent similarity performance (WISP), is critical because multiple mechanisms and/or modes of wear can be expected to occur in many typical applications, e.g., combinations of abrasion, rubbing, contact fatigue, weathering, particle impact, etc. Furthermore, these multiple wear mechanisms tend to quickly degrade a novel surface's unique performance, and thus many promising surfaces and materials never scale out of research laboratories. Dynamic goniometry and scanning electron microscopy results presented herein provide insight into these underlying mechanisms, which may also be applied to other coatings and materials. PMID:26018058

  8. GALPROP: New Developments in CR Propagation Code

    NASA Technical Reports Server (NTRS)

    Moskalenko, I. V.; Jones, F. C.; Mashnik, S. G.; Strong, A. W.; Ptuskin, V. S.

    2003-01-01

    The numerical Galactic CR propagation code GALPROP has been shown to reproduce simultaneously observational data of many kinds related to CR origin and propagation. It has been validated on direct measurements of nuclei, antiprotons, electrons, positrons as well as on astronomical measurements of gamma rays and synchrotron radiation. Such data provide many independent constraints on model parameters while revealing some contradictions in the conventional view of Galactic CR propagation. Using a new version of GALPROP we study new effects such as processes of wave-particle interactions in the interstellar medium. We also report about other developments in the CR propagation code.

  9. The origin and evolution of the genetic code.

    PubMed

    Béland, P; Allen, T F

    1994-10-21

    We argue that a primitive genetic code with only 20 separate words explains why there are 20 coded amino acids in modern life. The existence of 64 words in the modern genetic code requires modern life to read almost exclusively one strand of DNA in one direction. In our primitive code, both the original and the complementary sequence are read in either direction to give the same strings of amino acids. The algebra of complements forces synonymy of primitive codons so as to reduce the 64 independent codons of the modern code to exactly 20 independent separate words in the primitive condition. The synonymy in the modern code is the result of selection rather than algebraic forcing. The primitive code has almost no resilience to base mutations, unlike the third-base redundancy of the modern code. Our primitive code and the modern code are orthogonal. If palindromic proteins were coded by hairpin DNA or RNA, then (i) no punctuation would be needed; (ii) the reverse reading would give the same secondarily folded protein structure; and (iii) the sugar backbone would be read in the conventional 5' to 3' direction for the original arm and its complement. Modern copying of genetic material is almost always antiparallel. However, occasional parallel copying, as does occur in modern life, would give the complementary hairpin that would also read 5' to 3' along its entire length.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:7996862
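
    The counting argument can be checked mechanically. The sketch below, a verification of the arithmetic rather than the authors' code, groups the 64 codons into classes whose members must share a meaning when a codon, its reverse, its complement, and its reverse complement are all read alike, and confirms that exactly 20 classes remain.

      from itertools import product

      COMP = {"A": "T", "T": "A", "C": "G", "G": "C"}

      def canonical(codon):
          # Orbit of the codon under reversal and complementation.
          rev = codon[::-1]
          cmp_ = "".join(COMP[b] for b in codon)
          revcmp = "".join(COMP[b] for b in rev)
          return min(codon, rev, cmp_, revcmp)

      classes = {canonical("".join(c)) for c in product("ACGT", repeat=3)}
      print(len(classes))   # -> 20, matching the 20 coded amino acids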

  10. P-code enhanced method for processing encrypted GPS signals without knowledge of the encryption code

    NASA Technical Reports Server (NTRS)

    Meehan, Thomas K. (Inventor); Thomas, Jr., Jess Brooks (Inventor); Young, Lawrence E. (Inventor)

    2000-01-01

    In the preferred embodiment, an encrypted GPS signal is down-converted from RF to baseband to generate two quadrature components for each RF signal (L1 and L2). Separately and independently for each RF signal and each quadrature component, the four down-converted signals are counter-rotated with a respective model phase, correlated with a respective model P code, and then successively summed and dumped over presum intervals substantially coincident with chips of the respective encryption code. Without knowledge of the encryption-code signs, the effect of encryption-code sign flips is then substantially reduced by selected combinations of the resulting presums between associated quadrature components for each RF signal, separately and independently for the L1 and L2 signals. The resulting combined presums are then summed and dumped over longer intervals and further processed to extract amplitude, phase and delay for each RF signal. Precision of the resulting phase and delay values is approximately four times better than that obtained from straight cross-correlation of L1 and L2. This improved method provides the following options: separate and independent tracking of the L1-Y and L2-Y channels; separate and independent measurement of amplitude, phase, and delay for the L1-Y channel; and removal of the half-cycle ambiguity in L1-Y and L2-Y carrier phase.
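
    A toy numerical sketch of the presum-and-combine idea (our simplified illustration, not the patented signal path; the amplitudes, noise level, and sign-rectification rule are assumptions): quadrature samples carrying one unknown encryption sign per chip are summed over chip-aligned intervals, and the sign of the I presum is used to rectify both components before the longer sum.

      import numpy as np

      rng = np.random.default_rng(0)
      n_chips, samples_per_chip = 2000, 8
      A, phi = 1.0, 0.3                    # true amplitude and residual phase
      signs = rng.choice([-1.0, 1.0], size=n_chips)   # unknown encryption signs

      # Counter-rotated I/Q samples; one unknown sign per encryption chip
      i_samp = (signs[:, None] * A * np.cos(phi)
                + rng.standard_normal((n_chips, samples_per_chip)))
      q_samp = (signs[:, None] * A * np.sin(phi)
                + rng.standard_normal((n_chips, samples_per_chip)))

      # Presum ("sum and dump") over intervals aligned with the chips
      I, Q = i_samp.sum(axis=1), q_samp.sum(axis=1)

      # Assumption: for a small residual phase, the sign of the I presum
      # tracks the encryption sign, so it can rectify both components.
      s_hat = np.sign(I)
      print('phase estimate:', np.arctan2((s_hat * Q).sum(), (s_hat * I).sum()))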

  11. Azerbaijani-Russian Code-Switching and Code-Mixing: Form, Function, and Identity

    ERIC Educational Resources Information Center

    Zuercher, Kenneth

    2009-01-01

    From incorporation into the Russian Empire in 1828 through the collapse of the U.S.S.R. in 1991, governmental language policies and other socio-political forces influenced the Turkic population of the Republic of Azerbaijan to speak Russian. Even with changes since independence, Russian use--including various kinds of code-switching and…

  12. Studying the Independent School Library

    ERIC Educational Resources Information Center

    Cahoy, Ellysa Stern; Williamson, Susan G.

    2008-01-01

    In 2005, the American Association of School Librarians' Independent Schools Section conducted a national survey of independent school libraries. This article analyzes the results of the survey, reporting specialized data and information regarding independent school library budgets, collections, services, facilities, and staffing. Additionally, the…

  13. Utilizing GPUs to Accelerate Turbomachinery CFD Codes

    NASA Technical Reports Server (NTRS)

    MacCalla, Weylin; Kulkarni, Sameer

    2016-01-01

    GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card, or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.

  14. Radio Losses for Concatenated Codes

    NASA Astrophysics Data System (ADS)

    Shambayati, S.

    2002-07-01

    The advent of higher powered spacecraft amplifiers and better ground receivers capable of tracking spacecraft carrier signals with narrower loop bandwidths requires better understanding of the carrier tracking loss (radio loss) mechanism of the concatenated codes used for deep-space missions. In this article, we present results of simulations performed for a (7,1/2), Reed-Solomon (255,223), interleaver depth-5 concatenated code in order to shed some light on this issue. Through these simulations, we obtained the performance of this code over an additive white Gaussian noise (AWGN) channel (the baseline performance) in terms of both its frame-error rate (FER) and its bit-error rate at the output of the Reed-Solomon decoder (RS-BER). After obtaining these results, we curve fitted the baseline performance curves for FER and RS-BER and calculated the high-rate radio losses for this code for an FER of 10^(-4) and its corresponding baseline RS-BER of 2.1 x 10^(-6) for a carrier loop signal-to-noise ratio (SNR) of 14.8 dB. This calculation revealed that even though over the AWGN channel the FER value and the RS-BER value correspond to each other (i.e., these values are obtained by the same bit SNR value), the RS-BER value has higher high-rate losses than does the FER value. Furthermore, this calculation contradicted the previous assumption that at high data rates concatenated codes have the same radio losses as their constituent convolutional codes. Our results showed much higher losses for the FER and the RS-BER (by as much as 2 dB) than for the corresponding baseline BER of the convolutional code. Further simulations were performed to investigate the effects of changes in the data rate on the code's radio losses. It was observed that as the data rate increased the radio losses for both the FER and the RS-BER approached their respective calculated high-rate values. Furthermore, these simulations showed that a simple two-parameter function could model the increase in the

  15. Research on Universal Combinatorial Coding

    PubMed Central

    Lu, Jun; Zhang, Zhuo; Mo, Juan

    2014-01-01

    The concept of universal combinatorial coding is proposed. Many coding methods are related to one another to a greater or lesser extent, which suggests that a universal coding method objectively exists and can serve as a bridge connecting them. Universal combinatorial coding is lossless and is based on combinatorics; its combinational and exhaustive properties relate it closely to existing coding methods. It does not depend on the probability statistics of the information source, and it has characteristics that span the three branches of coding. The relationship between universal combinatorial coding and a variety of coding methods is analyzed, and several application technologies of the method are investigated. In addition, the efficiency of universal combinatorial coding is analyzed theoretically. The multiple characteristics and applications of universal combinatorial coding are unique among existing coding methods, and the method has both theoretical research and practical application value. PMID:24772019

  16. Research on universal combinatorial coding.

    PubMed

    Lu, Jun; Zhang, Zhuo; Mo, Juan

    2014-01-01

    The concept of universal combinatorial coding is proposed. Many coding methods are related to one another to a greater or lesser extent, which suggests that a universal coding method objectively exists and can serve as a bridge connecting them. Universal combinatorial coding is lossless and is based on combinatorics; its combinational and exhaustive properties relate it closely to existing coding methods. It does not depend on the probability statistics of the information source, and it has characteristics that span the three branches of coding. The relationship between universal combinatorial coding and a variety of coding methods is analyzed, and several application technologies of the method are investigated. In addition, the efficiency of universal combinatorial coding is analyzed theoretically. The multiple characteristics and applications of universal combinatorial coding are unique among existing coding methods, and the method has both theoretical research and practical application value. PMID:24772019
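
    The two records above do not give the construction itself, so as background, the classic example of a purely combinatorial lossless code, which likewise does not depend on the source's probability statistics, is enumerative coding (Cover, 1973): transmit the weight of a binary block together with the rank of the block among all blocks of that weight. A Python sketch of that textbook scheme, not of the authors' method:

      from math import comb

      def enum_rank(bits):
          """Rank of a binary tuple among all tuples of the same weight
          (lexicographic order); classic enumerative source coding."""
          rank, ones_left = 0, sum(bits)
          n = len(bits)
          for i, b in enumerate(bits):
              if b:
                  # count the sequences that put a 0 in this position instead
                  rank += comb(n - i - 1, ones_left)
                  ones_left -= 1
          return rank

      def enum_unrank(n, weight, rank):
          bits = []
          for i in range(n):
              c = comb(n - i - 1, weight)
              if rank >= c:
                  bits.append(1); rank -= c; weight -= 1
              else:
                  bits.append(0)
          return bits

      msg = [1, 0, 1, 1, 0, 0, 1, 0]
      code = (sum(msg), enum_rank(msg))    # (weight, rank) is the codeword
      assert enum_unrank(len(msg), *code) == msg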

  17. A reduced complexity highly power/bandwidth efficient coded FQPSK system with iterative decoding

    NASA Technical Reports Server (NTRS)

    Simon, M. K.; Divsalar, D.

    2001-01-01

    Based on a representation of FQPSK as a trellis-coded modulation, this paper investigates the potential improvement in power efficiency obtained from the application of simple outer codes to form a concatenated coding arrangement with iterative decoding.

  18. Fast Coding Unit Encoding Mechanism for Low Complexity Video Coding

    PubMed Central

    Wu, Yueying; Jia, Kebin; Gao, Guandong

    2016-01-01

    In high efficiency video coding (HEVC), the coding tree contributes to excellent compression performance, but it also brings extremely high computational complexity. This paper presents improvements to the coding tree that further reduce encoding time. A novel low-complexity coding tree mechanism is proposed for HEVC fast coding unit (CU) encoding. First, the paper makes an in-depth study of the relationship among CU distribution, quantization parameter (QP), and content change (CC). Second, a CU coding tree probability model is proposed for modeling and predicting the CU distribution. Finally, a CU coding tree probability update is proposed to address the probabilistic model distortion caused by CC. Experimental results show that the proposed low-complexity CU coding tree mechanism significantly reduces encoding time, by 27% for lossy coding and 42% for visually lossless and lossless coding, and improves coding performance under various application conditions. PMID:26999741
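
    The paper's probability model is not reproduced in the abstract, but the gating idea it describes can be sketched generically (the bucket boundaries and probabilities below are invented placeholders): skip the expensive rate-distortion evaluation of a CU split whenever a model of P(split | QP, CC) predicts a split to be unlikely.

      def qp_bucket(qp):                  # coarse QP bins (placeholder granularity)
          return min(qp // 10, 4)

      def cc_bucket(cc):                  # coarse content-change bins (placeholder)
          return 0 if cc < 0.1 else (1 if cc < 0.5 else 2)

      def should_try_split(qp, cc, prob_table, threshold=0.1):
          """Skip the split search when the modeled split probability is low;
          unseen contexts default to 1.0 so they are always searched."""
          return prob_table.get((qp_bucket(qp), cc_bucket(cc)), 1.0) >= threshold

      # The probabilities would be learned from previously coded frames and
      # refreshed when content change distorts the model, as the paper proposes.
      prob_table = {(2, 0): 0.04, (2, 1): 0.35, (2, 2): 0.80}
      print(should_try_split(qp=27, cc=0.05, prob_table=prob_table))   # False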

  19. Quality assurance and verification of the MACCS (MELCOR Accident Consequence Code System) code, Version 1. 5

    SciTech Connect

    Dobbe, C.A.; Carlson, E.R.; Marshall, N.H.; Marwil, E.S.; Tolli, J.E. )

    1990-02-01

    An independent quality assurance (QA) and verification of Version 1.5 of the MELCOR Accident Consequence Code System (MACCS) was performed. The QA and verification involved examination of the code and associated documentation for consistent and correct implementation of the models in an error-free FORTRAN computer code. The QA and verification was not intended to determine either the adequacy or appropriateness of the models that are used in MACCS 1.5. The reviews uncovered errors which were fixed by the SNL MACCS code development staff prior to the release of MACCS 1.5. Some difficulties related to documentation improvement and code restructuring are also presented. The QA and verification process concluded that Version 1.5 of the MACCS code, within the scope and limitations of the models implemented in the code, is essentially error free and ready for widespread use. 15 refs., 11 tabs.

  20. Melanism in Peromyscus Is Caused by Independent Mutations in Agouti

    PubMed Central

    Kingsley, Evan P.; Manceau, Marie; Wiley, Christopher D.; Hoekstra, Hopi E.

    2009-01-01

    Identifying the molecular basis of phenotypes that have evolved independently can provide insight into the ways genetic and developmental constraints influence the maintenance of phenotypic diversity. Melanic (darkly pigmented) phenotypes in mammals provide a potent system in which to study the genetic basis of naturally occurring mutant phenotypes because melanism occurs in many mammals, and the mammalian pigmentation pathway is well understood. Spontaneous alleles of a few key pigmentation loci are known to cause melanism in domestic or laboratory populations of mammals, but in natural populations, mutations at one gene, the melanocortin-1 receptor (Mc1r), have been implicated in the vast majority of cases, possibly due to its minimal pleiotropic effects. To investigate whether mutations in this or other genes cause melanism in the wild, we investigated the genetic basis of melanism in the rodent genus Peromyscus, in which melanic mice have been reported in several populations. We focused on two genes known to cause melanism in other taxa, Mc1r and its antagonist, the agouti signaling protein (Agouti). While variation in the Mc1r coding region does not correlate with melanism in any population, in a New Hampshire population, we find that a 125-kb deletion, which includes the upstream regulatory region and exons 1 and 2 of Agouti, results in a loss of Agouti expression and is perfectly associated with melanic color. In a second population from Alaska, we find that a premature stop codon in exon 3 of Agouti is associated with a similar melanic phenotype. These results show that melanism has evolved independently in these populations through mutations in the same gene, and suggest that melanism produced by mutations in genes other than Mc1r may be more common than previously thought. PMID:19649329

  1. Lichenase and coding sequences

    DOEpatents

    Li, Xin-Liang; Ljungdahl, Lars G.; Chen, Huizhong

    2000-08-15

    The present invention provides a fungal lichenase, i.e., an endo-1,3-1,4-.beta.-D-glucanohydrolase, its coding sequence, recombinant DNA molecules comprising the lichenase coding sequences, recombinant host cells and methods for producing same. The present lichenase is from Orpinomyces PC-2.

  2. Codes of Conduct

    ERIC Educational Resources Information Center

    Million, June

    2004-01-01

    Most schools have a code of conduct, pledge, or behavioral standards, set by the district or school board with the school community. In this article, the author features some schools that created a new vision of instilling codes of conduct in students based on work quality, respect, safety, and courtesy. She suggests that communicating the code…

  3. Code of Ethics

    ERIC Educational Resources Information Center

    Division for Early Childhood, Council for Exceptional Children, 2009

    2009-01-01

    The Code of Ethics of the Division for Early Childhood (DEC) of the Council for Exceptional Children is a public statement of principles and practice guidelines supported by the mission of DEC. The foundation of this Code is based on sound ethical reasoning related to professional practice with young children with disabilities and their families…

  4. Legacy Code Modernization

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Frumkin, Michael; Jin, Haoqiang; Waheed, Abdul; Yan, Jerry; Saini, Subhash (Technical Monitor)

    1998-01-01

    Over the past decade, high performance computing has evolved rapidly; systems based on commodity microprocessors have been introduced in quick succession from at least seven vendors/families. Porting codes to every new architecture is a difficult problem; in particular, here at NASA, there are many large CFD applications that are very costly to port to new machines by hand. The LCM ("Legacy Code Modernization") Project is the development of an integrated parallelization environment (IPE) which performs the automated mapping of legacy CFD (Fortran) applications to state-of-the-art high performance computers. While most projects to port codes focus on the parallelization of the code, we consider porting to be an iterative process consisting of several steps: 1) code cleanup, 2) serial optimization, 3) parallelization, 4) performance monitoring and visualization, 5) intelligent tools for automated tuning using performance prediction, and 6) machine specific optimization. The approach for building this parallelization environment is to build the components for each of the steps simultaneously and then integrate them together. The demonstration will exhibit our latest research in building this environment: 1. Parallelizing tools and compiler evaluation. 2. Code cleanup and serial optimization using automated scripts. 3. Development of a code generator for performance prediction. 4. Automated partitioning. 5. Automated insertion of directives. These demonstrations will exhibit the effectiveness of an automated approach for all the steps involved with porting and tuning a legacy code application for a new architecture.

  5. Modified JPEG Huffman coding.

    PubMed

    Lakhani, Gopal

    2003-01-01

    It is a well observed characteristic that when a DCT block is traversed in the zigzag order, the AC coefficients generally decrease in size and the run-lengths of zero coefficients increase. This article presents a minor modification to the Huffman coding of the JPEG baseline compression algorithm to exploit this redundancy. For this purpose, DCT blocks are divided into bands so that each band can be coded using a separate code table. Three implementations are presented, which all move the end-of-block marker up in the middle of the DCT block and use it to indicate the band boundaries. Experimental results are presented to compare the reduction in code size obtained by our methods with the JPEG sequential-mode Huffman coding and arithmetic coding methods. The average code reduction relative to the total image code size for one of our methods is 4%. Our methods can also be used for progressive image transmission; hence, experimental results are also given to compare them with two-, three-, and four-band implementations of the JPEG spectral selection method. PMID:18237897
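
    The band partition can be illustrated independently of the entropy-coding details: traverse the 8x8 DCT block in zigzag order and hand each band of coefficients to its own code table. A Python sketch (the band boundaries here are arbitrary, not the article's):

      import numpy as np

      def zigzag_indices(n=8):
          """Zigzag traversal order of an n x n block (JPEG-style)."""
          return sorted(((i, j) for i in range(n) for j in range(n)),
                        key=lambda ij: (ij[0] + ij[1],
                                        ij[0] if (ij[0] + ij[1]) % 2 else ij[1]))

      def split_into_bands(block, boundaries=(1, 6, 15, 28, 64)):
          """Split zigzag-ordered coefficients into bands; each band would be
          coded with a separate Huffman table (boundaries are illustrative)."""
          zz = [block[i, j] for i, j in zigzag_indices(len(block))]
          bands, start = [], 0
          for end in boundaries:
              bands.append(zz[start:end]); start = end
          return bands

      block = np.arange(64).reshape(8, 8)
      for k, band in enumerate(split_into_bands(block)):
          print('band', k, 'has', len(band), 'coefficients')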

  6. Binary concatenated coding system

    NASA Technical Reports Server (NTRS)

    Monford, L. G., Jr.

    1973-01-01

    Coding, using 3-bit binary words, is applicable to any measurement having integer scale up to 100. System using 6-bit data words can be expanded to read from 1 to 10,000, and 9-bit data words can increase range to 1,000,000. Code may be "read" directly by observation after memorizing simple listing of 9's and 10's.

  7. Computerized mega code recording.

    PubMed

    Burt, T W; Bock, H C

    1988-04-01

    A system has been developed to facilitate recording of advanced cardiac life support mega code testing scenarios. By scanning a paper "keyboard" using a bar code wand attached to a portable microcomputer, the person assigned to record the scenario can easily generate an accurate, complete, timed, and typewritten record of the given situations and the obtained responses. PMID:3354937

  8. Coding for optical channels

    NASA Technical Reports Server (NTRS)

    Baumert, L. D.; Mceliece, R. J.; Rumsey, H., Jr.

    1979-01-01

    In a previous paper Pierce considered the problem of optical communication from a novel viewpoint, and concluded that performance will likely be limited by issues of coding complexity rather than by thermal noise. This paper reviews the model proposed by Pierce and presents some results on the analysis and design of codes for this application.

  9. Combustion chamber analysis code

    NASA Technical Reports Server (NTRS)

    Przekwas, A. J.; Lai, Y. G.; Krishnan, A.; Avva, R. K.; Giridharan, M. G.

    1993-01-01

    A three-dimensional, time dependent, Favre averaged, finite volume Navier-Stokes code has been developed to model compressible and incompressible flows (with and without chemical reactions) in liquid rocket engines. The code has a non-staggered formulation with generalized body-fitted-coordinates (BFC) capability. Higher order differencing methodologies such as MUSCL and Osher-Chakravarthy schemes are available. Turbulent flows can be modeled using any of the five turbulent models present in the code. A two-phase, two-liquid, Lagrangian spray model has been incorporated into the code. Chemical equilibrium and finite rate reaction models are available to model chemically reacting flows. The discrete ordinate method is used to model effects of thermal radiation. The code has been validated extensively against benchmark experimental data and has been applied to model flows in several propulsion system components of the SSME and the STME.

  10. Energy Conservation Code Decoded

    SciTech Connect

    Cole, Pam C.; Taylor, Zachary T.

    2006-09-01

    Designing an energy-efficient, affordable, and comfortable home is a lot easier thanks to a slimmer, easier-to-read booklet, the 2006 International Energy Conservation Code (IECC), published in March 2006. States, counties, and cities have begun reviewing the new code as a potential upgrade to their existing codes. Maintained under the public consensus process of the International Code Council, the IECC is designed to do just what its title says: promote the design and construction of energy-efficient homes and commercial buildings. "Homes" in this case means traditional single-family homes, duplexes, condominiums, and apartment buildings having three or fewer stories. The U.S. Department of Energy, which played a key role in proposing the changes that resulted in the new code, is offering a free training course that covers the residential provisions of the 2006 IECC.

  11. Astrophysics Source Code Library

    NASA Astrophysics Data System (ADS)

    Allen, A.; DuPrie, K.; Berriman, B.; Hanisch, R. J.; Mink, J.; Teuben, P. J.

    2013-10-01

    The Astrophysics Source Code Library (ASCL), founded in 1999, is a free on-line registry for source codes of interest to astronomers and astrophysicists. The library is housed on the discussion forum for Astronomy Picture of the Day (APOD) and can be accessed at http://ascl.net. The ASCL has a comprehensive listing that covers a significant number of the astrophysics source codes used to generate results published in or submitted to refereed journals and continues to grow. The ASCL currently has entries for over 500 codes; its records are citable and are indexed by ADS. The editors of the ASCL and members of its Advisory Committee were on hand at a demonstration table in the ADASS poster room to present the ASCL, accept code submissions, show how the ASCL is starting to be used by the astrophysics community, and take questions on and suggestions for improving the resource.

  12. The Comparative Performance of Conditional Independence Indices

    ERIC Educational Resources Information Center

    Kim, Doyoung; De Ayala, R. J.; Ferdous, Abdullah A.; Nering, Michael L.

    2011-01-01

    To realize the benefits of item response theory (IRT), one must have model-data fit. One facet of a model-data fit investigation involves assessing the tenability of the conditional item independence (CII) assumption. In this Monte Carlo study, the comparative performance of 10 indices for identifying conditional item dependence is assessed. The…

  13. Who Succeeds in an Independent, Open Laboratory?

    ERIC Educational Resources Information Center

    Halyard, Rebecca A.; And Others

    This paper reports a study which investigated student characteristics for predicting success in an independent, open laboratory. Data were gathered from introductory biology students (N=98) enrolled in a public, two-year college near Atlanta, Georgia. Stepwise multiple regression analysis was used to determine the effects of age, sex, laboratory…

  14. Arithmetic coding as a non-linear dynamical system

    NASA Astrophysics Data System (ADS)

    Nagaraj, Nithin; Vaidya, Prabhakar G.; Bhat, Kishor G.

    2009-04-01

    In order to perform source coding (data compression), we treat messages emitted by independent and identically distributed sources as imprecise measurements (symbolic sequence) of a chaotic, ergodic, Lebesgue measure preserving, non-linear dynamical system known as Generalized Luröth Series (GLS). GLS achieves Shannon's entropy bound and turns out to be a generalization of arithmetic coding, a popular source coding algorithm, used in international compression standards such as JPEG2000 and H.264. We further generalize GLS to piecewise non-linear maps (Skewed-nGLS). We motivate the use of Skewed-nGLS as a framework for joint source coding and encryption.
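
    The equivalence is easy to exhibit for a binary i.i.d. source: arithmetic encoding shrinks [0, 1) to a subinterval, and decoding is exactly forward iteration of the piecewise-linear GLS map, whose two branches rescale [0, p0) and [p0, 1) back onto [0, 1). A minimal sketch (assuming a known symbol count and ignoring finite-precision renormalization):

      def encode(bits, p0):
          """Arithmetic coding: shrink [low, high) once per symbol."""
          low, high = 0.0, 1.0
          for b in bits:
              mid = low + (high - low) * p0
              low, high = (low, mid) if b == 0 else (mid, high)
          return (low + high) / 2          # any point in the final interval

      def decode(x, p0, n):
          """Decoding = iterating the piecewise-linear GLS map T on x."""
          out = []
          for _ in range(n):
              if x < p0:
                  out.append(0); x = x / p0                 # left branch of T
              else:
                  out.append(1); x = (x - p0) / (1 - p0)    # right branch of T
          return out

      bits = [1, 0, 0, 1, 1, 0, 1, 0]
      x = encode(bits, p0=0.7)             # p0 = P(symbol 0), assumed known
      assert decode(x, 0.7, len(bits)) == bits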

  15. Performance of concatenated Reed-Solomon trellis-coded modulation over Rician fading channels

    NASA Technical Reports Server (NTRS)

    Moher, Michael L.; Lodge, John H.

    1990-01-01

    A concatenated coding scheme for providing very reliable data over mobile-satellite channels at power levels similar to those used for vocoded speech is described. The outer code is a shorter Reed-Solomon code which provides error detection as well as error correction capabilities. The inner code is a 1-D 8-state trellis code applied independently to both the inphase and quadrature channels. To achieve the full error correction potential of this inner code, the code symbols are multiplexed with a pilot sequence which is used to provide dynamic channel estimation and coherent detection. The implementation structure of this scheme is discussed and its performance is estimated.

  16. Value of Laboratory Experiments for Code Validations

    SciTech Connect

    Wawersik, W.R.

    1998-12-14

    Numerical codes have become indispensable for designing underground structures and interpretating the behavior of geologic systems. Because of the complexities of geologic systems, however, code calculations often are associated with large quantitative uncertainties. This papers presents three examples to demonstrate the value of laboratory(or bench scale) experiments to evaluate the predictive capabilities of such codes with five major conclusions: Laboratory or bench-scale experiments are a very cost-effective, controlled means of evaluating and validating numerical codes, not instead of but before or at least concurrent with the implementation of in situ studies. The design of good laboratory validation tests must identifj what aspects of a code are to be scrutinized in order to optimize the size, geometry, boundary conditions, and duration of the experiments. The design of good and sometimes difficult numerical analyses and sensitivity studies. Laboratory validation tests must involve: Good validation experiments will generate independent data sets to identify the combined effect of constitutive models, model generalizations, material parameters, and numerical algorithms. Successfid validations of numerical codes mandate a close collaboration between experimentalists and analysts drawing from the full gamut of observations, measurements, and mathematical results.

  17. Distributed transform coding via source-splitting

    NASA Astrophysics Data System (ADS)

    Yahampath, Pradeepa

    2012-12-01

    Transform coding (TC) is one of the best known practical methods for quantizing high-dimensional vectors. In this article, a practical approach to distributed TC of jointly Gaussian vectors is presented. This approach, referred to as source-split distributed transform coding (SP-DTC), can be used to easily implement two terminal transform codes for any given rate-pair. The main idea is to apply source-splitting using orthogonal-transforms, so that only Wyner-Ziv (WZ) quantizers are required for compression of transform coefficients. This approach however requires optimizing the bit allocation among dependent sets of WZ quantizers. In order to solve this problem, a low-complexity tree-search algorithm based on analytical models for transform coefficient quantization is developed. A rate-distortion (RD) analysis of SP-DTCs for jointly Gaussian sources is presented, which indicates that these codes can significantly outperform the practical alternative of independent TC of each source, whenever there is a strong correlation between the sources. For practical implementation of SP-DTCs, the idea of using conditional entropy constrained (CEC) quantizers followed by Slepian-Wolf coding is explored. Experimental results obtained with SP-DTC designs based on both CEC scalar quantizers and CEC trellis-coded quantizers demonstrate that actual implementations of SP-DTCs can achieve RD performance close to the analytically predicted limits.
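
    The paper's tree search handles bit allocation among dependent Wyner-Ziv quantizers; for the independent-TC baseline it is compared against, the standard high-rate allocation is the familiar log-variance rule. A sketch of that baseline rule only (not of the SP-DTC search):

      import numpy as np

      def high_rate_bit_allocation(variances, total_bits):
          """Classic high-rate transform-coding allocation:
          b_k = B/N + 0.5 * log2(var_k / geometric mean of variances)."""
          variances = np.asarray(variances, dtype=float)
          n = len(variances)
          geo_mean = np.exp(np.mean(np.log(variances)))
          b = total_bits / n + 0.5 * np.log2(variances / geo_mean)
          return np.clip(b, 0, None)       # negative allocations clipped to zero

      # coefficients with variances 16, 4, 1, 0.25 and a 16-bit budget
      print(high_rate_bit_allocation([16.0, 4.0, 1.0, 0.25], total_bits=16))
      # -> [5.5 4.5 3.5 2.5]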

  18. A distributed code for color in natural scenes derived from center-surround filtered cone signals

    PubMed Central

    Kellner, Christian J.; Wachtler, Thomas

    2013-01-01

    In the retina of trichromatic primates, chromatic information is encoded in an opponent fashion and transmitted to the lateral geniculate nucleus (LGN) and visual cortex via parallel pathways. Chromatic selectivities of neurons in the LGN form two separate clusters, corresponding to two classes of cone opponency. In the visual cortex, however, the chromatic selectivities are more distributed, which is in accordance with a population code for color. Previous studies of cone signals in natural scenes typically found opponent codes with chromatic selectivities corresponding to two directions in color space. Here we investigated how the non-linear spatio-chromatic filtering in the retina influences the encoding of color signals. Cone signals were derived from hyper-spectral images of natural scenes and preprocessed by center-surround filtering and rectification, resulting in parallel ON and OFF channels. Independent Component Analysis (ICA) on these signals yielded a highly sparse code with basis functions that showed spatio-chromatic selectivities. In contrast to previous analyses of linear transformations of cone signals, chromatic selectivities were not restricted to two main chromatic axes, but were more continuously distributed in color space, similar to the population code of color in the early visual cortex. Our results indicate that spatio-chromatic processing in the retina leads to a more distributed and more efficient code for natural scenes. PMID:24098289
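
    The processing chain described above (center-surround filtering, rectification into parallel ON/OFF channels, then ICA) can be outlined in a few lines. The sketch below substitutes a difference-of-Gaussians filter and random placeholder patches for the hyperspectral cone data, so it shows the pipeline shape only, not the paper's results; on real cone patches the rows of ica.components_ would be the spatio-chromatic basis functions analyzed in the paper.

      import numpy as np
      from scipy.ndimage import gaussian_filter
      from sklearn.decomposition import FastICA

      def center_surround(channel, sigma_c=1.0, sigma_s=2.5):
          """Difference-of-Gaussians as a stand-in for retinal center-surround."""
          return gaussian_filter(channel, sigma_c) - gaussian_filter(channel, sigma_s)

      def on_off(x):
          """Rectification into parallel ON and OFF channels."""
          return np.concatenate([np.maximum(x, 0), np.maximum(-x, 0)])

      # cone_patches: (n_patches, h, w, 3) L, M, S cone responses; random
      # placeholder data here (real use would draw from hyperspectral images).
      rng = np.random.default_rng(1)
      cone_patches = rng.standard_normal((5000, 8, 8, 3))

      X = np.stack([
          on_off(np.stack([center_surround(p[..., c]) for c in range(3)]).ravel())
          for p in cone_patches
      ])
      ica = FastICA(n_components=40, max_iter=500, random_state=0)
      ica.fit(X)           # rows of ica.components_ ~ spatio-chromatic filters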

  19. Clinical Reasoning of Physical Therapists regarding In-hospital Walking Independence of Patients with Hemiplegia

    PubMed Central

    Takahashi, Junpei; Takami, Akiyoshi; Wakayama, Saichi

    2014-01-01

    [Purpose] Physical therapists must often determine whether hemiparetic patients can walk independently. However, there are no criteria, so decisions are often left to individual physical therapists. The purpose of this study was to explore how physical therapists determine whether a patient with hemiplegia can walk independently in a ward. [Methods] The subjects were 15 physical therapists with experience of stroke patients’ rehabilitation. We interviewed them using semi-structured interviews related to the criteria of the states of walking in the ward of hemiparetic patients. The interviews were transcribed in full, and the texts were analyzed by coding and grouping. [Results] From the results of the interviews, PTs determined patients’ independence of walking in hospital by observation of behavior during walking or treatment. The majority of PTs focused on the patients’ state during walking, higher brain function, and their ability to balance. In addition, they often asked ward staff about patients’ daily life, and self-determination. [Conclusions] We identified the items examined by physical therapists when determining the in-hospital walking independence of stroke patients. Further investigation is required to examine which of these items are truly necessary. PMID:24926149

  20. MHDust: A 3-fluid dusty plasma code

    NASA Astrophysics Data System (ADS)

    Lazerson, Samuel

    MHDust is a next generation 3-fluid magnetized dusty plasma code, treating the inertial dynamics of both the dust and ion components. Coded in ANSI C, the numerical method employs Leap-Frog and Dufort-Frankel integration schemes. Features include: nonlinear collisional terms, quasi-neutrality or continuity based electron densities, and dynamical dust charge number. Tests of wave-mode propagation (acoustic and electromagnetic) allow a comparison to linear wave mode theory. Additional nonlinear phenomena are presented, including magnetic reconnection and shear-flow instabilities. Relevant parameters for the space environment are considered, allowing a comparison to be made with previous dusty plasma codes (DENISIS). The utility of the code is expanded through the possibility of small dust mass. This allows MHDust to be used as a 2-ion plasma code. MHDust considerably expands the range of numerical investigations into nonlinear phenomena in the field of astrophysical dusty plasmas.
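
    Of the two integration schemes named, Dufort-Frankel is the less familiar: an explicit three-level scheme that remains stable for diffusion-type terms. A one-dimensional diffusion sketch of the update (our illustration, not code extracted from MHDust):

      import numpy as np

      def dufort_frankel(u_prev, u_curr, alpha):
          """One Dufort-Frankel step for u_t = D*u_xx, with alpha = D*dt/dx**2.
          Three time levels: returns level n+1 from levels n and n-1."""
          u_next = u_curr.copy()
          u_next[1:-1] = ((1 - 2 * alpha) * u_prev[1:-1]
                          + 2 * alpha * (u_curr[2:] + u_curr[:-2])) / (1 + 2 * alpha)
          return u_next

      x = np.linspace(0, 1, 101)
      u0 = np.sin(np.pi * x)            # decays like exp(-pi**2 * D * t)
      u1 = u0.copy()                    # bootstrap the second time level
      for _ in range(1000):
          u0, u1 = u1, dufort_frankel(u0, u1, alpha=0.4)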

  1. 32 CFR 634.25 - Installation traffic codes.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 4 2010-07-01 2010-07-01 true Installation traffic codes. 634.25 Section 634.25... CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION Traffic Supervision § 634.25 Installation traffic codes. (a) Installation or activity commanders will establish a traffic code for operation of...

  2. 32 CFR 634.25 - Installation traffic codes.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 32 National Defense 4 2014-07-01 2013-07-01 true Installation traffic codes. 634.25 Section 634.25... CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION Traffic Supervision § 634.25 Installation traffic codes. (a) Installation or activity commanders will establish a traffic code for operation of...

  3. 32 CFR 634.25 - Installation traffic codes.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 32 National Defense 4 2012-07-01 2011-07-01 true Installation traffic codes. 634.25 Section 634.25... CRIMINAL INVESTIGATIONS MOTOR VEHICLE TRAFFIC SUPERVISION Traffic Supervision § 634.25 Installation traffic codes. (a) Installation or activity commanders will establish a traffic code for operation of...

  4. A robust low-rate coding scheme for packet video

    NASA Technical Reports Server (NTRS)

    Chen, Y. C.; Sayood, Khalid; Nelson, D. J.; Arikan, E. (Editor)

    1991-01-01

    Due to the rapidly evolving fields of image processing and networking, video information promises to be an important part of telecommunication systems. Although up to now video transmission has been transported mainly over circuit-switched networks, it is likely that packet-switched networks will dominate the communication world in the near future. Asynchronous transfer mode (ATM) techniques in broadband-ISDN can provide a flexible, independent and high performance environment for video communication. For this paper, the network simulator was used only as a channel. Mixture block coding with progressive transmission (MBCPT) has been investigated for use over packet networks and has been found to provide a high compression rate with good visual performance, robustness to packet loss, tractable integration with network mechanics and simplicity in parallel implementation.

  5. Quantum convolutional codes derived from constacyclic codes

    NASA Astrophysics Data System (ADS)

    Yan, Tingsu; Huang, Xinmei; Tang, Yuansheng

    2014-12-01

    In this paper, three families of quantum convolutional codes are constructed. The first one and the second one can be regarded as a generalization of Theorems 3, 4, 7 and 8 [J. Chen, J. Li, F. Yang and Y. Huang, Int. J. Theor. Phys., doi:10.1007/s10773-014-2214-6 (2014)], in the sense that we drop the constraint q ≡ 1 (mod 4). Furthermore, the second one and the third one attain the quantum generalized Singleton bound.

  6. Huffman coding in advanced audio coding standard

    NASA Astrophysics Data System (ADS)

    Brzuchalski, Grzegorz

    2012-05-01

    This article presents several hardware architectures of the Advanced Audio Coding (AAC) Huffman noiseless encoder, its optimisations, and a working implementation. Much attention has been paid to optimising the demand on hardware resources, especially memory size. The aim of the design was to produce as short a binary stream as possible within this standard. The Huffman encoder, together with the whole audio-video system, has been implemented in FPGA devices.

  7. Report number codes

    SciTech Connect

    Nelson, R.N.

    1985-05-01

    This publication lists all report number codes processed by the Office of Scientific and Technical Information. The report codes are substantially based on the American National Standards Institute, Standard Technical Report Number (STRN)-Format and Creation Z39.23-1983. The Standard Technical Report Number (STRN) provides one of the primary methods of identifying a specific technical report. The STRN consists of two parts: The report code and the sequential number. The report code identifies the issuing organization, a specific program, or a type of document. The sequential number, which is assigned in sequence by each report issuing entity, is not included in this publication. Part I of this compilation is alphabetized by report codes followed by issuing installations. Part II lists the issuing organization followed by the assigned report code(s). In both Parts I and II, the names of issuing organizations appear for the most part in the form used at the time the reports were issued. However, for some of the more prolific installations which have had name changes, all entries have been merged under the current name.

  8. Information coding in artificial olfaction multisensor arrays.

    PubMed

    Albert, Keith J; Walt, David R

    2003-08-15

    High-density sensor arrays were prepared with microbead vapor sensors to explore and compare the information coded in sensor response profiles following odor stimulus. The coded information in the sensor-odor response profiles, which is used for odor discrimination purposes, was extracted from the microsensor arrays via two different approaches. In the first approach, the responses from individual microsensors were separated (decoded array) and independently processed. In the second approach, response profiles from all microsensors within the entire array, i.e., the sensor ensemble, were combined to create one response per odor stimulus (nondecoded array). Although the amount of response data is markedly reduced in the second approach, the system shows comparable odor discrimination rates for the two signal extraction methods. The ensemble approach streamlines system resources without decreasing system performance. These signal compression approaches may simulate or parallel information coding in the mammalian olfactory system. PMID:14632130

  9. 3D neutronic codes coupled with thermal-hydraulic system codes for PWR, BWR and VVER reactors

    SciTech Connect

    Langenbuch, S.; Velkov, K.; Lizorkin, M.

    1997-07-01

    This paper describes the objectives of code development for coupling 3D neutronics codes with thermal-hydraulic system codes. The present status of coupling ATHLET with three 3D neutronics codes for VVER- and LWR-reactors is presented. After describing the basic features of the 3D neutronic codes BIPR-8 from Kurchatov-Institute, DYN3D from Research Center Rossendorf and QUABOX/CUBBOX from GRS, first applications of coupled codes for different transient and accident scenarios are presented. The need of further investigations is discussed.

  10. Honor Codes and Other Contextual Influences on Academic Integrity: A Replication and Extension to Modified Honor Code Settings.

    ERIC Educational Resources Information Center

    McCabe, Donald L.; Trevino, Linda Klebe; Butterfield, Kenneth D.

    2002-01-01

    Investigated the influence of modified honor codes, an alternative to traditional codes that is gaining popularity on larger campuses. Also tested the model of student academic dishonesty previously suggested by McCabe and Trevino. Found that modified honor codes are associated with lower levels of student dishonesty and that the McCabe Trevino…

  11. TRANSF code user manual

    SciTech Connect

    Weaver, H.J.

    1981-11-01

    The TRANSF code is a semi-interactive FORTRAN IV program which is designed to calculate the model parameters of a (structural) system by performing a least-squares parameter fit to measured transfer function data. The code is available at LLNL on both the 7600 and the Cray machines. The transfer function data to be fit are read into the code via a disk file. The primary mode of output is FR80 graphics, although it is also possible to have results written either to the TTY or to a disk file.
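
    The underlying computation, a least-squares fit of modal parameters to measured transfer-function data, can be sketched with a modern stand-in (a single-mode model and scipy's least_squares; the FORTRAN code's model form and fitting algorithm are not reproduced here):

      import numpy as np
      from scipy.optimize import least_squares

      def modal_tf(w, params):
          """Single-mode transfer function H(w) = r / (wn**2 - w**2 + 2j*z*wn*w)."""
          r, wn, z = params
          return r / (wn**2 - w**2 + 2j * z * wn * w)

      def residuals(params, w, h_meas):
          h = modal_tf(w, params)
          return np.concatenate([(h - h_meas).real, (h - h_meas).imag])

      w = np.linspace(1, 50, 200)
      true = (4.0, 20.0, 0.05)             # synthetic "measured" data
      h_meas = modal_tf(w, true) + 0.001 * np.random.default_rng(2).standard_normal(200)

      fit = least_squares(residuals, x0=(1.0, 15.0, 0.1), args=(w, h_meas))
      print('fitted (r, wn, zeta):', fit.x)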

  12. Local intensity adaptive image coding

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.

    1989-01-01

    The objective of preprocessing for machine vision is to extract intrinsic target properties. The most important properties ordinarily are structure and reflectance. Illumination in space, however, is a significant problem as the extreme range of light intensity, stretching from deep shadow to highly reflective surfaces in direct sunlight, impairs the effectiveness of standard approaches to machine vision. To overcome this critical constraint, an image coding scheme is being investigated which combines local intensity adaptivity, image enhancement, and data compression. It is very effective under the highly variant illumination that can exist within a single frame or field of view, and it is very robust to noise at low illuminations. Some of the theory and salient features of the coding scheme are reviewed, its performance is characterized in a simulated space application, and the research and development activities are described.

  13. Theory of quantum error-correcting codes

    SciTech Connect

    Knill, E.; Laflamme, R.

    1997-02-01

    Quantum error correction will be necessary for preserving coherent states against noise and other unwanted interactions in quantum computation and communication. We develop a general theory of quantum error correction based on encoding states into larger Hilbert spaces subject to known interactions. We obtain necessary and sufficient conditions for the perfect recovery of an encoded state after its degradation by an interaction. The conditions depend only on the behavior of the logical states. We use them to give a recovery-operator-independent definition of error-correcting codes. We relate this definition to four others: the existence of a left inverse of the interaction, an explicit representation of the error syndrome using tensor products, perfect recovery of the completely entangled state, and an information theoretic identity. Two notions of fidelity and error for imperfect recovery are introduced, one for pure and the other for entangled states. The latter is more appropriate when using codes in a quantum memory or in applications of quantum teleportation to communication. We show that the error for entangled states is bounded linearly by the error for pure states. A formal definition of independent interactions for qubits is given. This leads to lower bounds on the number of qubits required to correct e errors and a formal proof that the classical bounds on the probability of error of e-error-correcting codes apply to e-error-correcting quantum codes, provided that the interaction is dominated by an identity component. Copyright 1997 The American Physical Society.
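
    The necessary and sufficient conditions mentioned above are the Knill-Laflamme conditions: P Ei† Ej P = a_ij P, where P projects onto the code space and the Ei are the correctable errors. They can be verified numerically for the three-qubit bit-flip code under single bit-flip errors, as in this sketch:

      import numpy as np
      from functools import reduce

      I2 = np.eye(2)
      X = np.array([[0, 1], [1, 0]])

      def kron(*ops):
          return reduce(np.kron, ops)

      # Three-qubit bit-flip code: logical states |000> and |111>
      ket0, ket1 = np.zeros(8), np.zeros(8)
      ket0[0] = 1.0; ket1[7] = 1.0
      P = np.outer(ket0, ket0) + np.outer(ket1, ket1)   # code-space projector

      errors = [kron(I2, I2, I2), kron(X, I2, I2),
                kron(I2, X, I2), kron(I2, I2, X)]

      # Knill-Laflamme: P Ei^dag Ej P must be proportional to P for all i, j
      for i, Ei in enumerate(errors):
          for j, Ej in enumerate(errors):
              M = P @ Ei.conj().T @ Ej @ P
              alpha = np.trace(M) / np.trace(P)
              assert np.allclose(M, alpha * P), (i, j)
      print('KL conditions hold for single bit-flip errors')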

  14. Applications of numerical codes to space plasma problems

    NASA Technical Reports Server (NTRS)

    Northrop, T. G.; Birmingham, T. J.; Jones, F. C.; Wu, C. S.

    1975-01-01

    Solar wind, earth's bowshock, and magnetospheric convection and substorms were investigated. Topics discussed include computational physics, multifluid codes, ionospheric irregularities, and modeling laser plasmas.

  15. Optimal source codes for geometrically distributed integer alphabets

    NASA Technical Reports Server (NTRS)

    Gallager, R. G.; Van Voorhis, D. C.

    1975-01-01

    An approach is shown for using the Huffman algorithm indirectly to prove the optimality of a code for an infinite alphabet if an estimate concerning the nature of the code can be made. Attention is given to nonnegative integers with a geometric probability assignment. The particular distribution considered arises in run-length coding and in encoding protocol information in data networks. Questions of redundancy of the optimal code are also investigated.
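
    The optimal codes this paper identifies for the geometric distribution P(n) = (1 - θ)θ^n are the Golomb codes, with the parameter m chosen as the smallest integer satisfying θ^m + θ^(m+1) <= 1. A Python sketch of the encoder and that parameter rule:

      from math import ceil, log

      def golomb_encode(n, m):
          """Golomb code for nonnegative integer n with parameter m:
          unary quotient, then truncated-binary remainder."""
          q, r = divmod(n, m)
          code = '1' * q + '0'                 # unary part
          b = m.bit_length()
          if m & (m - 1) == 0:                 # m a power of two: Rice code
              return code + format(r, f'0{b - 1}b') if m > 1 else code
          cutoff = (1 << b) - m                # truncated binary remainder
          if r < cutoff:
              return code + format(r, f'0{b - 1}b')
          return code + format(r + cutoff, f'0{b}b')

      def optimal_m(theta):
          """Smallest m with theta**m + theta**(m+1) <= 1 (Gallager &
          Van Voorhis), in closed form."""
          return ceil(-log(1 + theta, theta))

      theta = 0.8                              # P(n) = (1 - theta) * theta**n
      m = optimal_m(theta)                     # -> 3
      print(m, [golomb_encode(n, m) for n in range(6)])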

  16. Trellis coding with multidimensional QAM signal sets

    NASA Technical Reports Server (NTRS)

    Pietrobon, Steven S.; Costello, Daniel J.

    1993-01-01

    Trellis coding using multidimensional QAM signal sets is investigated. Finite-size 2D signal sets are presented that have minimum average energy, are 90-deg rotationally symmetric, and have from 16 to 1024 points. The best trellis codes using the finite 16-QAM signal set with two, four, six, and eight dimensions are found by computer search (the multidimensional signal set is constructed from the 2D signal set). The best moderate complexity trellis codes for infinite lattices with two, four, six, and eight dimensions are also found. The minimum free squared Euclidean distance and number of nearest neighbors for these codes were used as the selection criteria. Many of the multidimensional codes are fully rotationally invariant and give asymptotic coding gains up to 6.0 dB. From the infinite lattice codes, the best codes for transmitting J, J + 1/4, J + 1/3, J + 1/2, J + 2/3, and J + 3/4 bit/sym (J an integer) are presented.

  17. FORTRAN code-evaluation system

    NASA Technical Reports Server (NTRS)

    Capps, J. D.; Kleir, R.

    1977-01-01

    An automated code-evaluation system can be used to detect coding errors and unsound coding practices in any ANSI FORTRAN IV source code before they can cause execution-time malfunctions. The system concentrates on FORTRAN code features that are acceptable to the compiler but likely to produce undesirable results.

  18. Multi-level bandwidth efficient block modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu

    1989-01-01

    The multilevel technique is investigated for combining block coding and modulation. There are four parts. In the first part, a formulation is presented for signal sets on which modulation codes are to be constructed. Distance measures on a signal set are defined and their properties are developed. In the second part, a general formulation is presented for multilevel modulation codes in terms of component codes with appropriate Euclidean distances. The distance properties, Euclidean weight distribution and linear structure of multilevel modulation codes are investigated. In the third part, several specific methods for constructing multilevel block modulation codes with interdependency among component codes are proposed. Given a multilevel block modulation code C with no interdependency among the binary component codes, the proposed methods give a multilevel block modulation code C' which has the same rate as C, a minimum squared Euclidean distance not less than that of C, a trellis diagram with the same number of states as that of C, and a smaller number of nearest-neighbor codewords than that of C. In the last part, error performance of block modulation codes is analyzed for an AWGN channel based on soft-decision maximum likelihood decoding. Error probabilities of some specific codes are evaluated based on their Euclidean weight distributions and simulation results.

  19. Statistical mechanics of error-correcting codes

    NASA Astrophysics Data System (ADS)

    Kabashima, Y.; Saad, D.

    1999-01-01

    We investigate the performance of error-correcting codes, where the code word comprises products of K bits selected from the original message and decoding is carried out utilizing a connectivity tensor with C connections per index. Shannon's bound for the channel capacity is recovered for large K and zero temperature when the code rate K/C is finite. Close to optimal error-correcting capability is obtained for finite K and C. We examine the finite-temperature case to assess the use of simulated annealing for decoding and extend the analysis to accommodate other types of noisy channels.
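
    The construction described, with codeword bits formed as products of K message bits and decoding carried out at finite temperature, is the Sourlas code, and simulated-annealing decoding can be sketched directly as a spin system (the sizes, rate, noise level, and annealing schedule below are arbitrary choices; the overlap should approach 1 when decoding succeeds):

      import numpy as np

      rng = np.random.default_rng(3)
      N, K, n_checks = 32, 3, 256           # finite code rate N/n_checks = 1/8
      x = rng.choice([-1, 1], size=N)       # message spins
      idx = rng.integers(0, N, size=(n_checks, K))
      flips = rng.choice([1, -1], p=[0.9, 0.1], size=n_checks)   # noisy channel
      J = flips * np.prod(x[idx], axis=1)   # received codeword bits

      def energy(s):                        # spin-glass energy of a candidate
          return -np.sum(J * np.prod(s[idx], axis=1))

      s = rng.choice([-1, 1], size=N)       # decode by simulated annealing
      E = energy(s)
      for beta in np.linspace(0.1, 3.0, 60):        # inverse-temperature ramp
          for _ in range(200):
              i = rng.integers(N)
              s[i] *= -1                            # propose a single-spin flip
              dE = energy(s) - E
              if dE <= 0 or rng.random() < np.exp(-beta * dE):
                  E += dE                           # accept
              else:
                  s[i] *= -1                        # reject
      print('overlap with message:', np.mean(s * x))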

  20. Methodology, status and plans for development and assessment of the code ATHLET

    SciTech Connect

    Teschendorff, V.; Austregesilo, H.; Lerchl, G.

    1997-07-01

    The thermal-hydraulic computer code ATHLET (Analysis of THermal-hydraulics of LEaks and Transients) is being developed by the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) for the analysis of anticipated and abnormal plant transients, small and intermediate leaks as well as large breaks in light water reactors. The aim of the code development is to cover the whole spectrum of design basis and beyond design basis accidents (without core degradation) for PWRs and BWRs with only one code. The main code features are: advanced thermal-hydraulics; modular code architecture; separation between physical models and numerical methods; pre- and post-processing tools; portability. The code has features that are of special interest for applications to small leaks and transients with accident management, e.g. initialization by a steady-state calculation, full-range drift-flux model, dynamic mixture level tracking. The General Control Simulation Module of ATHLET is a flexible tool for the simulation of the balance-of-plant and control systems including the various operator actions in the course of accident sequences with AM measures. The code development is accompanied by a systematic and comprehensive validation program. A large number of integral experiments and separate effect tests, including the major International Standard Problems, have been calculated by GRS and by independent organizations. The ATHLET validation matrix is a well balanced set of integral and separate effects tests derived from the CSNI proposal emphasizing, however, the German combined ECC injection system which was investigated in the UPTF, PKL and LOBI test facilities.

  1. FAST2 Code validation

    SciTech Connect

    Wilson, R.E.; Freeman, L.N.; Walker, S.N.

    1995-09-01

    The FAST2 Code which is capable of determining structural loads of a flexible, teetering, horizontal axis wind turbine is described and comparisons of calculated loads with test data at two wind speeds for the ESI-80 are given. The FAST2 Code models a two-bladed HAWT with degrees of freedom for blade flap, teeter, drive train flexibility, yaw, and windwise and crosswind tower motion. The code allows blade dimensions, stiffness, and weights to differ and models tower shadow, wind shear, and turbulence. Additionally, dynamic stall is included as are delta-3 and an underslung rotor. Load comparisons are made with ESI-80 test data in the form of power spectral density, rainflow counting, occurrence histograms and azimuth averaged bin plots. It is concluded that agreement between the FAST2 Code and test results is good.

  2. Compressible Astrophysics Simulation Code

    Energy Science and Technology Software Center (ESTSC)

    2007-07-18

    This is an astrophysics simulation code involving a radiation diffusion module developed at LLNL coupled to compressible hydrodynamics and adaptive mesh infrastructure developed at LBNL. One intended application is to neutrino diffusion in core collapse supernovae.

  3. Seals Flow Code Development

    NASA Technical Reports Server (NTRS)

    1991-01-01

    In recognition of a deficiency in the current modeling capability for seals, an effort was established by NASA to develop verified computational fluid dynamic concepts, codes, and analyses for seals. The objectives were to develop advanced concepts for the design and analysis of seals, to effectively disseminate the information to potential users by way of annual workshops, and to provide experimental verification for the models and codes under a wide range of operating conditions.

  4. Knowledge and Performance about Nursing Ethic Codes from Nurses' and Patients' Perspective in Tabriz Teaching Hospitals, Iran

    PubMed Central

    Mohajjel-Aghdam, Alireza; Hassankhani, Hadi; Zamanzadeh, Vahid; Khameneh, Saied; Moghaddam, Sara

    2013-01-01

    Introduction: The nursing profession requires knowledge of ethics to guide performance. The nature of this profession necessitates ethical care even more than routine care. Today, professional ethic codes are defined worldwide on the basis of the human and ethical issues that arise in communication between nurse and patient. To improve all dimensions of nursing, ethic codes must be respected. The aim of this study is to assess knowledge of and performance regarding nursing ethic codes from nurses' and patients' perspectives. Methods: A descriptive study was conducted on 345 nurses and 500 inpatients in six teaching hospitals of Tabriz in 2012. To investigate nurses' knowledge and performance, data were collected using structured questionnaires. Statistical analysis was done using descriptive and analytic statistics (independent t-test, ANOVA, and the Pearson correlation coefficient) in SPSS 13. Results: Most of the nurses were female, married, and educated to the BS degree; 86.4% of them were aware of the ethic codes, and 91.9% of nurses and 41.8% of patients reported that nurses respect the ethic codes. Nurses' and patients' perspectives on the ethic codes differed significantly. A significant relationship was found between nurses' knowledge of ethic codes and job satisfaction and complaints about ethical performance. Conclusion: According to the results, attention to teaching ethic codes in the nursing curriculum for students and continuous education for staff is proposed; on the other hand, recognizing failures of the health system, optimizing nursing care, attempting to inform patients about nursing ethic codes, promoting patient rights, and achieving patient satisfaction can minimize the differences between the two perspectives. PMID:25276730

  5. A cascaded coding scheme for error control

    NASA Technical Reports Server (NTRS)

    Kasami, T.; Lin, S.

    1985-01-01

    A cascaded coding scheme for error control was investigated. The scheme employs a combination of hard and soft decisions in decoding. Error performance is analyzed. If the inner and outer codes are chosen properly, extremely high reliability can be attained even for a high channel bit-error-rate. Some example schemes are studied which seem to be quite suitable for satellite down-link error control.

  6. Robust Nonlinear Neural Codes

    NASA Astrophysics Data System (ADS)

    Yang, Qianli; Pitkow, Xaq

    2015-03-01

    Most interesting natural sensory stimuli are encoded in the brain in a form that can only be decoded nonlinearly. But despite being a core function of the brain, nonlinear population codes are rarely studied and poorly understood. Interestingly, the few existing models of nonlinear codes are inconsistent with known architectural features of the brain. In particular, these codes have information content that scales with the size of the cortical population, even if that violates the data processing inequality by exceeding the amount of information entering the sensory system. Here we provide a valid theory of nonlinear population codes by generalizing recent work on information-limiting correlations in linear population codes. Although these generalized, nonlinear information-limiting correlations bound the performance of any decoder, they also make decoding more robust to suboptimal computation, allowing many suboptimal decoders to achieve nearly the same efficiency as an optimal decoder. Although these correlations are extremely difficult to measure directly, particularly for nonlinear codes, we provide a simple, practical test by which one can use choice-related activity in small populations of neurons to determine whether decoding is suboptimal or optimal and limited by correlated noise. We conclude by describing an example computation in the vestibular system where this theory applies. QY and XP were supported by a grant from the McNair Foundation.

  7. Ideal Binocular Disparity Detectors Learned Using Independent Subspace Analysis on Binocular Natural Image Pairs

    PubMed Central

    Hunter, David W.; Hibbard, Paul B.

    2016-01-01

    An influential theory of mammalian vision, known as the efficient coding hypothesis, holds that early stages in the visual cortex attempt to form an efficient coding of ecologically valid stimuli. Although numerous authors have successfully modelled some aspects of early vision mathematically, closer inspection has found substantial discrepancies between the predictions of some of these models and observations of neurons in the visual cortex. In particular, analysis of linear-non-linear models of simple cells using Independent Component Analysis has found a strong bias towards features on the horopter. In order to investigate the link between the information content of binocular images, mathematical models of complex cells, and physiological recordings, we applied Independent Subspace Analysis to binocular image patches in order to learn a set of complex-cell-like models. We found that these complex-cell-like models exhibited a wide range of binocular disparity-discriminability, although only a minority exhibited high binocular discrimination scores. However, in common with the linear-non-linear model case, we found that feature detection was limited to the horopter, suggesting that current mathematical models are limited in their ability to explain the functionality of the visual cortex. PMID:26982184

  8. Marketing Handbook for Independent Schools.

    ERIC Educational Resources Information Center

    Boarding Schools, Boston, MA.

    This publication is a resource to help independent schools attract more families to their institutions and to increase the voluntary support by the larger community surrounding the school. The first chapter attempts to dispel misconceptions, define pertinent marketing terms, and relate their importance to independent schools. The rest of the book…

  9. Independent Learning Models: A Comparison.

    ERIC Educational Resources Information Center

    Wickett, R. E. Y.

    Five models of independent learning are suitable for use in adult education programs. The common factor is a facilitator who works in some way with the student in the learning process. The models display different characteristics, including the extent of independence in relation to content and/or process. Nondirective tutorial instruction and learning…

  10. Reversibility and efficiency in coding protein information.

    PubMed

    Tamir, Boaz; Priel, Avner

    2010-12-21

    Why does the genetic code have a fixed length? Protein information is transferred by coding each amino acid using codons whose length equals 3 for all amino acids. Hence the most probable and the least probable amino acids get codewords of equal length. Moreover, the distributions of amino acids found in nature are not uniform, and therefore the efficiency of such codes is sub-optimal. The origins of these apparently non-efficient codes are yet unclear. In this paper we propose an a priori argument for the energy efficiency of such codes resulting from their reversibility, in contrast to their time inefficiency. Such codes are reversible in the sense that a primitive processor, reading three letters in each step, can always reverse its operation, undoing its process. We examine the codes for the distributions of amino acids that exist in nature and show that they could not be both time efficient and reversible. We investigate a family of Zipf-type distributions and present their efficient (non-fixed length) prefix code, their graphs, and the condition for their reversibility. We prove that for a large family of such distributions, if the code is time efficient, it could not be reversible. In other words, if pre-biotic processes demand reversibility, the protein code could not be time efficient. The benefits of reversibility are clear: reversible processes are adiabatic, namely, they dissipate a very small amount of energy. Such processes must be done slowly enough; therefore time efficiency is unimportant. It is reasonable to assume that early biochemical complexes were more prone towards energy efficiency, where forward and backward processes were almost symmetrical. PMID:20868696
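
    To make the fixed-length versus variable-length contrast concrete, here is a minimal sketch that builds a binary prefix code for a hypothetical Zipf-like distribution over 20 symbols; it uses a standard Huffman construction, not the authors' specific prefix code, and the weights are placeholders rather than the natural amino acid frequencies.

        import heapq

        def huffman_code(freqs):
            # Build a binary prefix (Huffman) code for a {symbol: weight} dict.
            heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(sorted(freqs.items()))]
            heapq.heapify(heap)
            uid = len(heap)
            while len(heap) > 1:
                w1, _, c1 = heapq.heappop(heap)
                w2, _, c2 = heapq.heappop(heap)
                merged = {s: "0" + b for s, b in c1.items()}
                merged.update({s: "1" + b for s, b in c2.items()})
                heapq.heappush(heap, (w1 + w2, uid, merged))
                uid += 1
            return heap[0][2]

        # Hypothetical Zipf-like weights over 20 amino-acid symbols (illustration only).
        symbols = [chr(ord("A") + i) for i in range(20)]
        weights = {s: 1.0 / (r + 1) for r, s in enumerate(symbols)}
        code = huffman_code(weights)
        total = sum(weights.values())
        avg = sum(weights[s] * len(code[s]) for s in symbols) / total
        print(f"average codeword length: {avg:.2f} bits "
              f"(a fixed-length binary code for 20 symbols needs 5 bits)")

    Such a variable-length code is length (time) efficient; the record's point is that, unlike the fixed-length codon scheme, it need not be reversible by a processor reading a fixed number of letters per step.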

  11. Code Seal v 1.0

    SciTech Connect

    Chavez, Adrian; & Anderson, William

    2009-12-11

    CodeSeal is a Sandia National Laboratories developed technology that provides a means of securely obfuscating finite state machines in a mathematically provable way. The technology was developed in order to provide a solution for anti-reverse engineering, assured execution, and integrity of execution. CodeSeal accomplishes these goals with the addition of the concept of a trust anchor, a small piece of trust integrated into the system, to the model of code obfuscation. Code obfuscation is an active area of academic research, but most findings have merely demonstrated that general obfuscation is impossible. By modifying the security model such that we may rely on the presence of a small, tamper-protected device, however, Sandia has developed an effective method for obfuscating code. An open publication describing the technology in more detail can be found at http://eprint.iacr.org/2008/184.pdf. Keywords: Independent Software/Hardware monitors, Use control, Supervisory Control And Data Acquisition (SCADA), Algorithm obfuscation

  12. Spherical hashing: binary code embedding with hyperspheres.

    PubMed

    Heo, Jae-Pil; Lee, Youngwoon; He, Junfeng; Chang, Shih-Fu; Yoon, Sung-Eui

    2015-11-01

    Many binary code embedding schemes have been actively studied recently, since they can provide efficient similarity search, and compact data representations suitable for handling large scale image databases. Existing binary code embedding techniques encode high-dimensional data by using hyperplane-based hashing functions. In this paper we propose a novel hypersphere-based hashing function, spherical hashing, to map more spatially coherent data points into a binary code compared to hyperplane-based hashing functions. We also propose a new binary code distance function, spherical Hamming distance, tailored for our hypersphere-based binary coding scheme, and design an efficient iterative optimization process to achieve both balanced partitioning for each hash function and independence between hashing functions. Furthermore, we generalize spherical hashing to support various similarity measures defined by kernel functions. Our extensive experiments show that our spherical hashing technique significantly outperforms state-of-the-art techniques based on hyperplanes across various benchmarks with sizes ranging from one to 75 million GIST, BoW, and VLAD descriptors. The performance gains are consistent and large, up to 100 percent improvement over the second-best tested method. These results confirm the unique merits of using hyperspheres to encode proximity regions in high-dimensional spaces. Finally, our method is intuitive and easy to implement. PMID:26440269
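
    A minimal sketch of the hypersphere encoding idea: bit i of a code is set when the point falls inside hypersphere i. The pivots and radii below are simple placeholders (the paper optimizes them iteratively for balanced partitioning and pairwise independence), and the distance function is one common formulation of a spherical Hamming distance, stated here as an assumption.

        import numpy as np

        rng = np.random.default_rng(0)
        dim, n_bits, n_points = 64, 8, 1000
        data = rng.normal(size=(n_points, dim))

        # Placeholder pivots; median radii make each sphere cover about half the
        # data, a simple stand-in for the paper's balanced-partition criterion.
        pivots = data[rng.choice(n_points, n_bits, replace=False)]
        radii = np.array([np.median(np.linalg.norm(data - p, axis=1)) for p in pivots])

        def encode(x):
            # Bit b is 1 iff x falls inside hypersphere b.
            return (np.linalg.norm(x - pivots, axis=1) <= radii).astype(np.uint8)

        def spherical_hamming(b1, b2):
            # Assumed form: XOR count normalized by common 1s (epsilon guards empty overlap).
            return np.sum(b1 ^ b2) / (np.sum(b1 & b2) + 1e-9)

        c1, c2 = encode(data[0]), encode(data[1])
        print(c1, c2, spherical_hamming(c1, c2))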

  13. Code Seal v 1.0

    Energy Science and Technology Software Center (ESTSC)

    2009-12-11

    CodeSeal is a Sandia National Laboratories developed technology that provides a means of securely obfuscating finite state machines in a mathematically provable way. The technology was developed in order to provide a solution for anti-reverse engineering, assured execution, and integrity of execution. CodeSeal accomplishes these goals with the addition of the concept of a trust anchor, a small piece of trust integrated into the system, to the model of code obfuscation. Code obfuscation is an active area of academic research, but most findings have merely demonstrated that general obfuscation is impossible. By modifying the security model such that we may rely on the presence of a small, tamper-protected device, however, Sandia has developed an effective method for obfuscating code. An open publication describing the technology in more detail can be found at http://eprint.iacr.org/2008/184.pdf. Keywords: Independent Software/Hardware monitors, Use control, Supervisory Control And Data Acquisition (SCADA), Algorithm obfuscation

  14. Cotranslational signal-independent SRP preloading during membrane targeting.

    PubMed

    Chartron, Justin W; Hunt, Katherine C L; Frydman, Judith

    2016-08-11

    Ribosome-associated factors must properly decode the limited information available in nascent polypeptides to direct them to their correct cellular fate. It is unclear how the low complexity information exposed by the nascent chain suffices for accurate recognition by the many factors competing for the limited surface near the ribosomal exit site. Questions remain even for the well-studied cotranslational targeting cycle to the endoplasmic reticulum, involving recognition of linear hydrophobic signal sequences or transmembrane domains by the signal recognition particle (SRP). Notably, the SRP has low abundance relative to the large number of ribosome-nascent-chain complexes (RNCs), yet it accurately selects those destined for the endoplasmic reticulum. Despite their overlapping specificities, the SRP and the cotranslationally acting Hsp70 display precise mutually exclusive selectivity in vivo for their cognate RNCs. To understand cotranslational nascent chain recognition in vivo, here we investigate the cotranslational membrane-targeting cycle using ribosome profiling in yeast cells coupled with biochemical fractionation of ribosome populations. We show that the SRP preferentially binds secretory RNCs before their targeting signals are translated. Non-coding mRNA elements can promote this signal-independent pre-recruitment of SRP. Our study defines the complex kinetic interaction between elongation in the cytosol and determinants in the polypeptide and mRNA that modulate SRP–substrate selection and membrane targeting. PMID:27487213

  15. Teacher Evaluation in Independent Schools. An Empirical Investigation.

    ERIC Educational Resources Information Center

    Cookson, Peter W., Jr.

    1980-01-01

    Traditionally informal, teacher evaluation in the private school sector is changing. Although faculty and administrators view the situation from different perspectives, there is agreement that more formal methods are more satisfactory. Boarding schools differ significantly from day schools in their approach to teacher evaluation. (SB)

  16. INVESTIGATION OF FISCALLY INDEPENDENT AND DEPENDENT CITY SCHOOL DISTRICTS.

    ERIC Educational Resources Information Center

    GITTELL, MARILYN; AND OTHERS

    A TWO-PART COMPARATIVE ANALYSIS IS MADE OF LARGE AND SMALL CITY SCHOOL SYSTEMS. PART I ANALYZES A WIDE RANGE OF FISCAL AND NON-FISCAL VARIABLES ASSOCIATED WITH FISCAL STATUS OF CITY SCHOOL SYSTEMS. IT COVERS THE 2,788 CITY SCHOOL DISTRICTS IN THE UNITED STATES WITH ENROLLMENTS OVER 3,000. COMPLEX INTERRELATIONSHIPS SURROUNDING FISCAL STATUS IN…

  17. Multiplexed quantification for data-independent acquisition.

    PubMed

    Minogue, Catherine E; Hebert, Alexander S; Rensvold, Jarred W; Westphall, Michael S; Pagliarini, David J; Coon, Joshua J

    2015-03-01

    Data-independent acquisition (DIA) strategies provide a sensitive and reproducible alternative to data-dependent acquisition (DDA) methods for large-scale quantitative proteomic analyses. Unfortunately, DIA methods suffer from incompatibility with common multiplexed quantification methods, specifically stable isotope labeling approaches such as isobaric tags and stable isotope labeling of amino acids in cell culture (SILAC). Here we expand the use of neutron-encoded (NeuCode) SILAC to DIA applications (NeuCoDIA), producing a strategy that enables multiplexing within DIA scans without further convoluting the already complex MS2 spectra. We demonstrate duplex NeuCoDIA analysis of both mixed-ratio (1:1 and 10:1) yeast and mouse embryo myogenesis proteomes. Analysis of the mixed-ratio yeast samples revealed the strong accuracy and precision of our NeuCoDIA method, both of which were comparable to our established MS1-based quantification approach. NeuCoDIA also uncovered the dynamic protein changes that occur during myogenic differentiation, demonstrating the feasibility of this methodology for biological applications. We consequently establish DIA quantification of NeuCode SILAC as a useful and practical alternative to DDA-based approaches. PMID:25621425

  18. Multiplexed Quantification for Data-Independent Acquisition

    PubMed Central

    Minogue, Catherine E.; Hebert, Alexander S.; Rensvold, Jarred W.; Westphall, Michael S.; Pagliarini, David J.; Coon, Joshua J.

    2015-01-01

    Data-independent acquisition (DIA) strategies provide a sensitive and reproducible alternative to data-dependent acquisition (DDA) methods for large-scale quantitative proteomic analyses. Unfortunately, DIA methods suffer from incompatibility with common multiplexed quantification methods, specifically stable isotope labeling approaches such as isobaric tags and stable isotope labeling of amino acids in cell culture (SILAC). Here we expand the use of neutron-encoded (NeuCode) SILAC to DIA applications (NeuCoDIA), producing a strategy that enables multiplexing within DIA scans without further convoluting the already complex MS2 spectra. We demonstrate duplex NeuCoDIA analysis of both mixed-ratio (1:1 and 10:1) yeast and mouse embryo myogenesis proteomes. Analysis of the mixed-ratio yeast samples revealed the strong accuracy and precision of our NeuCoDIA method, both of which were comparable to our established MS1-based quantification approach. NeuCoDIA also uncovered the dynamic protein changes that occur during myogenic differentiation, demonstrating the feasibility of this methodology for biological applications. We consequently establish DIA quantification of NeuCode SILAC as a useful and practical alternative to DDA-based approaches. PMID:25621425

  19. Prioritized LT Codes

    NASA Technical Reports Server (NTRS)

    Woo, Simon S.; Cheng, Michael K.

    2011-01-01

    The original Luby Transform (LT) coding scheme is extended to account for data transmissions where some information symbols in a message block are more important than others. Prioritized LT codes provide unequal error protection (UEP) of data on an erasure channel by modifying the original LT encoder. The prioritized algorithm improves high-priority data protection without penalizing low-priority data recovery. Moreover, low-latency decoding is also obtained for high-priority data due to fast encoding. Prioritized LT codes only require a slight change in the original encoding algorithm, and no changes at all at the decoder. Hence, with a small complexity increase in the LT encoder, an improved UEP and low-decoding latency performance for high-priority data can be achieved. LT encoding partitions a data stream into fixed-sized message blocks each with a constant number of information symbols. To generate a code symbol from the information symbols in a message, the Robust-Soliton probability distribution is first applied in order to determine the number of information symbols to be used to compute the code symbol. Then, the specific information symbols are chosen uniform randomly from the message block. Finally, the selected information symbols are XORed to form the code symbol. The Prioritized LT code construction includes an additional restriction that code symbols formed by a relatively small number of XORed information symbols select some of these information symbols from the pool of high-priority data. Once high-priority data are fully covered, encoding continues with the conventional LT approach where code symbols are generated by selecting information symbols from the entire message block including all different priorities. Therefore, if code symbols derived from high-priority data experience an unusual high number of erasures, Prioritized LT codes can still reliably recover both high- and low-priority data. This hybrid approach decides not only "how to encode
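
    A condensed sketch of the encoder logic described above, with two simplifications flagged as assumptions: an ideal (rather than robust) soliton degree distribution, and an arbitrary threshold below which a code symbol counts as "small degree" and must draw from the high-priority prefix.

        import bisect
        import random

        def soliton_degree(k):
            # Ideal soliton distribution (simplified stand-in for robust soliton).
            probs = [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]
            cdf, acc = [], 0.0
            for p in probs:
                acc += p
                cdf.append(acc)
            return bisect.bisect_left(cdf, random.random() * acc) + 1

        def encode_symbol(block, n_high, small_degree=3):
            # block: equal-length byte strings; the first n_high are high priority.
            k = len(block)
            d = soliton_degree(k)
            if d <= small_degree:
                # Prioritized restriction (assumed form): low-degree code symbols
                # take at least one information symbol from the high-priority pool.
                picks = {random.randrange(n_high)}
                while len(picks) < d:
                    picks.add(random.randrange(k))
            else:
                picks = set(random.sample(range(k), d))
            out = bytes(len(block[0]))
            for i in picks:
                out = bytes(a ^ b for a, b in zip(out, block[i]))
            return sorted(picks), out

        message = [bytes([i]) * 4 for i in range(16)]
        print(encode_symbol(message, n_high=4))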

  20. Coded source neutron imaging

    SciTech Connect

    Bingham, Philip R; Santos-Villalobos, Hector J

    2011-01-01

    Coded aperture techniques have been applied to neutron radiography to address limitations in neutron flux and resolution of neutron detectors in a system labeled coded source imaging (CSI). By coding the neutron source, a magnified imaging system is designed with small spot size aperture holes (10 and 100 μm) for improved resolution beyond the detector limits and with many holes in the aperture (50% open) to account for flux losses due to the small pinhole size. An introduction to neutron radiography and coded aperture imaging is presented. A system design is developed for a CSI system with a development of equations for limitations on the system based on the coded image requirements and the neutron source characteristics of size and divergence. Simulation has been applied to the design using McStas: simulations of pinhole-array objects provide qualitative measures of performance, and simulation of a tilted edge provides a quantitative measure through calculation of the modulation transfer function (MTF) from the line spread function. MTF results for both 100 μm and 10 μm aperture hole diameters show resolutions matching the hole diameters.

  1. Error coding simulations

    NASA Technical Reports Server (NTRS)

    Noble, Viveca K.

    1993-01-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal to noise needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.
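
    As a concrete illustration of the error-detection component, the sketch below computes a 16-bit CRC with the CRC-16-CCITT parameters commonly associated with the CCSDS recommendation (generator 0x1021, initial value 0xFFFF); treat these parameters as assumptions rather than a normative CCSDS implementation.

        def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
            # Bitwise CRC, generator x^16 + x^12 + x^5 + 1 (0x1021), MSB first.
            for byte in data:
                crc ^= byte << 8
                for _ in range(8):
                    if crc & 0x8000:
                        crc = ((crc << 1) ^ 0x1021) & 0xFFFF
                    else:
                        crc = (crc << 1) & 0xFFFF
            return crc

        frame = b"CCSDS test frame"
        tail = crc16_ccitt(frame).to_bytes(2, "big")   # sender appends the CRC
        print(hex(crc16_ccitt(frame)))
        print(crc16_ccitt(frame + tail))               # 0 on an error-free receive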

  2. Error coding simulations

    NASA Astrophysics Data System (ADS)

    Noble, Viveca K.

    1993-11-01

    There are various elements such as radio frequency interference (RFI) which may induce errors in data being transmitted via a satellite communication link. When a transmission is affected by interference or other error-causing elements, the transmitted data becomes indecipherable. It becomes necessary to implement techniques to recover from these disturbances. The objective of this research is to develop software which simulates error control circuits and evaluate the performance of these modules in various bit error rate environments. The results of the evaluation provide the engineer with information which helps determine the optimal error control scheme. The Consultative Committee for Space Data Systems (CCSDS) recommends the use of Reed-Solomon (RS) and convolutional encoders and Viterbi and RS decoders for error correction. The use of forward error correction techniques greatly reduces the received signal to noise needed for a certain desired bit error rate. The use of concatenated coding, e.g. inner convolutional code and outer RS code, provides even greater coding gain. The 16-bit cyclic redundancy check (CRC) code is recommended by CCSDS for error detection.

  3. High-Speed Soft-Decision Decoding of Two Reed-Muller Codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Uehara, Gregory T.

    1996-01-01

    In this research, we have proposed the (64, 40, 8) subcode of the third-order Reed-Muller (RM) code to NASA for high-speed satellite communications. This RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth efficient coded modulation system to achieve reliable bandwidth efficient data transmission. This report will summarize the key progress we have made toward achieving our eventual goal of implementing a decoder system based upon this code. In the first phase of study, we investigated the complexities of various sectionalized trellis diagrams for the proposed (64, 40, 8) RM subcode. We found a specific 8-section trellis diagram for this code which requires the least decoding complexity with a high possibility of achieving a decoding speed of 600 M bits per second (Mbps). The combination of a large number of states and a high data rate will be made possible due to the utilization of a high degree of parallelism throughout the architecture. This trellis diagram will be presented and briefly described. In the second phase of study, which was carried out through the past year, we investigated circuit architectures to determine the feasibility of VLSI implementation of a high-speed Viterbi decoder based on this 8-section trellis diagram. We began to examine specific design and implementation approaches to implement a fully custom integrated circuit (IC) which will be a key building block for a decoder system implementation. The key results will be presented in this report. This report will be divided into three primary sections. First, we will briefly describe the system block diagram in which the proposed decoder is assumed to be operating, and present some of the key architectural approaches being used to

  4. High-Speed Soft-Decision Decoding of Two Reed-Muller Codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Uehara, Gregory T.

    1996-01-01

    In this research, we have proposed the (64, 40, 8) subcode of the third-order Reed-Muller (RM) code to NASA for high-speed satellite communications. This RM subcode can be used either alone or as an inner code of a concatenated coding system with the NASA standard (255, 223, 33) Reed-Solomon (RS) code as the outer code to achieve high performance (or low bit-error rate) with reduced decoding complexity. It can also be used as a component code in a multilevel bandwidth efficient coded modulation system to achieve reliable bandwidth efficient data transmission. This report will summarize the key progress we have made toward achieving our eventual goal of implementing a decoder system based upon this code. In the first phase of study, we investigated the complexities of various sectionalized trellis diagrams for the proposed (64, 40, 8) RM subcode. We found a specific 8-section trellis diagram for this code which requires the least decoding complexity with a high possibility of achieving a decoding speed of 600 M bits per second (Mbps). The combination of a large number of states and a high data rate will be made possible due to the utilization of a high degree of parallelism throughout the architecture. This trellis diagram will be presented and briefly described. In the second phase of study, which was carried out through the past year, we investigated circuit architectures to determine the feasibility of VLSI implementation of a high-speed Viterbi decoder based on this 8-section trellis diagram. We began to examine specific design and implementation approaches to implement a fully custom integrated circuit (IC) which will be a key building block for a decoder system implementation. The key results will be presented in this report. This report will be divided into three primary sections. First, we will briefly describe the system block diagram in which the proposed decoder is assumed to be operating, and present some of the key architectural approaches being used to

  5. Temporal Coding of Volumetric Imagery

    NASA Astrophysics Data System (ADS)

    Llull, Patrick Ryan

    'Image volumes' refer to realizations of images in other dimensions such as time, spectrum, and focus. Recent advances in scientific, medical, and consumer applications demand improvements in image volume capture. Though image volume acquisition continues to advance, it maintains the same sampling mechanisms that have been used for decades; every voxel must be scanned and is presumed independent of its neighbors. Under these conditions, improving performance comes at the cost of increased system complexity, data rates, and power consumption. This dissertation explores systems and methods capable of efficiently improving sensitivity and performance for image volume cameras, and specifically proposes several sampling strategies that utilize temporal coding to improve imaging system performance and enhance our awareness for a variety of dynamic applications. Video cameras and camcorders sample the video volume (x,y,t) at fixed intervals to gain understanding of the volume's temporal evolution. Conventionally, one must reduce the spatial resolution to increase the framerate of such cameras. Using temporal coding via physical translation of an optical element known as a coded aperture, the compressive temporal imaging (CACTI) camera demonstrates a method with which to embed the temporal dimension of the video volume into spatial (x,y) measurements, thereby greatly improving temporal resolution with minimal loss of spatial resolution. This technique, which is among a family of compressive sampling strategies developed at Duke University, temporally codes the exposure readout functions at the pixel level. Since video cameras nominally integrate the remaining image volume dimensions (e.g. spectrum and focus) at capture time, spectral (x,y,t,lambda) and focal (x,y,t,z) image volumes are traditionally captured via sequential changes to the spectral and focal state of the system, respectively. The CACTI camera's ability to embed video volumes into images leads to exploration
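
    A toy sketch of the coded-exposure forward model implied above, with all shapes and the vertically shifted mask purely illustrative: one physical binary aperture, translated over time, modulates each frame before the detector integrates the exposure into a single snapshot.

        import numpy as np

        rng = np.random.default_rng(1)
        T, H, W = 8, 32, 32
        video = rng.random((T, H, W))          # (x, y, t) volume to be compressed
        mask = rng.random((H, W)) < 0.5        # one physical coded aperture

        # Each frame sees the same mask shifted vertically (mechanical translation),
        # and the detector integrates over the exposure: y = sum_t C_t * x_t.
        measurement = np.zeros((H, W))
        for t in range(T):
            shifted = np.roll(mask, t, axis=0)
            measurement += shifted * video[t]
        print(measurement.shape)               # a single 2-D snapshot encoding T frames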

  6. Identifying personal microbiomes using metagenomic codes

    PubMed Central

    Franzosa, Eric A.; Huang, Katherine; Meadow, James F.; Gevers, Dirk; Lemon, Katherine P.; Bohannan, Brendan J. M.; Huttenhower, Curtis

    2015-01-01

    Community composition within the human microbiome varies across individuals, but it remains unknown if this variation is sufficient to uniquely identify individuals within large populations or stable enough to identify them over time. We investigated this by developing a hitting set-based coding algorithm and applying it to the Human Microbiome Project population. Our approach defined body site-specific metagenomic codes: sets of microbial taxa or genes prioritized to uniquely and stably identify individuals. Codes capturing strain variation in clade-specific marker genes were able to distinguish among 100s of individuals at an initial sampling time point. In comparisons with follow-up samples collected 30–300 d later, ∼30% of individuals could still be uniquely pinpointed using metagenomic codes from a typical body site; coincidental (false positive) matches were rare. Codes based on the gut microbiome were exceptionally stable and pinpointed >80% of individuals. The failure of a code to match its owner at a later time point was largely explained by the loss of specific microbial strains (at current limits of detection) and was only weakly associated with the length of the sampling interval. In addition to highlighting patterns of temporal variation in the ecology of the human microbiome, this work demonstrates the feasibility of microbiome-based identifiability—a result with important ethical implications for microbiome study design. The datasets and code used in this work are available for download from huttenhower.sph.harvard.edu/idability. PMID:25964341
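
    The hitting-set flavor of the coding algorithm can be sketched as a greedy loop (a simplification of the published method, with hypothetical feature names): repeatedly pick the feature of the target that eliminates the most other subjects still consistent with the partial code.

        def build_code(target_features, others, max_len=5):
            # Greedy sketch: choose features of the target that eliminate the most
            # remaining confounders (other subjects still matching the partial code).
            code, confounders = [], list(others)
            while confounders and len(code) < max_len:
                best = max(target_features - set(code),
                           key=lambda f: sum(f not in o for o in confounders),
                           default=None)
                if best is None:
                    break
                code.append(best)
                # Keep only subjects that still match every chosen feature.
                confounders = [o for o in confounders if best in o]
            return code

        target = {"strainA", "strainB", "geneX"}
        population = [{"strainA", "geneX"}, {"strainB", "geneY"}, {"geneX", "geneZ"}]
        print(build_code(target, population))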

  7. On Coding Non-Contiguous Letter Combinations

    PubMed Central

    Dandurand, Frédéric; Grainger, Jonathan; Duñabeitia, Jon Andoni; Granier, Jean-Pierre

    2011-01-01

    Starting from the hypothesis that printed word identification initially involves the parallel mapping of visual features onto location-specific letter identities, we analyze the type of information that would be involved in optimally mapping this location-specific orthographic code onto a location-invariant lexical code. We assume that some intermediate level of coding exists between individual letters and whole words, and that this involves the representation of letter combinations. We then investigate the nature of this intermediate level of coding given the constraints of optimality. This intermediate level of coding is expected to compress data while retaining as much information as possible about word identity. Information conveyed by letters is a function of how much they constrain word identity and how visible they are. Optimization of this coding is a combination of minimizing resources (using the most compact representations) and maximizing information. We show that in a large proportion of cases, non-contiguous letter sequences contain more information than contiguous sequences, while at the same time requiring less precise coding. Moreover, we found that the best predictor of human performance in orthographic priming experiments was within-word ranking of conditional probabilities, rather than average conditional probabilities. We conclude that from an optimality perspective, readers learn to select certain contiguous and non-contiguous letter combinations as information that provides the best cue to word identity. PMID:21734901
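
    The sketch below illustrates the kind of intermediate code under discussion: ordered letter pairs, contiguous or separated by a bounded gap, scored by how strongly each pair constrains word identity. The max_gap parameter and the toy lexicon are assumptions for illustration.

        from itertools import combinations

        def open_bigrams(word, max_gap=2):
            # Ordered letter pairs with at most max_gap intervening letters.
            return {(word[i], word[j])
                    for i, j in combinations(range(len(word)), 2)
                    if j - i - 1 <= max_gap}

        lexicon = ["trail", "trial", "tribe", "trade"]
        for bg in sorted(open_bigrams("trial")):
            matches = [w for w in lexicon if bg in open_bigrams(w)]
            print(bg, "->", matches)

    Non-contiguous pairs such as ('t', 'a') can single out a word in this toy lexicon even when its contiguous neighbors do not, echoing the record's point about their informativeness.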

  8. Identifying personal microbiomes using metagenomic codes.

    PubMed

    Franzosa, Eric A; Huang, Katherine; Meadow, James F; Gevers, Dirk; Lemon, Katherine P; Bohannan, Brendan J M; Huttenhower, Curtis

    2015-06-01

    Community composition within the human microbiome varies across individuals, but it remains unknown if this variation is sufficient to uniquely identify individuals within large populations or stable enough to identify them over time. We investigated this by developing a hitting set-based coding algorithm and applying it to the Human Microbiome Project population. Our approach defined body site-specific metagenomic codes: sets of microbial taxa or genes prioritized to uniquely and stably identify individuals. Codes capturing strain variation in clade-specific marker genes were able to distinguish among 100s of individuals at an initial sampling time point. In comparisons with follow-up samples collected 30-300 d later, ∼30% of individuals could still be uniquely pinpointed using metagenomic codes from a typical body site; coincidental (false positive) matches were rare. Codes based on the gut microbiome were exceptionally stable and pinpointed >80% of individuals. The failure of a code to match its owner at a later time point was largely explained by the loss of specific microbial strains (at current limits of detection) and was only weakly associated with the length of the sampling interval. In addition to highlighting patterns of temporal variation in the ecology of the human microbiome, this work demonstrates the feasibility of microbiome-based identifiability-a result with important ethical implications for microbiome study design. The datasets and code used in this work are available for download from huttenhower.sph.harvard.edu/idability. PMID:25964341

  9. Phase-coded pulse aperiodic transmitter coding

    NASA Astrophysics Data System (ADS)

    Virtanen, I. I.; Vierinen, J.; Lehtinen, M. S.

    2009-07-01

    Both ionospheric and weather radar communities have already adopted the method of transmitting radar pulses in an aperiodic manner when measuring moderately overspread targets. Among the users of the ionospheric radars, this method is called Aperiodic Transmitter Coding (ATC), whereas the weather radar users have adopted the term Simultaneous Multiple Pulse-Repetition Frequency (SMPRF). When probing the ionosphere at the carrier frequencies of the EISCAT Incoherent Scatter Radar facilities, the range extent of the detectable target is typically of the order of one thousand kilometers - about seven milliseconds - whereas the characteristic correlation time of the scattered signal varies from a few milliseconds in the D-region to only tens of microseconds in the F-region. If one is interested in estimating the scattering autocorrelation function (ACF) at time lags shorter than the F-region correlation time, the D-region must be considered as a moderately overspread target, whereas the F-region is a severely overspread one. Given the technical restrictions of the radar hardware, a combination of ATC and phase-coded long pulses is advantageous for this kind of target. We evaluate such an experiment under infinitely low signal-to-noise ratio (SNR) conditions using lag profile inversion. In addition, a qualitative evaluation under high-SNR conditions is performed by analysing simulated data. The results show that an acceptable estimation accuracy and a very good lag resolution in the D-region can be achieved with a pulse length long enough for simultaneous E- and F-region measurements with a reasonable lag extent. The new experiment design is tested with the EISCAT Tromsø VHF (224 MHz) radar. An example of a full D/E/F-region ACF from the test run is shown at the end of the paper.

  10. Sequence independent amplification of DNA

    DOEpatents

    Bohlander, S.K.

    1998-03-24

    The present invention is a rapid sequence-independent amplification procedure (SIA). Even minute amounts of DNA from various sources can be amplified independent of any sequence requirements of the DNA or any a priori knowledge of any sequence characteristics of the DNA to be amplified. This method allows, for example, the sequence independent amplification of microdissected chromosomal material and the reliable construction of high quality fluorescent in situ hybridization (FISH) probes from YACs or from other sources. These probes can be used to localize YACs on metaphase chromosomes but also--with high efficiency--in interphase nuclei. 25 figs.

  11. Biographical factors of occupational independence.

    PubMed

    Müller, G F

    2001-10-01

    The present study examined biographical factors of occupational independence including any kind of nonemployed profession. Participants were 59 occupationally independent and 58 employed persons of different age (M = 36.3 yr.), sex, and profession. They were interviewed on variables like family influence, educational background, occupational role models, and critical events for choosing a particular type of occupational career. The obtained results show that occupationally independent people reported stronger family ties, experienced fewer restrictions of formal education, and remembered fewer negative role models than the employed people. Implications of these results are discussed. PMID:11783553

  12. Sequence independent amplification of DNA

    DOEpatents

    Bohlander, Stefan K.

    1998-01-01

    The present invention is a rapid sequence-independent amplification procedure (SIA). Even minute amounts of DNA from various sources can be amplified independent of any sequence requirements of the DNA or any a priori knowledge of any sequence characteristics of the DNA to be amplified. This method allows, for example, the sequence independent amplification of microdissected chromosomal material and the reliable construction of high quality fluorescent in situ hybridization (FISH) probes from YACs or from other sources. These probes can be used to localize YACs on metaphase chromosomes but also--with high efficiency--in interphase nuclei.

  13. FAA Smoke Transport Code

    Energy Science and Technology Software Center (ESTSC)

    2006-10-27

    FAA Smoke Transport Code, a physics-based Computational Fluid Dynamics tool, which couples heat, mass, and momentum transfer, has been developed to provide information on smoke transport in cargo compartments with various geometries and flight conditions. The software package contains a graphical user interface for specification of geometry and boundary conditions, analysis module for solving the governing equations, and a post-processing tool. The current code was produced by making substantial improvements and additions to a code obtained from a university. The original code was able to compute steady, uniform, isothermal turbulent pressurization. In addition, a preprocessor and postprocessor were added to arrive at the current software package.

  14. Code query by example

    NASA Astrophysics Data System (ADS)

    Vaucouleur, Sebastien

    2011-02-01

    We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but rather domain experts. Hence, they require a simple language to express custom rules.

  15. Code inspection instructional validation

    NASA Technical Reports Server (NTRS)

    Orr, Kay; Stancil, Shirley

    1992-01-01

    The Shuttle Data Systems Branch (SDSB) of the Flight Data Systems Division (FDSD) at Johnson Space Center contracted with Southwest Research Institute (SwRI) to validate the effectiveness of an interactive video course on the code inspection process. The purpose of this project was to determine if this course could be effective for teaching NASA analysts the process of code inspection. In addition, NASA was interested in the effectiveness of this unique type of instruction (Digital Video Interactive), for providing training on software processes. This study found the Carnegie Mellon course, 'A Cure for the Common Code', effective for teaching the process of code inspection. In addition, analysts prefer learning with this method of instruction, or this method in combination with other methods. As is, the course is definitely better than no course at all; however, findings indicate changes are needed. Following are conclusions of this study. (1) The course is instructionally effective. (2) The simulation has a positive effect on student's confidence in his ability to apply new knowledge. (3) Analysts like the course and prefer this method of training, or this method in combination with current methods of training in code inspection, over the way training is currently being conducted. (4) Analysts responded favorably to information presented through scenarios incorporating full motion video. (5) Some course content needs to be changed. (6) Some content needs to be added to the course. SwRI believes this study indicates interactive video instruction combined with simulation is effective for teaching software processes. Based on the conclusions of this study, SwRI has outlined seven options for NASA to consider. SwRI recommends the option which involves creation of new source code and data files, but uses much of the existing content and design from the current course. Although this option involves a significant software development effort, SwRI believes this option

  16. Aeroacoustic Prediction Codes

    NASA Technical Reports Server (NTRS)

    Gliebe, P; Mani, R.; Shin, H.; Mitchell, B.; Ashford, G.; Salamah, S.; Connell, S.; Huff, Dennis (Technical Monitor)

    2000-01-01

    This report describes work performed on Contract NAS3-27720, AoI 13, as part of the NASA Advanced Subsonic Transport (AST) Noise Reduction Technology effort. Computer codes were developed to provide quantitative prediction, design, and analysis capability for several aircraft engine noise sources. The objective was to provide improved, physics-based tools for exploration of noise-reduction concepts and understanding of experimental results. Methods and codes focused on fan broadband and 'buzz saw' noise and on low-emissions combustor noise and complement work done by other contractors under the NASA AST program to develop methods and codes for fan harmonic tone noise and jet noise. The methods and codes developed and reported herein employ a wide range of approaches, from the strictly empirical to the completely computational, with some being semiempirical/analytical and/or analytical/computational. Emphasis was on capturing the essential physics while still considering method or code utility as a practical design and analysis tool for everyday engineering use. Codes and prediction models were developed for: (1) an improved empirical correlation model for fan rotor exit flow mean and turbulence properties, for use in predicting broadband noise generated by rotor exit flow turbulence interaction with downstream stator vanes; (2) fan broadband noise models for rotor and stator/turbulence interaction sources including 3D effects, noncompact-source effects, directivity modeling, and extensions to the rotor supersonic tip-speed regime; (3) fan multiple-pure-tone in-duct sound pressure prediction methodology based on computational fluid dynamics (CFD) analysis; and (4) low-emissions combustor prediction methodology and computer code based on CFD and actuator disk theory. In addition, the relative importance of dipole and quadrupole source mechanisms was studied using direct CFD source computation for a simple cascade/gust interaction problem, and an empirical combustor

  17. Securing mobile code.

    SciTech Connect

    Link, Hamilton E.; Schroeppel, Richard Crabtree; Neumann, William Douglas; Campbell, Philip LaRoche; Beaver, Cheryl Lynn; Pierson, Lyndon George; Anderson, William Erik

    2004-10-01

    If software is designed so that the software can issue functions that will move that software from one computing platform to another, then the software is said to be 'mobile'. There are two general areas of security problems associated with mobile code. The 'secure host' problem involves protecting the host from malicious mobile code. The 'secure mobile code' problem, on the other hand, involves protecting the code from malicious hosts. This report focuses on the latter problem. We have found three distinct camps of opinions regarding how to secure mobile code. There are those who believe special distributed hardware is necessary, those who believe special distributed software is necessary, and those who believe neither is necessary. We examine all three camps, with a focus on the third. In the distributed software camp we examine some commonly proposed techniques including Java, D'Agents and Flask. For the specialized hardware camp, we propose a cryptographic technique for 'tamper-proofing' code over a large portion of the software/hardware life cycle by careful modification of current architectures. This method culminates by decrypting/authenticating each instruction within a physically protected CPU, thereby protecting against subversion by malicious code. Our main focus is on the camp that believes that neither specialized software nor hardware is necessary. We concentrate on methods of code obfuscation to render an entire program or a data segment on which a program depends incomprehensible. The hope is to prevent or at least slow down reverse engineering efforts and to prevent goal-oriented attacks on the software and execution. The field of obfuscation is still in a state of development with the central problem being the lack of a basis for evaluating the protection schemes. We give a brief introduction to some of the main ideas in the field, followed by an in depth analysis of a technique called 'white-boxing'. We put forth some new attacks and improvements

  18. Numerical MHD codes for modeling astrophysical flows

    NASA Astrophysics Data System (ADS)

    Koldoba, A. V.; Ustyugova, G. V.; Lii, P. S.; Comins, M. L.; Dyda, S.; Romanova, M. M.; Lovelace, R. V. E.

    2016-05-01

    We describe a Godunov-type magnetohydrodynamic (MHD) code based on the Miyoshi and Kusano (2005) solver which can be used to solve various astrophysical hydrodynamic and MHD problems. The energy equation is in the form of entropy conservation. The code has been implemented on several different coordinate systems: 2.5D axisymmetric cylindrical coordinates, 2D Cartesian coordinates, 2D plane polar coordinates, and fully 3D cylindrical coordinates. Viscosity and diffusivity are implemented in the code to control the accretion rate in the disk and the rate of penetration of the disk matter through the magnetic field lines. The code has been utilized for the numerical investigations of a number of different astrophysical problems, several examples of which are shown.

  19. Perceptually-Based Adaptive JPEG Coding

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Rosenholtz, Ruth; Null, Cynthia H. (Technical Monitor)

    1996-01-01

    An extension to the JPEG standard (ISO/IEC DIS 10918-3) allows spatially adaptive coding of still images. As with baseline JPEG coding, one quantization matrix applies to an entire image channel, but in addition the user may specify a multiplier for each 8 x 8 block, which multiplies the quantization matrix, yielding the new matrix for the block. MPEG 1 and 2 use much the same scheme, except there the multiplier changes only on macroblock boundaries. We propose a method for perceptual optimization of the set of multipliers. We compute the perceptual error for each block based upon DCT quantization error adjusted according to contrast sensitivity, light adaptation, and contrast masking, and pick the set of multipliers which yields maximally flat perceptual error over the blocks of the image. We investigate the bitrate savings due to this adaptive coding scheme and the relative importance of the different sorts of masking on adaptive coding.
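
    A stand-in sketch of the multiplier-selection idea: per block, choose the largest quantization multiplier whose masking-adjusted error stays at or below a common target, which flattens perceptual error across blocks. The linear error model and the multiplier grid are assumptions, not the paper's masking model.

        import numpy as np

        def pick_multipliers(block_sensitivity, target_error,
                             m_grid=np.linspace(0.25, 4.0, 64)):
            # block_sensitivity: per-block visibility of quantization error (higher =
            # more visible). Toy model: perceived error = sensitivity * multiplier.
            # Per block, take the largest multiplier whose perceived error <= target.
            mults = []
            for s in block_sensitivity:
                ok = m_grid[s * m_grid <= target_error]
                mults.append(ok.max() if ok.size else m_grid.min())
            return np.array(mults)

        sens = np.array([0.5, 1.0, 2.0, 4.0])   # e.g., flat blocks are more sensitive
        print(pick_multipliers(sens, target_error=2.0))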

  20. Modular optimization code package: MOZAIK

    NASA Astrophysics Data System (ADS)

    Bekar, Kursat B.

    This dissertation addresses the development of a modular optimization code package, MOZAIK, for geometric shape optimization problems in nuclear engineering applications. MOZAIK's first mission, determining the optimal shape of the D2O moderator tank for the current and new beam tube configurations for the Penn State Breazeale Reactor's (PSBR) beam port facility, is used to demonstrate its capabilities and test its performance. MOZAIK was designed as a modular optimization sequence including three primary independent modules: the initializer, the physics and the optimizer, each having a specific task. By using fixed interface blocks among the modules, the code attains its two most important characteristics: generic form and modularity. The benefit of this modular structure is that the contents of the modules can be switched depending on the requirements of accuracy, computational efficiency, or compatibility with the other modules. Oak Ridge National Laboratory's discrete ordinates transport code TORT was selected as the transport solver in the physics module of MOZAIK, and two different optimizers, Min-max and Genetic Algorithms (GA), were implemented in the optimizer module of the code package. A distributed memory parallelism was also applied to MOZAIK via MPI (Message Passing Interface) to execute the physics module concurrently on a number of processors for various states in the same search. Moreover, dynamic scheduling was enabled to enhance load balance among the processors while running MOZAIK's physics module thus improving the parallel speedup and efficiency. In this way, the total computation time consumed by the physics module is reduced by a factor close to M, where M is the number of processors. This capability also encourages the use of MOZAIK for shape optimization problems in nuclear applications because many traditional codes related to radiation transport do not have parallel execution capability. A set of computational models based on the

  1. Independent Schools: Landscape and Learnings.

    ERIC Educational Resources Information Center

    Oates, William A.

    1981-01-01

    Examines American independent schools (parochial, southern segregated, and private institutions) in terms of their funding, expenditures, changing enrollment patterns, teacher-student ratios, and societal functions. Journal available from Daedalus Subscription Department, 1172 Commonwealth Ave., Boston, MA 02132. (AM)

  2. Whatever Happened to Independent Inventors?

    ERIC Educational Resources Information Center

    Douglas, John H.

    1976-01-01

    Discusses the increasing problems facing the private innovator, which may seriously affect the progress of American technology. Statistics show a decrease in patents issued to independent inventors. Information concerning patents, suggested cautions, and useful publications is cited. (Author/EB)

  3. Technology for Independent Living: Sourcebook.

    ERIC Educational Resources Information Center

    Enders, Alexandra, Ed.

    This sourcebook provides information for the practical implementation of independent living technology in the everyday rehabilitation process. "Information Services and Resources" lists databases, clearinghouses, networks, research and development programs, toll-free telephone numbers, consumer protection caveats, selected publications, and…

  4. Benchmark study between FIDAP and a cellular automata code

    SciTech Connect

    Akau, R.L.; Stockman, H.W.

    1991-01-01

    A fluid flow benchmark exercise was conducted to compare results between a cellular automata code and FIDAP. Cellular automata codes are free from gridding constraints, and are generally used to model slow (Reynolds number ≈ 1) flows around complex solid obstacles. However, the accuracy of cellular automata codes at higher Reynolds numbers, where inertial terms are significant, is not well documented. In order to validate the cellular automata code, two fluid-flow problems were investigated. For both problems, flow was assumed to be laminar, two-dimensional, isothermal, incompressible, and periodic. Results showed that the cellular automata code simulated the overall behavior of the flow field. 7 refs., 12 figs.

  5. Constrained Coding for the Deep-Space Optical Channel

    NASA Astrophysics Data System (ADS)

    Moision, B.; Hamkins, J.

    2002-01-01

    We investigate methods of coding for a channel subject to a large dead-time constraint, i.e., a constraint on the minimum spacing between transmitted pulses, with the deep-space optical channel as the motivating example. Several constrained codes designed to satisfy the dead-time constraint are considered and compared on the basis of throughput, complexity, and decoded error rate. The performance of an iteratively decoded serial concatenation of a constrained code with an outer code is evaluated and shown to provide significant gains over a Reed-Solomon code concatenated with pulse-position modulation.
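
    One way to quantify the throughput cost of a dead-time constraint is the capacity of the corresponding run-length constraint, obtained from the spectral radius of the constraint graph's transfer matrix. The sketch below assumes the constraint is simply "each pulse is followed by at least d empty slots".

        import numpy as np

        def dead_time_capacity(d):
            # Capacity (bits/slot) of binary sequences where each 1 (pulse) is
            # followed by at least d 0s: log2 of the transfer-matrix spectral radius.
            n = d + 1                      # states: slots elapsed since the last pulse
            A = np.zeros((n, n))
            for s in range(n):
                if s < d:
                    A[s, s + 1] = 1        # must emit 0: dead time not yet over
                else:
                    A[s, s] = 1            # emit 0 and stay ready
                    A[s, 0] = 1            # emit 1 and restart the dead time
            return np.log2(max(abs(np.linalg.eigvals(A))))

        for d in (1, 2, 4):
            print(d, round(dead_time_capacity(d), 4))

    For d = 1 this recovers the familiar log2 of the golden ratio (about 0.694 bits/slot), and the capacity shrinks as the dead time grows, which is the throughput trade-off the record evaluates.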

  6. Refractoriness enhances temporal coding by auditory nerve fibers.

    PubMed

    Avissar, Michael; Wittig, John H; Saunders, James C; Parsons, Thomas D

    2013-05-01

    A universal property of spiking neurons is refractoriness, a transient decrease in discharge probability immediately following an action potential (spike). The refractory period lasts only one to a few milliseconds, but has the potential to affect temporal coding of acoustic stimuli by auditory neurons, which are capable of submillisecond spike-time precision. Here this possibility was investigated systematically by recording spike times from chicken auditory nerve fibers in vivo while stimulating with repeated pure tones at characteristic frequency. Refractory periods were tightly distributed, with a mean of 1.58 ms. A statistical model was developed to recapitulate each fiber's responses and then used to predict the effect of removing the refractory period on a cell-by-cell basis for two largely independent facets of temporal coding: faithful entrainment of interspike intervals to the stimulus frequency and precise synchronization of spike times to the stimulus phase. The ratio of the refractory period to the stimulus period predicted the impact of refractoriness on entrainment and synchronization. For ratios less than ∼0.9, refractoriness enhanced entrainment and this enhancement was often accompanied by an increase in spike-time precision. At higher ratios, little or no change in entrainment or synchronization was observed. Given the tight distribution of refractory periods, the ability of refractoriness to improve temporal coding is restricted to neurons responding to low-frequency stimuli. Enhanced encoding of low frequencies likely affects sound localization and pitch perception in the auditory system, as well as perception in nonauditory sensory modalities, because all spiking neurons exhibit refractoriness. PMID:23637161
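
    A toy simulation of the idea (rates, tone frequency, and the sinusoidal rate modulation are assumptions, not the recorded data): generate spikes with and without an absolute refractory period and compare vector strength, a standard spike-synchronization index.

        import numpy as np

        def simulate(rate_hz, stim_hz, refrac_s, dur_s=5.0, dt=1e-4, seed=0):
            rng = np.random.default_rng(seed)
            t = np.arange(0, dur_s, dt)
            # Sinusoidally modulated driving rate (toy stand-in for a pure tone at CF).
            p = rate_hz * (1 + np.sin(2 * np.pi * stim_hz * t)) * dt
            spikes, last = [], -np.inf
            for ti, pi in zip(t, p):
                if ti - last >= refrac_s and rng.random() < pi:
                    spikes.append(ti)
                    last = ti
            return np.array(spikes)

        def vector_strength(spikes, stim_hz):
            ph = 2 * np.pi * stim_hz * spikes
            return np.hypot(np.cos(ph).sum(), np.sin(ph).sum()) / len(spikes)

        for r in (0.0, 1.58e-3):   # without vs with the reported mean refractory period
            s = simulate(rate_hz=150, stim_hz=400, refrac_s=r)
            print(f"refrac={r*1e3:.2f} ms  n={len(s)}  VS={vector_strength(s, 400):.3f}")

    Note the ratio of refractory period to stimulus period here (about 0.63) falls in the range the record identifies as favorable; a quantitative match to the chicken-nerve data is not claimed.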

  7. Enforcing the International Code of Marketing of Breast-milk Substitutes for Better Promotion of Exclusive Breastfeeding: Can Lessons Be Learned?

    PubMed

    Barennes, Hubert; Slesak, Guenther; Goyet, Sophie; Aaron, Percy; Srour, Leila M

    2016-02-01

    Exclusive breastfeeding, one of the best natural resources, needs protection and promotion. The International Code of Marketing of Breast-milk Substitutes (the Code), which aims to prevent the undermining of breastfeeding by formula advertising, faces implementation challenges. We reviewed frequently overlooked challenges and obstacles that the Code is facing worldwide, but particularly in Southeast Asia. Drawing lessons from various countries where we work, and following the example of successful public health interventions, we discussed legislation, enforcement, and experiences that are needed to successfully implement the Code. Successful holistic approaches that have strengthened the Code need to be scaled up. Community-based actions and peer-to-peer promotions have proved successful. Legislation without stringent enforcement and sufficient penalties is ineffective. The public needs education about the benefits and ways and means to support breastfeeding. It is crucial to combine strong political commitment and leadership with strict national regulations, definitions, and enforcement. National breastfeeding committees, with the authority to improve regulations, investigate violations, and enforce the laws, must be established. Systematic monitoring and reporting are needed to identify companies, individuals, intermediaries, and practices that infringe on the Code. Penalizing violators is crucial. Managers of multinational companies must be held accountable for international violations, and international legislative enforcement needs to be established. Further measures should include improved regulations to protect the breastfeeding mother: large-scale education campaigns; strong penalties for Code violators; exclusion of the formula industry from nutrition, education, and policy roles; supportive legal networks; and independent research of interventions supporting breastfeeding. PMID:26416439

  8. Effective Practice in the Design of Directed Independent Learning Opportunities

    ERIC Educational Resources Information Center

    Thomas, Liz; Jones, Robert; Ottaway, James

    2015-01-01

    This study, commissioned by the HEA and the QAA focuses on directed independent learning practices in UK higher education. It investigates what stakeholders (including academic staff and students) have found to be the most effective practices in the inception, design, quality assurance and enhancement of directed independent learning and explores…

  9. Accumulate Repeat Accumulate Coded Modulation

    NASA Technical Reports Server (NTRS)

    Abbasfar, Aliazam; Divsalar, Dariush; Yao, Kung

    2004-01-01

    In this paper we propose an innovative coded modulation scheme called 'Accumulate Repeat Accumulate Coded Modulation' (ARA coded modulation). This class of codes can be viewed as serial turbo-like codes, or as a subclass of Low Density Parity Check (LDPC) codes that are combined with high level modulation. Thus at the decoder, belief propagation can be used for iterative decoding of ARA coded modulation on a graph, provided a demapper transforms the received in-phase and quadrature samples into bit reliabilities.

  10. Multiple trellis coded modulation

    NASA Technical Reports Server (NTRS)

    Simon, Marvin K. (Inventor); Divsalar, Dariush (Inventor)

    1990-01-01

    A technique for designing trellis codes to minimize bit error performance for a fading channel. The invention provides a criterion which may be used in the design of such codes which is significantly different from that used for additive white Gaussian noise channels. The method of multiple trellis coded modulation of the present invention comprises the steps of: (a) coding b bits of input data into s intermediate outputs; (b) grouping said s intermediate outputs into k groups of s_i intermediate outputs each, where the s_i sum to s and k is equal to at least 2; (c) mapping each of said k groups of intermediate outputs into one of a plurality of symbols in accordance with a plurality of modulation schemes, one for each group, such that the first group is mapped in accordance with a first modulation scheme and the second group is mapped in accordance with a second modulation scheme; and (d) outputting each of said symbols to provide k output symbols for each b bits of input data.
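
    Steps (b) through (d) amount to a split-and-map operation, sketched below with hypothetical constellations (QPSK and 8-PSK) standing in for the first and second modulation schemes:

        import numpy as np

        def map_groups(bits, group_sizes, constellations):
            # bits: the encoder's s intermediate outputs; group i of size s_i
            # indexes constellation i, yielding k output symbols per input word.
            assert len(bits) == sum(group_sizes)
            symbols, pos = [], 0
            for size, const in zip(group_sizes, constellations):
                idx = int("".join(map(str, bits[pos:pos + size])), 2)
                symbols.append(const[idx])
                pos += size
            return symbols

        qpsk = np.exp(1j * np.pi / 2 * np.arange(4))   # hypothetical scheme 1
        psk8 = np.exp(1j * np.pi / 4 * np.arange(8))   # hypothetical scheme 2
        print(map_groups([1, 0, 1, 1, 0], (2, 3), (qpsk, psk8)))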

  11. Code of Ethics.

    ERIC Educational Resources Information Center

    American Sociological Association, Washington, DC.

    The American Sociological Association's code of ethics for sociologists is presented. For sociological research and practice, 10 requirements for ethical behavior are identified, including: maintaining objectivity and integrity; fully reporting findings and research methods, without omission of significant data; reporting fully all sources of…

  12. Sharing the Code.

    ERIC Educational Resources Information Center

    Olsen, Florence

    2003-01-01

    Colleges and universities are beginning to consider collaborating on open-source-code projects as a way to meet critical software and computing needs. Points out the attractive features of noncommercial open-source software and describes some examples in use now, especially for the creation of Web infrastructure. (SLD)

  13. Electrical Circuit Simulation Code

    Energy Science and Technology Software Center (ESTSC)

    2001-08-09

    Massively-Parallel Electrical Circuit Simulation Code. CHILESPICE is a massively-parallel, distributed-memory electrical circuit simulation tool that contains many enhanced radiation, time-based, and thermal features and models. It supports large-scale electronic circuit simulation with shared-memory parallel processing and enhanced convergence, and includes Sandia-specific device models.

  14. The Redox Code

    PubMed Central

    Jones, Dean P.

    2015-01-01

    Abstract Significance: The redox code is a set of principles that defines the positioning of the nicotinamide adenine dinucleotide (NAD, NADP) and thiol/disulfide and other redox systems as well as the thiol redox proteome in space and time in biological systems. The code is richly elaborated in an oxygen-dependent life, where activation/deactivation cycles involving O2 and H2O2 contribute to spatiotemporal organization for differentiation, development, and adaptation to the environment. Disruption of this organizational structure during oxidative stress represents a fundamental mechanism in system failure and disease. Recent Advances: Methodology in assessing components of the redox code under physiological conditions has progressed, permitting insight into spatiotemporal organization and allowing for identification of redox partners in redox proteomics and redox metabolomics. Critical Issues: Complexity of redox networks and redox regulation is being revealed step by step, yet much still needs to be learned. Future Directions: Detailed knowledge of the molecular patterns generated from the principles of the redox code under defined physiological or pathological conditions in cells and organs will contribute to understanding the redox component in health and disease. Ultimately, there will be a scientific basis to a modern redox medicine. Antioxid. Redox Signal. 23, 734–746. PMID:25891126

  15. Environmental Fluid Dynamics Code

    EPA Science Inventory

    The Environmental Fluid Dynamics Code (EFDC)is a state-of-the-art hydrodynamic model that can be used to simulate aquatic systems in one, two, and three dimensions. It has evolved over the past two decades to become one of the most widely used and technically defensible hydrodyn...

  16. Code of Ethics.

    ERIC Educational Resources Information Center

    Association of College Unions-International, Bloomington, IN.

    The code of ethics for the college union and student activities professional is presented by the Association of College Unions-International. The preamble identifies the objectives of the college union as providing campus community centers and social programs that enhance the quality of life for members of the academic community. Ethics for…

  17. Dual Coding in Children.

    ERIC Educational Resources Information Center

    Burton, John K.; Wildman, Terry M.

    The purpose of this study was to test the applicability of the dual coding hypothesis to children's recall performance. The hypothesis predicts that visual interference will have a small effect on the recall of visually presented words or pictures, but that acoustic interference will cause a decline in recall of visually presented words and…

  18. The revised genetic code

    NASA Astrophysics Data System (ADS)

    Ninio, Jacques

    1990-03-01

    Recent findings on the genetic code are reviewed, including selenocysteine usage, deviations in the assignments of sense and nonsense codons, RNA editing, natural ribosomal frameshifts and non-orthodox codon-anticodon pairings. A multi-stage codon reading process is presented.

  19. Dress Codes and Uniforms.

    ERIC Educational Resources Information Center

    Lumsden, Linda; Miller, Gabriel

    2002-01-01

    Students do not always make choices that adults agree with in their choice of school dress. Dress-code issues are explored in this Research Roundup, and guidance is offered to principals seeking to maintain a positive school climate. In "Do School Uniforms Fit?" Kerry White discusses arguments for and against school uniforms and summarizes the…

  20. Code Optimization Techniques

    SciTech Connect

    MAGEE,GLEN I.

    2000-08-03

    Computers transfer data in a number of different ways. Whether through a serial port, a parallel port, over a modem, over an ethernet cable, or internally from a hard disk to memory, some data will be lost. To compensate for that loss, numerous error detection and correction algorithms have been developed. One of the most common error correction codes is the Reed-Solomon code, which is a special subset of BCH (Bose-Chaudhuri-Hocquenghem) linear cyclic block codes. In the AURA project, an unmanned aircraft sends the data it collects back to earth so it can be analyzed during flight and possible flight modifications made. To counter possible data corruption during transmission, the data is encoded using a multi-block Reed-Solomon implementation with a possibly shortened final block. In order to maximize the amount of data transmitted, it was necessary to reduce the computation time of a Reed-Solomon encoding to three percent of the processor's time. To achieve such a reduction, many code optimization techniques were employed. This paper outlines the steps taken to reduce the processing time of a Reed-Solomon encoding and the insight into modern optimization techniques gained from the experience.
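
    The abstract does not list the specific optimizations used in AURA. One representative technique for speeding up Reed-Solomon encoding is replacing Galois-field multiplication with log/antilog table lookups; a minimal sketch follows (Python; the field polynomial 0x11d is an assumption for illustration, and this is not the AURA code).

      # Sketch: table-driven GF(256) multiply, a standard Reed-Solomon
      # encoder optimization (field polynomial 0x11d assumed here).
      EXP, LOG = [0] * 512, [0] * 256
      x = 1
      for i in range(255):
          EXP[i] = x
          LOG[x] = i
          x <<= 1
          if x & 0x100:
              x ^= 0x11d                 # reduce modulo the field polynomial
      for i in range(255, 512):          # duplicate table: no "mod 255" needed
          EXP[i] = EXP[i - 255]

      def gf_mul(a, b):
          """Multiply in GF(256) with two lookups and one add, no bit loop."""
          if a == 0 or b == 0:
              return 0
          return EXP[LOG[a] + LOG[b]]

    A full encoder also performs polynomial division by the generator polynomial; the table-driven multiply is simply the inner operation that dominates the running time.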

  1. FEFF5: An ab initio multiple scattering XAFS code

    SciTech Connect

    Rehr, J.J.; Zabinsky, S.I.

    1992-12-31

    FEFF5 is an efficient automated code which calculates multiple scattering (MS) curved wave XAFS spectra for molecules and solids. The theoretical ingredients and approximations contained in the code are reviewed, with the aim of describing how XAFS spectra are efficiently simulated. The FEFF5 code consists of four independent modules: a scattering potential and phase shift module, a path finder module, a scattering amplitude module, and an XAFS module. Multiple scattering Debye-Waller factors are built in using a correlated Debye model.

  2. Automated searching for quantum subsystem codes

    SciTech Connect

    Crosswhite, Gregory M.; Bacon, Dave

    2011-02-15

    Quantum error correction allows for faulty quantum systems to behave in an effectively error-free manner. One important class of techniques for quantum error correction is the class of quantum subsystem codes, which are relevant both to active quantum error-correcting schemes as well as to the design of self-correcting quantum memories. Previous approaches for investigating these codes have focused on applying theoretical analysis to look for interesting codes and to investigate their properties. In this paper we present an alternative approach that uses computational analysis to accomplish the same goals. Specifically, we present an algorithm that computes the optimal quantum subsystem code that can be implemented given an arbitrary set of measurement operators that are tensor products of Pauli operators. We then demonstrate the utility of this algorithm by performing a systematic investigation of the quantum subsystem codes that exist in the setting where the interactions are limited to two-body interactions between neighbors on lattices derived from the convex uniform tilings of the plane.

  3. Error resiliency of distributed video coding in wireless video communication

    NASA Astrophysics Data System (ADS)

    Ye, Shuiming; Ouaret, Mourad; Dufaux, Frederic; Ansorge, Michael; Ebrahimi, Touradj

    2008-08-01

    Distributed Video Coding (DVC) is a new paradigm in video coding, based on the Slepian-Wolf and Wyner-Ziv theorems. DVC offers a number of potential advantages: flexible partitioning of the complexity between the encoder and decoder, robustness to channel errors due to intrinsic joint source-channel coding, codec independent scalability, and multi-view coding without communications between the cameras. In this paper, we evaluate the performance of DVC in an error-prone wireless communication environment. We also present a hybrid spatial and temporal error concealment approach for DVC. Finally, we perform a comparison with a state-of-the-art AVC/H.264 video coding scheme in the presence of transmission errors.

  4. Two-dimensional MHD generator model. [GEN code

    SciTech Connect

    Geyer, H. K.; Ahluwalia, R. K.; Doss, E. D.

    1980-09-01

    A steady state, two-dimensional MHD generator code, GEN, is presented. The code solves the equations of conservation of mass, momentum, and energy, using a Von Mises transformation and a local linearization of the equations. By splitting the source terms into a part proportional to the axial pressure gradient and a part independent of the gradient, the pressure distribution along the channel is easily obtained to satisfy various criteria. Thus, the code can run effectively in both design modes, where the channel geometry is determined, and analysis modes, where the geometry is previously known. The code also employs a mixing length concept for turbulent flows, Cebeci and Chang's wall roughness model, and an extension of that model to the effective thermal diffusivities. Results on code validation, as well as comparisons of skin friction and Stanton number calculations with experimental results, are presented.

  5. High rate concatenated coding systems using bandwidth efficient trellis inner codes

    NASA Astrophysics Data System (ADS)

    Deng, Robert H.; Costello, Daniel J., Jr.

    1989-05-01

    High-rate concatenated coding systems with bandwidth-efficient trellis inner codes and Reed-Solomon (RS) outer codes are investigated for application in high-speed satellite communication systems. Two concatenated coding schemes are proposed. In one, the inner code is decoded with soft-decision Viterbi decoding, and the outer RS code performs error-correction-only decoding (decoding without side information). In the other, the inner code is decoded with a modified Viterbi algorithm, which produces reliability information along with the decoded output. In this algorithm, path metrics are used to estimate the entire information sequence, whereas branch metrics are used to provide reliability information on the decoded sequence. This information is used to erase unreliable bits in the decoded output. An errors-and-erasures RS decoder is then used for the outer code. The two schemes have been proposed for high-speed data communication on NASA satellite channels. The rates considered are at least double those used in current NASA systems, and the results indicate that high system reliability can still be achieved.
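
    A minimal sketch of the erasure-marking idea (Python; the threshold, names, and reliability scale are illustrative, not from the paper): bits whose inner-decoder reliability falls below a threshold are flagged as erasures for the outer errors-and-erasures RS decoder, which can correct e errors and f erasures whenever 2e + f < d_min.

      def mark_erasures(decoded_bits, reliabilities, threshold=0.5):
          """Flag unreliable inner-decoder outputs as erasures (None) so the
          outer RS decoder treats those positions as erasures, not errors."""
          return [bit if rel >= threshold else None
                  for bit, rel in zip(decoded_bits, reliabilities)]

      print(mark_erasures([1, 0, 1, 1], [0.9, 0.2, 0.8, 0.4]))
      # -> [1, None, 1, None]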

  6. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  7. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  8. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  9. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  10. 28 CFR 601.1 - Jurisdiction of the Independent Counsel: Iran/Contra.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...: Iran/Contra. 601.1 Section 601.1 Judicial Administration OFFICES OF INDEPENDENT COUNSEL, DEPARTMENT OF JUSTICE JURISDICTION OF THE INDEPENDENT COUNSEL: IRAN/CONTRA § 601.1 Jurisdiction of the Independent Counsel: Iran/Contra. (a) The Independent Counsel. Iran/Contra has jurisdiction to investigate to...

  11. High-Fidelity Coding with Correlated Neurons

    PubMed Central

    da Silveira, Rava Azeredo; Berry, Michael J.

    2014-01-01

    Positive correlations in the activity of neurons are widely observed in the brain. Previous studies have shown these correlations to be detrimental to the fidelity of population codes, or at best marginally favorable compared to independent codes. Here, we show that positive correlations can enhance coding performance by astronomical factors. Specifically, the probability of discrimination error can be suppressed by many orders of magnitude. Likewise, the number of stimuli encoded—the capacity—can be enhanced more than tenfold. These effects do not necessitate unrealistic correlation values, and can occur for populations with a few tens of neurons. We further show that both effects benefit from heterogeneity commonly seen in population activity. Error suppression and capacity enhancement rest upon a pattern of correlation. Tuning of one or several effective parameters can yield a limit of perfect coding: the corresponding pattern of positive correlation leads to a ‘lock-in’ of response probabilities that eliminates variability in the subspace relevant for stimulus discrimination. We discuss the nature of this pattern and we suggest experimental tests to identify it. PMID:25412463

  12. TACO: a finite element heat transfer code

    SciTech Connect

    Mason, W.E. Jr.

    1980-02-01

    TACO is a two-dimensional implicit finite element code for heat transfer analysis. It can perform both linear and nonlinear analyses and can be used to solve either transient or steady state problems. Either plane or axisymmetric geometries can be analyzed. TACO has the capability to handle time or temperature dependent material properties, and materials may be either isotropic or orthotropic. A variety of time and temperature dependent loadings and boundary conditions are available, including temperature, flux, convection, and radiation boundary conditions and internal heat generation. Additionally, TACO has some specialized features such as internal surface conditions (e.g., contact resistance), bulk nodes, enclosure radiation with view factor calculations, and chemical reaction kinetics. A user subprogram feature allows for any type of functional representation of any independent variable. A bandwidth and profile minimization option is also available in the code. Graphical representation of data generated by TACO is provided by a companion post-processor named POSTACO. The theory on which TACO is based is outlined, the capabilities of the code are explained, and the input data required to perform an analysis with TACO are described. Some simple examples are provided to illustrate the use of the code.

  13. New opportunities seen for independents

    SciTech Connect

    Adams, G.A. )

    1990-10-22

    The collapse of gas and oil prices in the mid-1980s significantly reduced the number of independent exploration companies. At the same time, a fundamental shift occurred among major oil companies as they allocated their exploration budgets toward international operations and made major production purchases. Several large independents also embraced a philosophy of budget supplementation through joint venture partnership arrangements. This has created a unique and unusual window of opportunity for the smaller independents (defined for this article as exploration and production companies with a market value of less than $1 billion) to access the extensive and high quality domestic prospect inventories of the major and large independent oil and gas companies and to participate in the search for large reserve targets on attractive joint venture terms. Participation in these types of joint ventures, in conjunction with internally generated plays selected through the use of today's advanced technology (computer-enhanced, high-resolution seismic; horizontal drilling; etc.) and increasing prices for oil and natural gas, presents the domestic exploration-oriented independent with an attractive money-making opportunity for the 1990s.

  14. An Eye-Tracking Study of How Color Coding Affects Multimedia Learning

    ERIC Educational Resources Information Center

    Ozcelik, Erol; Karakus, Turkan; Kursun, Engin; Cagiltay, Kursat

    2009-01-01

    Color coding has been proposed to promote more effective learning. However, insufficient evidence currently exists to show how color coding leads to better learning. The goal of this study was to investigate the underlying cause of the color coding effect by utilizing eye movement data. Fifty-two participants studied either a color-coded or…

  15. Allocentric coding: spatial range and combination rules.

    PubMed

    Camors, D; Jouffrais, C; Cottereau, B R; Durand, J B

    2015-04-01

    When a visual target is presented with neighboring landmarks, its location can be determined both relative to the self (egocentric coding) and relative to these landmarks (allocentric coding). In the present study, we investigated (1) how allocentric coding depends on the distance between the targets and their surrounding landmarks (i.e. the spatial range) and (2) how allocentric and egocentric coding interact with each other across target-landmark distances (i.e. the combination rules). Subjects performed a memory-based pointing task toward previously gazed targets briefly superimposed (200 ms) on background images of cluttered city landscapes. A variable portion of the images was occluded in order to control the distance between the targets and the closest potential landmarks within those images. The pointing responses were performed after large saccades and the reappearance of the images at their initial location. However, in some trials, the images' elements were slightly shifted (±3°) in order to introduce a subliminal conflict between the allocentric and egocentric reference frames. The influence of allocentric coding on the pointing responses was found to decrease with increasing target-landmark distances, although it remained significant even at the largest distances (⩾10°). Interestingly, both the decreasing influence of allocentric coding and the concomitant increase in pointing response variability were well captured by a Bayesian model in which the weighted combination of allocentric and egocentric cues is governed by a coupling prior. PMID:25749676
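
    The coupling-prior model itself cannot be reconstructed from the abstract. As a simpler stand-in, the sketch below (Python; all numbers hypothetical) shows plain precision-weighted cue combination, which captures the qualitative effect that a noisier allocentric cue receives less weight.

      def combine(ego, sigma_ego, allo, sigma_allo):
          """Precision-weighted average of egocentric and allocentric cues."""
          w_ego = sigma_allo**2 / (sigma_ego**2 + sigma_allo**2)
          return w_ego * ego + (1.0 - w_ego) * allo

      # As target-landmark distance grows, sigma_allo grows, so the
      # allocentric weight (1 - w_ego) shrinks, as in the pointing data.
      print(combine(ego=2.0, sigma_ego=1.0, allo=3.0, sigma_allo=2.0))  # 2.2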

  16. Channel coding for satellite mobile channels

    NASA Astrophysics Data System (ADS)

    Wong, K. H. H.; Hanzo, L.; Steele, R.

    1989-09-01

    The deployment of channel coding and interleaving to enhance the bit-error performance of a satellite mobile radio channel is addressed for speech and data transmissions. Different convolutional codes (CC) using Viterbi decoding with soft decision are examined with interblock interleaving. Reed-Solomon (RS) codes with Berlekamp-Massey hard-decision decoding or soft-decision trellis decoding combined with block interleaving are also investigated. A concatenated arrangement employing RS and CC coding as the outer and inner codes, respectively, is used for transmissions via minimum shift keying over Gaussian and Rayleigh fading channels. For an interblock interleaving period of 2880 bits, a concatenated arrangement of an RS(48,36) over the Galois field GF(256) and a punctured PCC(3,1,7), yielding an overall coding rate of 1/2, provides a coding gain of 42 dB for a BER of 10^-6, and an uncorrectable error detection probability of 1 - 10^-9.
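
    For readers unfamiliar with interleaving, the sketch below (Python; a generic row/column block interleaver, not the exact arrangement of the paper) shows the basic mechanism that spreads fading burst errors across many code words.

      # Sketch: a simple block interleaver (write by rows, read by columns).
      def block_interleave(bits, rows, cols):
          assert len(bits) == rows * cols
          return [bits[r * cols + c] for c in range(cols) for r in range(rows)]

      def block_deinterleave(bits, rows, cols):
          return block_interleave(bits, cols, rows)   # inverse: swap dimensions

      data = list(range(12))
      assert block_deinterleave(block_interleave(data, 3, 4), 3, 4) == data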

  17. Multiphysics Code Demonstrated for Propulsion Applications

    NASA Technical Reports Server (NTRS)

    Lawrence, Charles; Melis, Matthew E.

    1998-01-01

    The utility of multidisciplinary analysis tools for aeropropulsion applications is being investigated at the NASA Lewis Research Center. The goal of this project is to apply Spectrum, a multiphysics code developed by Centric Engineering Systems, Inc., to simulate multidisciplinary effects in turbomachinery components. Many engineering problems today involve detailed computer analyses to predict the thermal, aerodynamic, and structural response of a mechanical system as it undergoes service loading. Analysis of aerospace structures generally requires attention in all three disciplinary areas to adequately predict component service behavior, and in many cases, the results from one discipline substantially affect the outcome of the other two. There are numerous computer codes currently available in the engineering community to perform such analyses in each of these disciplines. Many of these codes are developed and used in-house by a given organization, and many are commercially available. However, few, if any, of these codes are designed specifically for multidisciplinary analyses. The Spectrum code has been developed for performing fully coupled fluid, thermal, and structural analyses on a mechanical system with a single simulation that accounts for all simultaneous interactions, thus eliminating the requirement for running a large number of sequential, separate, disciplinary analyses. The Spectrum code has a true multiphysics analysis capability, which improves analysis efficiency as well as accuracy. Centric Engineering, Inc., working with a team of Lewis and AlliedSignal Engines engineers, has been evaluating Spectrum for a variety of propulsion applications including disk quenching, drum cavity flow, aeromechanical simulations, and a centrifugal compressor flow simulation.

  18. Cavity approach to the Sourlas code system.

    PubMed

    Huang, Haiping; Zhou, Haijun

    2009-11-01

    The statistical physics properties of regular and irregular Sourlas codes are investigated in this paper by the cavity method. At finite temperatures, the free-energy density of these coding systems is derived and compared with the result obtained by the replica method. In the zero-temperature limit, Shannon's bound is recovered in the case of infinite-body interactions while the code rate is still finite. However, the decoding performance obtained by the replica theory does not consider the zero-temperature entropic effect. The cavity approach is able to consider the ground-state entropy. It leads to a set of evanescent cavity field propagation equations which further improve the decoding performance, as confirmed by our numerical simulations on single instances. For the irregular Sourlas code, we find that it strikes a trade-off between good dynamical properties and high decoding performance. In agreement with the results found from the algorithmic point of view, the decoding exhibits a first-order phase transition as occurs in the regular code system with three-body interactions. The cavity approach for the Sourlas code system can be extended to consider first-step replica symmetry breaking. PMID:20365049

  19. Cavity approach to the Sourlas code system

    NASA Astrophysics Data System (ADS)

    Huang, Haiping; Zhou, Haijun

    2009-11-01

    The statistical physics properties of regular and irregular Sourlas codes are investigated in this paper by the cavity method. At finite temperatures, the free-energy density of these coding systems is derived and compared with the result obtained by the replica method. In the zero-temperature limit, Shannon’s bound is recovered in the case of infinite-body interactions while the code rate is still finite. However, the decoding performance obtained by the replica theory does not consider the zero-temperature entropic effect. The cavity approach is able to consider the ground-state entropy. It leads to a set of evanescent cavity field propagation equations which further improve the decoding performance, as confirmed by our numerical simulations on single instances. For the irregular Sourlas code, we find that it strikes a trade-off between good dynamical properties and high decoding performance. In agreement with the results found from the algorithmic point of view, the decoding exhibits a first-order phase transition as occurs in the regular code system with three-body interactions. The cavity approach for the Sourlas code system can be extended to consider first-step replica symmetry breaking.

  20. Binary coding for hyperspectral imagery

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Chang, Chein-I.; Chang, Chein-Chi; Lin, Chinsu

    2004-10-01

    Binary coding is one of the simplest ways to characterize spectral features. One commonly used method is a binary coding-based image software system, called Spectral Analysis Manager (SPAM), for remotely sensed imagery, developed by Mazer et al. For a given spectral signature, SPAM calculates its spectral mean and inter-band spectral difference and uses them as thresholds to generate a binary code word for this particular spectral signature. Such a coding scheme is generally effective and also very simple to implement. This paper revisits SPAM and further develops three new SPAM-based binary coding methods, called equal probability partition (EPP) binary coding, halfway partition (HP) binary coding, and median partition (MP) binary coding. These three binary coding methods, along with SPAM, will be evaluated for spectral discrimination and identification. In doing so, a new criterion, called a posteriori discrimination probability (APDP), is also introduced as a performance measure.
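
    A minimal sketch of the SPAM-style thresholding step (Python; the signatures are made up, and the inter-band difference bits are omitted): each band is compared against the spectral mean to produce one code bit, and code words can then be compared by Hamming distance for discrimination.

      def spam_code(signature):
          """One bit per band: 1 if the band exceeds the spectral mean."""
          mean = sum(signature) / len(signature)
          return [1 if band > mean else 0 for band in signature]

      def hamming(a, b):
          return sum(x != y for x, y in zip(a, b))

      s1 = [0.12, 0.45, 0.50, 0.08]
      s2 = [0.40, 0.20, 0.47, 0.30]
      print(spam_code(s1), spam_code(s2), hamming(spam_code(s1), spam_code(s2)))
      # -> [0, 1, 1, 0] [1, 0, 1, 0] 2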

  1. The TESS (Tandem Experiment Simulation Studies) computer code user's manual

    SciTech Connect

    Procassini, R.J. . Dept. of Nuclear Engineering); Cohen, B.I. )

    1990-06-01

    TESS (Tandem Experiment Simulation Studies) is a one-dimensional, bounded particle-in-cell (PIC) simulation code designed to investigate the confinement and transport of plasma in a magnetic mirror device, including tandem mirror configurations. Mirror plasmas may be modeled in a system which includes an applied magnetic field and/or a self-consistent or applied electrostatic potential. The PIC code TESS is similar to the PIC code DIPSI (Direct Implicit Plasma Surface Interactions) which is designed to study plasma transport to and interaction with a solid surface. The codes TESS and DIPSI are direct descendants of the PIC code ES1 that was created by A. B. Langdon. This document provides the user with a brief description of the methods used in the code and a tutorial on the use of the code. 10 refs., 2 tabs.

  2. Quantum error-correcting codes over mixed alphabets

    NASA Astrophysics Data System (ADS)

    Wang, Zhuo; Yu, Sixia; Fan, Heng; Oh, C. H.

    2013-08-01

    We study quantum error-correcting codes over mixed alphabets to deal with a more complicated and practical situation in which the physical systems for encoding may have different numbers of energy levels. In particular we investigate their constructions and propose the theory of the quantum Singleton bound. Two kinds of code constructions are presented: a projection-based construction for the general case, and a graphical construction, based on a graph-theoretical object called a composite coding clique, dealing with the case of reducible alphabets. We find some optimal one-error-correcting or -detecting codes over two alphabets. Our method of composite coding cliques also sheds light on constructing standard quantum error-correcting codes, and other families of optimal codes are found.

  3. Sinusoidal transform coding

    NASA Technical Reports Server (NTRS)

    Mcaulay, Robert J.; Quatieri, Thomas F.

    1988-01-01

    It has been shown that an analysis/synthesis system based on a sinusoidal representation of speech leads to synthetic speech that is essentially perceptually indistinguishable from the original. Strategies for coding the amplitudes, frequencies and phases of the sine waves have been developed that have led to a multirate coder operating at rates from 2400 to 9600 bps. The encoded speech is highly intelligible at all rates with a uniformly improving quality as the data rate is increased. A real-time fixed-point implementation has been developed using two ADSP2100 DSP chips. The methods used for coding and quantizing the sine-wave parameters for operation at the various frame rates are described.
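
    A minimal sketch of the analysis/synthesis idea (Python with NumPy; the peak picking is deliberately crude, no quantization is performed, and the amplitudes are not window-normalized as a real coder's would be):

      import numpy as np

      def analyze(frame, fs, n_peaks=20):
          """Keep (amplitude, frequency, phase) triples at spectral peaks."""
          spec = np.fft.rfft(frame * np.hanning(len(frame)))
          peaks = np.argsort(np.abs(spec))[-n_peaks:]   # largest-magnitude bins
          freqs = peaks * fs / len(frame)
          return np.abs(spec[peaks]), freqs, np.angle(spec[peaks])

      def synthesize(amps, freqs, phases, n, fs):
          """Resynthesize the frame as a sum of sine waves."""
          t = np.arange(n) / fs
          return sum(a * np.cos(2 * np.pi * f * t + p)
                     for a, f, p in zip(amps, freqs, phases))

    A coder built on this representation would then quantize the triples, which is where the 2400-9600 bps operating points come from.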

  4. Finite Element Analysis Code

    Energy Science and Technology Software Center (ESTSC)

    2006-03-08

    MAPVAR-KD is designed to transfer solution results from one finite element mesh to another. MAPVAR-KD draws heavily from the structure and coding of MERLIN II, but it employs a new finite element data base, EXODUS II, and offers enhanced speed and new capabilities not available in MERLIN II. In keeping with the MERLIN II documentation, the computational algorithms used in MAPVAR-KD are described. User instructions are presented. Example problems are included to demonstrate the operation of the code and the effects of various input options. MAPVAR-KD is a modification of MAPVAR in which the search algorithm was replaced by a kd-tree-based search for better performance on large problems.

  5. Confocal coded aperture imaging

    DOEpatents

    Tobin, Jr., Kenneth William; Thomas, Jr., Clarence E.

    2001-01-01

    A method for imaging a target volume comprises the steps of: radiating a small bandwidth of energy toward the target volume; focusing the small bandwidth of energy into a beam; moving the target volume through a plurality of positions within the focused beam; collecting a beam of energy scattered from the target volume with a non-diffractive confocal coded aperture; generating a shadow image of said aperture from every point source of radiation in the target volume; and, reconstructing the shadow image into a 3-dimensional image of the every point source by mathematically correlating the shadow image with a digital or analog version of the coded aperture. The method can comprise the step of collecting the beam of energy scattered from the target volume with a Fresnel zone plate.

  6. CTI Correction Code

    NASA Astrophysics Data System (ADS)

    Massey, Richard; Stoughton, Chris; Leauthaud, Alexie; Rhodes, Jason; Koekemoer, Anton; Ellis, Richard; Shaghoulian, Edgar

    2013-07-01

    Charge Transfer Inefficiency (CTI) due to radiation damage above the Earth's atmosphere creates spurious trailing in images from Charge-Coupled Device (CCD) imaging detectors. Radiation damage also creates unrelated warm pixels, which can be used to measure CTI. This code provides pixel-based correction for CTI and has proven effective in Hubble Space Telescope Advanced Camera for Surveys raw images, successfully reducing the CTI trails by a factor of ~30 everywhere in the CCD and at all flux levels. The core is written in java for speed, and a front-end user interface is provided in IDL. The code operates on raw data by returning individual electrons to pixels from which they were unintentionally dragged during readout. Correction takes about 25 minutes per ACS exposure, but is trivially parallelisable to multiple processors.
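
    The sketch below (Python) is a toy version, not the HST pipeline: readout trailing is modeled as each pixel leaving a small fraction of its charge to the pixel behind it, and the correction iterates the observed image against this forward model so that trailed electrons are pushed back toward their source pixels. The trap fraction and iteration count are illustrative.

      def trail(column, frac=0.02):
          """Toy forward model: each transfer holds back a fraction of the
          charge and releases it into the following pixel."""
          out, held = [], 0.0
          for pix in column:
              total = pix + held
              held = total * frac
              out.append(total - held)
          return out

      def correct(observed, n_iter=5):
          """Fixed-point iteration: find x such that trail(x) ~= observed."""
          x = list(observed)
          for _ in range(n_iter):
              model = trail(x)
              x = [xi + (oi - mi) for xi, oi, mi in zip(x, observed, model)]
          return x

      truth = [0.0, 100.0, 0.0, 0.0]
      observed = trail(truth)                          # simulate CTI trailing
      print([round(v, 3) for v in correct(observed)])  # ~ recovers truth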

  7. The Independent Technical Analysis Process

    SciTech Connect

    Duberstein, Corey A.; Ham, Kenneth D.; Dauble, Dennis D.; Johnson, Gary E.

    2007-04-13

    The Bonneville Power Administration (BPA) contracted with the Pacific Northwest National Laboratory (PNNL) to provide technical analytical support for system-wide fish passage information (BPA Project No. 2006-010-00). The goal of this project was to produce rigorous technical analysis products using independent analysts and anonymous peer reviewers. In the past, regional parties have interacted with a single entity, the Fish Passage Center to access the data, analyses, and coordination related to fish passage. This project provided an independent technical source for non-routine fish passage analyses while allowing routine support functions to be performed by other well-qualified entities.

  8. Status of MARS Code

    SciTech Connect

    N.V. Mokhov

    2003-04-09

    Status and recent developments of the MARS 14 Monte Carlo code system for simulation of hadronic and electromagnetic cascades in shielding, accelerator and detector components in the energy range from a fraction of an electronvolt up to 100 TeV are described. These include physics models in both the strong and electromagnetic interaction sectors, variance reduction techniques, residual dose, geometry, tracking, and histogramming, as well as the MAD-MARS Beam Line Builder and a graphical user interface.

  9. VAC: Versatile Advection Code

    NASA Astrophysics Data System (ADS)

    Tóth, Gábor; Keppens, Rony

    2012-07-01

    The Versatile Advection Code (VAC) is a freely available general hydrodynamic and magnetohydrodynamic simulation software that works in 1, 2 or 3 dimensions on Cartesian and logically Cartesian grids. VAC runs on any Unix/Linux system with a Fortran 90 (or 77) compiler and Perl interpreter. VAC can run on parallel machines using either the Message Passing Interface (MPI) library or a High Performance Fortran (HPF) compiler.

  10. Reeds computer code

    NASA Technical Reports Server (NTRS)

    Bjork, C.

    1981-01-01

    The REEDS (rocket exhaust effluent diffusion single layer) computer code is used for the estimation of certain rocket exhaust effluent concentrations and dosages and their distributions near the Earth's surface following a rocket launch event. Output from REEDS is used in producing near real time air quality and environmental assessments of the effects of certain potentially harmful effluents, namely HCl, Al2O3, CO, and NO.

  11. An experimental investigation of clocking effects on turbine aerodynamics using a modern 3-D one and one-half stage high pressure turbine for code verification and flow model development

    NASA Astrophysics Data System (ADS)

    Haldeman, Charles Waldo, IV

    2003-10-01

    This research uses a modern 1 and 1/2 stage high-pressure (HP) turbine operating at the proper design corrected speed, pressure ratio, and gas to metal temperature ratio to generate a detailed data set containing aerodynamic, heat-transfer and aero-performance information. The data were generated using the Ohio State University Gas Turbine Laboratory Turbine Test Facility (TTF), which is a short-duration shock tunnel facility. The research program utilizes an uncooled turbine stage for which all three airfoils are heavily instrumented at multiple spans and on the HPV and LPV endwalls and HPB platform and tips. Heat-flux and pressure data are obtained using the traditional shock-tube and blowdown facility operational modes. Detailed examination shows that the aerodynamic (pressure) data obtained in the blowdown mode is the same as obtained in the shock-tube mode when the corrected conditions are matched. Various experimental conditions and configurations were investigated, including LPV clocking positions, off-design corrected speed conditions, pressure ratio changes, and Reynolds number changes. The main research for this dissertation is concentrated on the LPV clocking experiments, where the LPV was clocked relative to the HPV at several different passage locations and at different Reynolds numbers. Various methods were used to evaluate the effect of clocking on both the aeroperformance (efficiency) and aerodynamics (pressure loading) on the LPV, including time-resolved measurements, time-averaged measurements and stage performance measurements. A general improvement in overall efficiency of approximately 2% is demonstrated and could be observed using a variety of independent methods. Maximum efficiency is obtained when the time-average pressures are highest on the LPV, and the time-resolved data both in the time domain and frequency domain show the least amount of variation. The gain in aeroperformance is obtained by integrating over the entire airfoil as the three

  12. MELCOR computer code manuals

    SciTech Connect

    Summers, R.M.; Cole, R.K. Jr.; Smith, R.C.; Stuart, D.S.; Thompson, S.L.; Hodge, S.A.; Hyman, C.R.; Sanders, R.L.

    1995-03-01

    MELCOR is a fully integrated, engineering-level computer code that models the progression of severe accidents in light water reactor nuclear power plants. MELCOR is being developed at Sandia National Laboratories for the U.S. Nuclear Regulatory Commission as a second-generation plant risk assessment tool and the successor to the Source Term Code Package. A broad spectrum of severe accident phenomena in both boiling and pressurized water reactors is treated in MELCOR in a unified framework. These include: thermal-hydraulic response in the reactor coolant system, reactor cavity, containment, and confinement buildings; core heatup, degradation, and relocation; core-concrete attack; hydrogen production, transport, and combustion; fission product release and transport; and the impact of engineered safety features on thermal-hydraulic and radionuclide behavior. Current uses of MELCOR include estimation of severe accident source terms and their sensitivities and uncertainties in a variety of applications. This publication of the MELCOR computer code manuals corresponds to MELCOR 1.8.3, released to users in August, 1994. Volume 1 contains a primer that describes MELCOR's phenomenological scope, organization (by package), and documentation. The remainder of Volume 1 contains the MELCOR Users Guides, which provide the input instructions and guidelines for each package. Volume 2 contains the MELCOR Reference Manuals, which describe the phenomenological models that have been implemented in each package.

  13. Bar coded retroreflective target

    DOEpatents

    Vann, Charles S.

    2000-01-01

    This small, inexpensive, non-contact laser sensor can detect the location of a retroreflective target in a relatively large volume and up to six degrees of position. The tracker's laser beam is formed into a plane of light which is swept across the space of interest. When the beam illuminates the retroreflector, some of the light returns to the tracker. The intensity, angle, and time of the return beam is measured to calculate the three dimensional location of the target. With three retroreflectors on the target, the locations of three points on the target are measured, enabling the calculation of all six degrees of target position. Until now, devices for three-dimensional tracking of objects in a large volume have been heavy, large, and very expensive. Because of the simplicity and unique characteristics of this tracker, it is capable of three-dimensional tracking of one to several objects in a large volume, yet it is compact, light-weight, and relatively inexpensive. Alternatively, a tracker produces a diverging laser beam which is directed towards a fixed position, and senses when a retroreflective target enters the fixed field of view. An optically bar coded target can be read by the tracker to provide information about the target. The target can be formed of a ball lens with a bar code on one end. As the target moves through the field, the ball lens causes the laser beam to scan across the bar code.

  14. Preliminary Assessment of Turbomachinery Codes

    NASA Technical Reports Server (NTRS)

    Mazumder, Quamrul H.

    2007-01-01

    This report assesses different CFD codes developed and currently being used at Glenn Research Center to predict turbomachinery fluid flow and heat transfer behavior. This report will consider the following codes: APNASA, TURBO, GlennHT, H3D, and SWIFT. Each code is described separately in the following section with its current modeling capabilities, level of validation, pre/post processing, and future development and validation requirements. This report addresses only previously published validations of the codes. However, the codes have been further developed to extend their capabilities.

  15. Structural coding versus free-energy predictive coding.

    PubMed

    van der Helm, Peter A

    2016-06-01

    Focusing on visual perceptual organization, this article contrasts the free-energy (FE) version of predictive coding (a recent Bayesian approach) to structural coding (a long-standing representational approach). Both use free-energy minimization as a metaphor for processing in the brain, but their formal elaborations of this metaphor are fundamentally different. FE predictive coding formalizes it by minimization of prediction errors, whereas structural coding formalizes it by minimization of the descriptive complexity of predictions. Here, both sides are evaluated. A conclusion regarding competence is that FE predictive coding uses a powerful modeling technique, but that structural coding has more explanatory power. A conclusion regarding performance is that FE predictive coding, though more detailed in its account of neurophysiological data, provides a less compelling cognitive architecture than that of structural coding, which, for instance, supplies formal support for the computationally powerful role it attributes to neuronal synchronization. PMID:26407895

  16. COLD-SAT Dynamic Model Computer Code

    NASA Technical Reports Server (NTRS)

    Bollenbacher, G.; Adams, N. S.

    1995-01-01

    COLD-SAT Dynamic Model (CSDM) computer code implements six-degree-of-freedom, rigid-body mathematical model for simulation of spacecraft in orbit around Earth. Investigates flow dynamics and thermodynamics of subcritical cryogenic fluids in microgravity. Consists of three parts: translation model, rotation model, and slosh model. Written in FORTRAN 77.

  17. Convolutional coding techniques for data protection

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1975-01-01

    Results of research on the use of convolutional codes in data communications are presented. Convolutional coding fundamentals are discussed along with modulation and coding interaction. Concatenated coding systems and data compression with convolutional codes are described.

  18. Neural Coding for Effective Rehabilitation

    PubMed Central

    2014-01-01

    Successful neurological rehabilitation depends on accurate diagnosis, effective treatment, and quantitative evaluation. Neural coding, a technology for interpretation of functional and structural information of the nervous system, has contributed to the advancements in neuroimaging, brain-machine interface (BMI), and design of training devices for rehabilitation purposes. In this review, we summarized the latest breakthroughs in neuroimaging from microscale to macroscale levels with potential diagnostic applications for rehabilitation. We also reviewed the achievements in electrocorticography (ECoG) coding with both animal models and human beings for BMI design, electromyography (EMG) interpretation for interaction with external robotic systems, and robot-assisted quantitative evaluation on the progress of rehabilitation programs. Future rehabilitation will be more home-based, automated, and self-administered by patients. Further investigations and breakthroughs are mainly needed in improving the computational efficiency of neuroimaging and multichannel ECoG through selection of localized neuroinformatics, validating the effectiveness of BMI-guided rehabilitation programs, and simplifying the system operation in training devices. PMID:25258708

  19. Development of the Code RITRACKS

    NASA Technical Reports Server (NTRS)

    Plante, Ianik; Cucinotta, Francis A.

    2013-01-01

    A document discusses the code RITRACKS (Relativistic Ion Tracks), which was developed to simulate heavy ion track structure at the microscopic and nanoscopic scales. It is a Monte-Carlo code that simulates the production of radiolytic species in water, event-by-event, and which may be used to simulate tracks and also to calculate dose in targets and voxels of different sizes. The dose deposited by the radiation can be calculated in nanovolumes (voxels). RITRACKS allows simulation of radiation tracks without the need of extensive knowledge of computer programming or Monte-Carlo simulations. It is installed as a regular application on Windows systems. The main input parameters entered by the user are the type and energy of the ion, the length and size of the irradiated volume, the number of ions impacting the volume, and the number of histories. The simulation can be started after the input parameters are entered in the GUI. The number of each kind of interactions for each track is shown in the result details window. The tracks can be visualized in 3D after the simulation is complete. It is also possible to see the time evolution of the tracks and zoom on specific parts of the tracks. The software RITRACKS can be very useful for radiation scientists to investigate various problems in the fields of radiation physics, radiation chemistry, and radiation biology. For example, it can be used to simulate electron ejection experiments (radiation physics).

  20. Performance improvement of spectral amplitude coding-optical code division multiple access systems using NAND detection with enhanced double weight code

    NASA Astrophysics Data System (ADS)

    Ahmed, Nasim; Aljunid, Syed Alwee; Ahmad, R. Badlishah; Fadhil, Hilal A.; Rashid, Mohd Abdur

    2012-01-01

    The bit-error rate (BER) performance of the spectral amplitude coding-optical code division multiple access (SAC-OCDMA) system has been investigated by using the NAND subtraction detection technique with the enhanced double weight (EDW) code. The EDW code is the enhanced version of the double weight (DW) code family, where the code weight is any odd number greater than one, with ideal cross-correlation. In order to evaluate the performance of the system, we used extensive mathematical analysis along with simulation experiments. The evaluation results obtained using the NAND subtraction detection technique were compared with those obtained using the complementary detection technique for the same number of active users. The comparison results revealed that the BER performance of the system using the NAND subtraction detection technique is greatly improved compared to the complementary technique.

  1. The APS SASE FEL : modeling and code comparison.

    SciTech Connect

    Biedron, S. G.

    1999-04-20

    A self-amplified spontaneous emission (SASE) free-electron laser (FEL) is under construction at the Advanced Photon Source (APS). Five FEL simulation codes were used in the design phase: GENESIS, GINGER, MEDUSA, RON, and TDA3D. Initial comparisons between each of these independent formulations show good agreement for the parameters of the APS SASE FEL.

  2. Combinatorial neural codes from a mathematical coding theory perspective.

    PubMed

    Curto, Carina; Itskov, Vladimir; Morrison, Katherine; Roth, Zachary; Walker, Judy L

    2013-07-01

    Shannon's seminal 1948 work gave rise to two distinct areas of research: information theory and mathematical coding theory. While information theory has had a strong influence on theoretical neuroscience, ideas from mathematical coding theory have received considerably less attention. Here we take a new look at combinatorial neural codes from a mathematical coding theory perspective, examining the error correction capabilities of familiar receptive field codes (RF codes). We find, perhaps surprisingly, that the high levels of redundancy present in these codes do not support accurate error correction, although the error-correcting performance of receptive field codes catches up to that of random comparison codes when a small tolerance to error is introduced. However, receptive field codes are good at reflecting distances between represented stimuli, while the random comparison codes are not. We suggest that a compromise in error-correcting capability may be a necessary price to pay for a neural code whose structure serves not only error correction, but must also reflect relationships between stimuli. PMID:23724797
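
    As a concrete reading of "error correction" here, the sketch below (Python; a made-up three-word receptive-field-style codebook) decodes a noisy binary response pattern to the nearest codeword in Hamming distance.

      # Sketch: nearest-codeword decoding of a noisy binary response pattern.
      def hamming(a, b):
          return sum(x != y for x, y in zip(a, b))

      def decode(received, codebook):
          """Return the codeword closest to `received` in Hamming distance."""
          return min(codebook, key=lambda word: hamming(received, word))

      rf_code = [(1, 1, 0, 0), (0, 1, 1, 0), (0, 0, 1, 1)]   # toy codebook
      print(decode((1, 0, 0, 0), rf_code))                   # -> (1, 1, 0, 0)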

  3. Field Independence: Reviewing the Evidence

    ERIC Educational Resources Information Center

    Evans, Carol; Richardson, John T. E.; Waring, Michael

    2013-01-01

    Background: The construct of field independence (FI) remains one of the most widely cited notions in research on cognitive style and on learning and instruction more generally. However, a great deal of confusion continues to exist around the definition of FI, its measurement, and the interpretation of research results, all of which have served to…

  4. Independent Study Project, Topic: Topology.

    ERIC Educational Resources Information Center

    Notre Dame High School, Easton, PA.

    Using this guide and the four popular books noted in it, a student, working independently, will learn about some of the classical ideas and problems of topology: the Möbius strip and Klein bottle, the four color problem, genus of a surface, networks, Euler's formula, and the Jordan Curve Theorem. The unit culminates in a project of the students'…

  5. Boston: Cradle of American Independence

    ERIC Educational Resources Information Center

    Community College Journal, 2004

    2004-01-01

    The 2005 American Association of Community Colleges Annual Convention will be held April 6-9 in Boston. While thoroughly modern, the iconic city's identity is firmly rooted in the past. As the cradle of American independence, Boston's long history is an integral part of the American fabric. Adams, Revere, Hancock are more than historical figures;…

  6. Independent Study Course Development Costs.

    ERIC Educational Resources Information Center

    Wright, Clayton R.

    1988-01-01

    Discusses actual costs for developing independent study print courses for use in learning centers or for distance delivery, and presents a resource allocation guideline based on figures from Grant MacEwan Community College (Alberta). Topics discussed include the course writer/developer, clerical support, copyright clearance, instructional design,…

  7. Haptic Tracking Permits Bimanual Independence

    ERIC Educational Resources Information Center

    Rosenbaum, David A.; Dawson, Amanda A.; Challis, John H.

    2006-01-01

    This study shows that in a novel task--bimanual haptic tracking--neurologically normal human adults can move their 2 hands independently for extended periods of time with little or no training. Participants lightly touched buttons whose positions were moved either quasi-randomly in the horizontal plane by 1 or 2 human drivers (Experiment 1), in…

  8. 10 Questions about Independent Reading

    ERIC Educational Resources Information Center

    Truby, Dana

    2012-01-01

    Teachers know that establishing a robust independent reading program takes more than giving kids a little quiet time after lunch. But how do they set up a program that will maximize their students' gains? Teachers have to know their students' reading levels inside and out, help them find just-right books, and continue to guide them during…

  9. Instructional Materials in Independent Living.

    ERIC Educational Resources Information Center

    Smith, Bradley C.; Fry, Ronald R.

    This annotated list of 103 instructional materials for use in an independent living program focused on personal, social, and community adjustment of those with special needs is cross referenced using a subject index that lists skill areas within a fourteen-category system. Document descriptions are arranged alphabetically by author and include…

  10. On decoding of multi-level MPSK modulation codes

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Gupta, Alok Kumar

    1990-01-01

    The decoding problem of multi-level block modulation codes is investigated. The hardware design of a soft-decision Viterbi decoder for some short-length 8-PSK block modulation codes is presented. An effective way to reduce the hardware complexity of the decoder by reducing the branch metrics and path metrics, using a non-uniform floating-point to integer mapping scheme, is proposed and discussed. The simulation results of the design are presented. The multi-stage decoding (MSD) of multi-level modulation codes is also investigated. The cases of soft-decision and hard-decision MSD are considered, and their performance is evaluated for several codes of different lengths and different minimum squared Euclidean distances. It is shown that the soft-decision MSD reduces the decoding complexity drastically and is suboptimum. The hard-decision MSD further simplifies the decoding while still maintaining a reasonable coding gain over the uncoded system, if the component codes are chosen properly. Finally, some basic 3-level 8-PSK modulation codes using BCH codes as component codes are constructed and their coding gains are found for hard-decision multistage decoding.
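
    A minimal sketch of a non-uniform floating-point-to-integer metric mapping (Python; the breakpoint and step sizes are illustrative, not the paper's scheme): metrics near zero are quantized finely, while large metrics, which rarely decide the survivor path, are quantized coarsely so branch and path metrics fit in small integers.

      def quantize_metric(m, fine_step=0.25, coarse_step=1.0, knee=2.0):
          """Map a non-negative float metric to a small integer, with finer
          resolution below the knee and coarser resolution above it."""
          if m <= knee:
              return round(m / fine_step)
          return round(knee / fine_step + (m - knee) / coarse_step)

      for m in (0.3, 1.7, 2.5, 6.0):
          print(m, quantize_metric(m))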

  11. Reliability of ICD-10 external cause of death codes in the National Coroners Information System.

    PubMed

    Bugeja, Lyndal; Clapperton, Angela J; Killian, Jessica J; Stephan, Karen L; Ozanne-Smith, Joan

    2010-01-01

    Availability of ICD-10 cause of death codes in the National Coroners Information System (NCIS) strengthens its value as a public health surveillance tool. This study quantified the completeness of external cause ICD-10 codes in the NCIS for Victorian deaths (as assigned by the Australian Bureau of Statistics (ABS) in the yearly Cause of Death data). It also examined the concordance between external cause ICD-10 codes contained in the NCIS and a re-code of the same deaths conducted by an independent coder. Of 7,400 NCIS external cause deaths included in this study, 961 (13.0%) did not contain an ABS assigned ICD-10 code and 225 (3.0%) contained only a natural cause code. Where an ABS assigned external cause ICD-10 code was present (n=6,214), 4,397 (70.8%) matched exactly with the independently assigned ICD-10 code. Coding disparity primarily related to differences in assignment of intent and specificity. However, in a small number of deaths (n=49, 0.8%) there was coding disparity for both intent and external cause category. NCIS users should be aware of the limitations of relying only on ICD-10 codes contained within the NCIS for deaths prior to 2007 and consider using these in combination with the other NCIS data fields and code sets to ensure optimum case identification. PMID:21041843

  12. Qudit color codes and gauge color codes in all spatial dimensions

    NASA Astrophysics Data System (ADS)

    Watson, Fern H. E.; Campbell, Earl T.; Anwar, Hussain; Browne, Dan E.

    2015-08-01

    Two-level quantum systems, qubits, are not the only basis for quantum computation. Advantages exist in using qudits, d -level quantum systems, as the basic carrier of quantum information. We show that color codes, a class of topological quantum codes with remarkable transversality properties, can be generalized to the qudit paradigm. In recent developments it was found that in three spatial dimensions a qubit color code can support a transversal non-Clifford gate and that in higher spatial dimensions additional non-Clifford gates can be found, saturating Bravyi and König's bound [S. Bravyi and R. König, Phys. Rev. Lett. 111, 170502 (2013), 10.1103/PhysRevLett.111.170502]. Furthermore, by using gauge fixing techniques, an effective set of Clifford gates can be achieved, removing the need for state distillation. We show that the qudit color code can support the qudit analogs of these gates and also show that in higher spatial dimensions a color code can support a phase gate from higher levels of the Clifford hierarchy that can be proven to saturate Bravyi and König's bound in all but a finite number of special cases. The methodology used is a generalization of Bravyi and Haah's method of triorthogonal matrices [S. Bravyi and J. Haah, Phys. Rev. A 86, 052329 (2012), 10.1103/PhysRevA.86.052329], which may be of independent interest. For completeness, we show explicitly that the qudit color codes generalize to gauge color codes and share many of the favorable properties of their qubit counterparts.

  13. On lossless coding for HEVC

    NASA Astrophysics Data System (ADS)

    Gao, Wen; Jiang, Minqiang; Yu, Haoping

    2013-02-01

    In this paper, we first review the lossless coding mode in version 1 of the recently finalized HEVC standard. We then provide a performance comparison between the lossless coding modes of the HEVC and MPEG-AVC/H.264 standards and show that HEVC lossless coding has limited coding efficiency. To improve the performance of the lossless coding mode, several new coding tools that were contributed to JCT-VC but not adopted in version 1 of the HEVC standard are introduced. In particular, we discuss sample-based intra prediction and the coding of residual coefficients in more detail. Finally, we briefly address a new class of coding tools, namely dictionary-based coders, which are efficient in encoding screen content, including graphics and text.
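    Sample-based intra prediction for lossless coding amounts to predicting each sample from already-reconstructed neighbors and transmitting only the residual, since no transform or quantization is applied. The DPCM-style sketch below uses a simple left-neighbor predictor; it is a minimal illustration of the idea, not the actual JCT-VC proposal.

```python
# Minimal sketch of sample-based (DPCM-style) prediction for lossless
# coding: each sample is predicted from its left neighbor and only the
# residual is coded. An illustration of the idea, not the JCT-VC tool.
def dpcm_encode_row(samples: list[int]) -> list[int]:
    residuals, prev = [], 128  # assume 8-bit samples; neutral predictor seed
    for s in samples:
        residuals.append(s - prev)
        prev = s  # lossless: the decoder reconstructs the same predictor
    return residuals

def dpcm_decode_row(residuals: list[int]) -> list[int]:
    samples, prev = [], 128
    for r in residuals:
        prev += r
        samples.append(prev)
    return samples

row = [120, 121, 121, 124, 200, 201]
assert dpcm_decode_row(dpcm_encode_row(row)) == row  # perfectly reversible
```

    Because smooth image regions yield small residuals, the residuals entropy-code far more compactly than the raw samples would.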

  14. Distributed Turbo Product Codes with Multiple Vertical Parities

    NASA Astrophysics Data System (ADS)

    Obiedat, Esam A.; Chen, Guotai; Cao, Lei

    2009-12-01

    We propose a Multiple Vertical Parities Distributed Turbo Product Code (MVP-DTPC) over a cooperative network, using block Bose-Chaudhuri-Hocquenghem (BCH) codes as component codes. The source broadcasts extended BCH-coded frames to the destination and nearby relays. After decoding the received sequences, each relay constructs a product code by arranging the corrected bit sequences in rows and re-encoding them vertically, using BCH as component codes, to obtain Incremental Redundancy (IR) for the source's data. To obtain independent vertical parities from each relay in the same code space, we propose a new circular interleaver for the source's data; different circular interleavers are used to interleave the BCH rows before re-encoding vertically. Maximum A Posteriori Probability (MAP) decoding is achieved by applying maximum transfer of extrinsic information between the multiple decoding stages. This is employed in the modified turbo product decoder, which is proposed to cope with multiple parities. The a posteriori output from a vertical decoding stage is used to derive the soft extrinsic information, which is used as a priori input for the next horizontal decoding stage. Simulation results in an Additive White Gaussian Noise (AWGN) channel under cooperative network scenarios show a 0.3-0.5 dB improvement in Bit Error Rate (BER) performance over non-cooperative Turbo Product Codes (TPC).
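    One way to picture the circular interleaver is as a relay-specific cyclic rotation of each row before the vertical re-encoding, so that every relay's vertical parity covers a differently aligned view of the same data. The toy sketch below uses invented offsets and a single-parity-check stand-in for the vertical BCH component code; both are illustrative assumptions, not the paper's construction.

```python
# Toy sketch of per-relay circular interleaving. Each relay rotates row r
# of the block by a relay-specific multiple of r before computing its
# vertical parity. The offsets and the single-parity-check stand-in for
# the vertical BCH component code are illustrative assumptions.
def circular_interleave(block: list[list[int]], offset: int) -> list[list[int]]:
    """Cyclically rotate row r by (offset * r) positions."""
    n = len(block[0])
    return [row[(offset * r) % n:] + row[:(offset * r) % n]
            for r, row in enumerate(block)]

def vertical_parity(block: list[list[int]]) -> list[int]:
    """Single-parity-check stand-in for the vertical BCH encoding."""
    return [sum(col) % 2 for col in zip(*block)]

block = [[1, 0, 1, 1],
         [0, 1, 1, 0],
         [1, 1, 0, 0]]
for relay, offset in enumerate([1, 2, 3], start=1):  # one offset per relay
    parity = vertical_parity(circular_interleave(block, offset))
    print(f"relay {relay}: vertical parity {parity}")
```

    Because each relay rotates the rows differently, the parities it forwards constitute independent redundancy on the same source block, which is what lets the destination combine them across successive decoding stages.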

  15. Progress in cultivation-independent phyllosphere microbiology.

    PubMed

    Müller, Thomas; Ruppel, Silke

    2014-01-01

    Most microorganisms of the phyllosphere are nonculturable in commonly used media and culture conditions, as are those in other natural environments. This review queries the reasons for their 'noncultivability' and assesses developments in phyllosphere microbiology that have been achieved cultivation-independently over the last 4 years. Analyses of total microbial communities have revealed a comprehensive microbial diversity. 16S rRNA gene amplicon sequencing and metagenomic sequencing were applied to investigate plant species, location and season as variables affecting the composition of these communities. Continuing from culture-based enzymatic and metabolic studies with individual isolates, metaproteogenomic approaches reveal a great potential to study the physiology of microbial communities in situ. Culture-independent microbiological technologies, as well as advances in plant genetics and biochemistry, provide the methodological preconditions for exploring the interactions between plants and their microbiome in the phyllosphere. Improving and combining cultivation and culture-independent techniques can contribute to a better understanding of phyllosphere ecology. This is essential, for example, to avoid human-pathogenic bacteria in plant food. PMID:24003903

  17. Noiseless Coding Of Magnetometer Signals

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Lee, Jun-Ji

    1989-01-01

    Report discusses application of noiseless data-compression coding to digitized readings of spaceborne magnetometers for transmission back to Earth. Objective of such coding is to increase efficiency by decreasing rate of transmission without sacrificing integrity of data. Adaptive coding compresses data by factors ranging from 2 to 6.
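    Adaptive noiseless coders in this line of work are typically built on Golomb-Rice codes: a unary quotient followed by a k-bit binary remainder. The sketch below uses a hand-picked k and invented readings; the adaptive algorithm instead selects the best option per block of samples.

```python
def rice_encode(value: int, k: int) -> str:
    """Golomb-Rice codeword: unary quotient, then k-bit binary remainder."""
    q, r = value >> k, value & ((1 << k) - 1)
    return "1" * q + "0" + format(r, f"0{k}b")

def zigzag(d: int) -> int:
    """Map signed differences to non-negatives: 0,-1,1,-2,... -> 0,1,2,3,..."""
    return 2 * d if d >= 0 else -2 * d - 1

# Magnetometer-style readings: successive samples differ only slightly,
# so differencing plus zig-zag mapping yields small values that Rice-code
# into short words. The readings and k here are invented for illustration.
readings = [1000, 1002, 1001, 1005, 1004]
residuals = [zigzag(b - a) for a, b in zip(readings, readings[1:])]
bits = "".join(rice_encode(v, k=2) for v in residuals)
print(residuals, bits)  # [4, 1, 8, 1] -> short codewords for small values
```

    Adapting k as signal statistics change is what lets such a coder hold up across quiet and active stretches of data, consistent with the factor-of-2-to-6 range quoted above.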

  18. Energy Codes and Standards: Facilities

    SciTech Connect

    Bartlett, Rosemarie; Halverson, Mark A.; Shankle, Diana L.

    2007-01-01

    Energy codes and standards play a vital role in the marketplace by setting minimum requirements for energy-efficient design and construction. They outline uniform requirements for new buildings as well as additions and renovations. This article covers basic knowledge of codes and standards; development processes of each; adoption, implementation, and enforcement of energy codes and standards; and voluntary energy efficiency programs.

  19. Coding Issues in Grounded Theory

    ERIC Educational Resources Information Center

    Moghaddam, Alireza

    2006-01-01

    This paper discusses grounded theory as one of the qualitative research designs. It describes how grounded theory generates from data. Three phases of grounded theory--open coding, axial coding, and selective coding--are discussed, along with some of the issues which are the source of debate among grounded theorists, especially between its…

  20. Authorship Attribution of Source Code

    ERIC Educational Resources Information Center

    Tennyson, Matthew F.

    2013-01-01

    Authorship attribution of source code is the task of deciding who wrote a program, given its source code. Applications include software forensics, plagiarism detection, and determining software ownership. A number of methods for the authorship attribution of source code have been presented in the past. A review of those existing methods is…
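    A common baseline in this literature is nearest-profile matching on character n-grams: each known author's code is reduced to an n-gram frequency profile, and a disputed program is attributed to the author whose profile it most resembles. The sketch below is a toy illustration of that general idea; the snippets and the cosine similarity measure are invented, not taken from any specific reviewed method.

```python
from collections import Counter

def ngram_profile(source: str, n: int = 3) -> Counter:
    """Character n-gram frequency profile of a piece of source code."""
    return Counter(source[i:i + n] for i in range(len(source) - n + 1))

def cosine(p: Counter, q: Counter) -> float:
    """Cosine similarity between two n-gram profiles."""
    dot = sum(p[g] * q[g] for g in set(p) & set(q))
    norm = (sum(v * v for v in p.values()) *
            sum(v * v for v in q.values())) ** 0.5
    return dot / norm if norm else 0.0

# Invented training snippets: terse vs. spaced bracing styles.
known = {"alice": "for(int i=0;i<n;i++){sum+=a[i];}",
         "bob":   "for (int i = 0; i < n; i++) { sum += a[i]; }"}
disputed = "for(int j=0;j<m;j++){total+=b[j];}"

profiles = {author: ngram_profile(src) for author, src in known.items()}
guess = max(profiles,
            key=lambda a: cosine(profiles[a], ngram_profile(disputed)))
print(guess)  # the terse style matches alice's profile more closely
```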